Senior Data Engineer – Databricks
Location: Toronto, Ontario – Hybrid
Overview
As a Senior Data Engineer, you will work on an enterprise-wide Centralized Data Platform (CDP) built on Databricks, contributing to our data infrastructure, API integrations, and future AI initiatives. You will collaborate with data scientists, analysts, and business stakeholders to transform raw financial data into actionable insights. Using agile methodologies, you will build and maintain data pipelines on the Databricks Lakehouse Platform, develop scalable ETL/ELT processes, implement data quality controls, and ensure data governance standards are met, all within our AWS cloud environment, powering critical business operations with the highest standards of data security and compliance.
This role offers the opportunity to work with cutting-edge technologies while solving complex data challenges in a global financial services environment. You’ll be part of a team driving technical excellence and innovation within the data engineering practice.
What you will be doing
Design and develop Databricks solutions leveraging Lakehouse architecture for enterprise data processing and analytics
Develop and optimize ETL/ELT pipelines
Create and manage structured streaming pipelines for real-time data processing
Configure and tune Databricks clusters and Spark jobs for optimal performance
Utilize Delta Live Tables for data ingestion and transformations
Apply Unity Catalog features and IAM best practices for security, governance, and access control
Support infrastructure and resource management using Terraform
Participate in Agile/Scrum development process and collaborate with team members
Implement monitoring solutions for pipeline performance and data quality
Contribute to code reviews and knowledge-sharing sessions
What you must have
6+ years of experience in data engineering
2+ years of hands‑on experience with Databricks platform
Strong expertise in Python and Spark programming
Demonstrable experience using AI in development
Proven experience with AWS or similar cloud services
Deep understanding of data modeling and SQL
Experience with Delta Lake and Lakehouse architecture
Strong knowledge of ETL/ELT principles and patterns
Experience with version control systems (Git)
Demonstrated ability to optimize data pipelines
Strong problem‑solving and analytical skills
Excellent communication and collaboration abilities
Nice to have
Financial services industry experience
Experience with multiple cloud providers
Knowledge of AI/ML implementation patterns
API development experience
Experience with real-time data processing
Data governance framework experience
Thank you for your interest in this opportunity. If you are selected to move forward in the process, we will contact you directly. If you do not hear from us, we encourage you to continue visiting our website for other roles that may be a good fit.
For more information about TEEMA and to consider other career opportunities, please visit our website.