
Senior Data Engineer, Global Security

Job in Toronto, Ontario, C6A, Canada
Listing for: ODAIA
Full Time position
Listed on 2026-01-09
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Big Data, Data Analyst
Salary/Wage Range or Industry Benchmark: CAD 80,000 – 100,000 per year
Job Description & How to Apply Below
Position: Senior Data Engineer (Global Security)

Job Description

We are looking for a Senior Data Engineer to join our collaborative cyber security team, focusing on building scalable data solutions that directly enhance our application ecosystem. In this role you will work closely with our application developers to integrate robust data capabilities into our applications, transforming how our products leverage data for better design and business value. You’ll be instrumental in developing a comprehensive data strategy that seamlessly bridges the gap between data and application functionality.

What will you do?
  • Design, develop, and maintain end-to-end data pipelines in Azure Databricks using Spark (SQL, PySpark), as illustrated in the pipeline sketch after this list.
  • Implement and optimize ETL/ELT workflows using Databricks Workflows or Apache Airflow, ensuring data integrity, quality, and reliability.
  • Manage Delta Lake solutions for data versioning, incremental loads, and efficient application data access. Apply best practices in data governance, ensuring compliance using Unity Catalog for access management and data lineage tracking.
  • Monitor, troubleshoot, and optimize Spark jobs for performance, addressing data pipeline bottlenecks that impact application responsiveness.
  • Build automated monitoring, alerting, and incident management solutions to ensure data reliability, availability, and performance.
  • Work directly with our Python application developers and cross‑functional teams to integrate data capabilities into existing and new applications.
  • Build APIs and data services that applications can consume for real‑time and batch data processing (see the service sketch after this list).
  • Develop and maintain comprehensive documentation for data pipelines, transformations, and data models, fostering knowledge sharing and collaboration.
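
For illustration, here is a minimal sketch of the kind of incremental PySpark-to-Delta-Lake pipeline described above. The table and column names (raw_events, security_events, event_id, event_ts) are hypothetical placeholders, and the session and bootstrapping details assume a Databricks environment; this is a sketch, not a prescribed implementation.

```python
# Illustrative incremental batch pipeline: raw Delta table -> curated Delta table.
# Table and column names are hypothetical examples, not part of the job description.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

# High-water mark: latest timestamp already loaded into the curated table
# (assumes the target table exists; first-run bootstrapping is omitted).
last_loaded = spark.sql(
    "SELECT COALESCE(MAX(event_ts), TIMESTAMP '1970-01-01') AS ts FROM security_events"
).first()["ts"]

# Pull only the new raw records since the last successful run.
new_raw = spark.table("raw_events").where(F.col("event_ts") > F.lit(last_loaded))

# Basic cleansing / data-quality step before landing in the curated layer.
curated = (
    new_raw.filter(F.col("event_id").isNotNull())
           .dropDuplicates(["event_id"])
           .withColumn("ingested_at", F.current_timestamp())
)

# Append the batch to a Delta table; Delta provides the versioning and time travel
# that support the incremental-load and lineage responsibilities above.
curated.write.format("delta").mode("append").saveAsTable("security_events")
```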
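
Below is an equally hedged sketch of the kind of data service the API bullet describes. FastAPI is an assumed framework choice, and the model, route, and stubbed query are illustrative only; a real implementation would query the curated Delta table rather than return canned data.

```python
# Illustrative data-service endpoint that applications could consume.
from datetime import datetime, timezone
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="security-events-service")


class Event(BaseModel):
    event_id: str
    event_ts: datetime
    severity: str


def fetch_recent_events(limit: int) -> List[Event]:
    # Stub: a real service would query the curated Delta table here
    # (e.g. via a SQL connector) instead of returning canned data.
    sample = [Event(event_id="evt-001",
                    event_ts=datetime.now(timezone.utc),
                    severity="low")]
    return sample[:limit]


@app.get("/events/recent", response_model=List[Event])
def recent_events(limit: int = 100) -> List[Event]:
    """Return recent curated security events for consuming applications."""
    return fetch_recent_events(limit)
```
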
What do you need to succeed?
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 5+ years of proven experience in data engineering, delivering business‑critical software solutions for large enterprises with a consistent track record of success.
  • Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables, Databricks Runtime, cluster management, etc.)
  • Strong experience in Spark and PySpark for big data processing.
  • Strong experience in application development in Python, building and maintaining microservices.
  • Knowledge of SCM, Infrastructure‑as‑code, and CI/CD pipelines.
Nice to Have
  • Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Engineer).
  • Exposure to Kubernetes, Docker, and Terraform.
  • Strong understanding of business intelligence and reporting tools.
  • Familiarity with Cyber Security Concepts.
What’s in it for you?

We thrive on the challenge to be our best, on progressive thinking to keep growing, and on working together to deliver trusted advice that helps our clients thrive and our communities prosper. We care about each other, reaching our potential, making a difference in our communities, and achieving mutual success.

  • A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable.
  • Leaders who support your development through coaching and managing opportunities.
  • Ability to make a difference and lasting impact.
  • Work in a dynamic, collaborative, progressive, and high‑performing team.
  • A world‑class training program in financial services.
  • Opportunities to do challenging work, take on progressively greater accountabilities, and build close relationships with clients.
  • Access to a variety of job opportunities across business and geographies.
Job Skills
  • Big Data Management
  • Cloud Computing
  • Database Development
  • Databricks Platform
  • Data Engineering
  • Data Mining
  • Data Pipelines
  • Data Warehousing (DW)
  • ETL Development
  • ETL Processing
  • Group Problem Solving
  • Microservice Framework
  • Microsoft Azure Databricks
  • Python (Programming Language)
  • Quality Management
  • Requirements Analysis
Note

Applications will be accepted until 11:59 PM on the day prior to the final date to receive applications shown above.

Inclusion and Equal Opportunity Employment

At RBC, we believe an inclusive…

Position Requirements
10+ years of work experience