Job Description
What is the opportunity?
Are you a talented, creative, and results-driven professional who thrives on delivering high-performing applications? Come join us! Global Functions Technology (GFT) is part of RBC’s Technology and Operations division. GFT’s impact is far-reaching as we collaborate with partners from across the company to deliver innovative and transformative IT solutions. Our clients represent Risk, Finance, HR, CAO, Audit, Legal, Compliance, Financial Crime, Capital Markets, Personal and Commercial Banking, and Wealth Management. We also lead the development of digital tools and platforms to enhance collaboration.
We are looking for an MLOps Engineer to help design and build a production-grade machine learning pipeline for financial risk model training and inference. The pipeline will support model training, testing, and inference using Python and PySpark, on public cloud (AWS) and on-premises infrastructure.
This role is ideal for an engineer who combines Python programming, system design, and cloud engineering skills with a solid understanding of machine learning model lifecycle management from data preparation through training, validation, registration, and operational inference.
You’ll collaborate closely with data scientists, DevOps, and risk IT teams to build a reliable, automated, and auditable MLOps platform that meets enterprise standards for security, governance, and scalability.
What will you do?
Design and implement end-to-end reusable MLOps pipelines with a team of engineers to train, test, register, and deploy machine learning models.
Build and automate model lifecycle management workflows including versioning, promotion, approval, and deprecation.
Develop and integrate a model registry (e.g., MLflow, SageMaker Model Registry, or a custom solution) to manage model metadata, lineage, and reproducibility.
Orchestrate data and training workflows using tools such as Airflow, AWS Step Functions, Stonebranch, or Prefect.
Implement CI/CD pipelines using GitHub Actions, Jenkins, or AWS CodePipeline, ensuring consistent and automated deployment processes.
Build data preparation and training scripts in Python and PySpark, optimized for performance and scalability on AWS EMR, Cloudera Data Platform, or similar.
Manage model artifacts, dependencies, and environments across AWS and on-premises infrastructure.
Ensure strong observability and auditability through structured logging, metrics, and model performance tracking.
Collaborate with DevOps and data engineering teams to ensure secure integration, data governance, and production readiness.
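To give candidates a concrete sense of the lifecycle management work described above (versioning, promotion, approval, deprecation), here is a minimal Python sketch of a stage-transition check for registry entries. The stage names and transition rules are illustrative assumptions for this posting, not RBC’s actual governance model or any specific registry’s API.

```python
# Minimal sketch of a model promotion workflow: each registered model
# version moves through ordered stages, and only permitted transitions
# are allowed. Stage names and rules here are illustrative assumptions.
from dataclasses import dataclass, field

ALLOWED_TRANSITIONS = {
    "registered": {"staging"},
    "staging": {"approved", "deprecated"},
    "approved": {"production", "deprecated"},
    "production": {"deprecated"},
    "deprecated": set(),
}

@dataclass
class ModelVersion:
    name: str
    version: int
    stage: str = "registered"
    history: list = field(default_factory=list)  # audit trail of transitions

    def promote(self, target: str) -> None:
        """Move this version to a new stage, rejecting invalid transitions."""
        if target not in ALLOWED_TRANSITIONS[self.stage]:
            raise ValueError(f"{self.stage} -> {target} is not permitted")
        self.history.append((self.stage, target))
        self.stage = target

# Usage: a version must pass staging and approval before production,
# which keeps the audit trail required for model governance.
mv = ModelVersion("credit-risk-pd", version=3)
mv.promote("staging")
mv.promote("approved")
mv.promote("production")
```

In a real platform these transitions would be backed by a registry such as MLflow and gated by approval records, but the core idea is the same: promotion is an auditable, validated state change rather than an ad hoc deployment.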
Must Have:
Knowledge of AWS data and ML services (e.g., S3, EMR, Lambda, Step Functions, ECS/EKS, SageMaker, CloudWatch, IAM).
Understanding of model lifecycle management from training and testing to deployment, monitoring, and retraining.
Experience with CI/CD practices, using tools like GitHub Actions, Jenkins, or CodePipeline.
Familiarity with hybrid deployment environments (AWS and on-prem) and related networking/security considerations.
Knowledge of Python scripting for automation and ML workflow integration.
Knowledge of PySpark for distributed data processing and model training.
3+ years of experience in software engineering, data engineering, or MLOps.
1+ years of experience working with AWS components.
Experience working with containers and infrastructure automation.
Experience working with Linux systems, shell scripting, and environment management.
AWS Certified Cloud Practitioner.
Bachelor’s degree in computer science, engineering, data science, or a related quantitative or technical field.
AWS Certified Machine Learning Engineer - Associate, AWS Certified Solutions Architect - Associate, or AWS Certified SysOps Administrator - Associate.
Experience implementing model monitoring and drift detection.
Familiarity with distributed training and parallel compute frameworks (Ray, Spark, Dask).
Experience with feature stores, data lineage, or metadata tracking systems.
Exposure to financial risk modeling workflows.