Data Engineer - AI/ML
Job in Toronto, Ontario, C6A, Canada
Listed on 2026-02-28
Listing for:
Astra-North Infoteck Inc. ~ Conquering today’s challenges, achieving tomorrow’s vision!
Full Time position
Job specializations:
- IT/Tech: Data Engineer, AI Engineer, Data Analyst, Data Scientist
Job Description & How to Apply Below
Data Engineer
Toronto - Hybrid (3-4 days in office)
Primary Skill (Digital): Python; AI & GenAI - Products & Tools
The Data Engineer – Regulatory Reporting is responsible for designing, building, and maintaining data pipelines and AI-powered solutions that support regulatory reporting requirements. This role combines a strong data engineering foundation with emerging GenAI and ML technologies to ensure accurate, timely, and compliant reporting across the enterprise. The position also includes training on, and work with, the Axiom regulatory reporting tool, supporting automation and data quality efforts, and collaborating with technical and business stakeholders.
Key Responsibilities
- Design, develop, and maintain large-scale data pipelines and data architectures using Python.
- Integrate GenAI models (e.g., ChatGPT) to enhance data processing and reporting automation.
- Build scalable, reusable, and secure data solutions aligned with regulatory reporting needs.
- Perform data analytics to extract insights from large datasets.
- Support regulatory reporting teams with data investigations, validation, and root cause analysis.
- Develop and deploy AI/ML models using GenAI technologies, with a focus on NLP and machine learning.
- Apply models to streamline and enhance regulatory reporting workflows.
- Receive training on and work with the Axiom regulatory reporting tool.
- Integrate Axiom with existing data pipelines and support ongoing regulatory reporting requirements.
- Strong SQL knowledge required to effectively learn and use Axiom.
- Cohere model experience (nice to have): ability to leverage Cohere models for NLP use cases.
- Unix experience (nice to have): familiarity with Unix systems for automation and Axiom-related tasks.
- Troubleshoot data pipeline failures and data quality issues.
- Optimize data processing performance and ensure end-to-end data accuracy for reporting.
- Maintain clear and up-to-date documentation covering data pipelines, architectures, integration logic, and ML models.
- Support knowledge sharing across engineering and compliance teams.
- Stay current with trends in GenAI, AI/ML, Python engineering, regulatory reporting, and data tooling.
- Recommend and implement improvements to data quality, automation, and regulatory workflows.
Experience
- 4–6 years of experience in data engineering, with strong exposure to Python and GenAI technologies.
- Hands-on experience using ChatGPT or other GenAI models.
- Strong SQL expertise and experience working with large datasets.
Technical Skills
- Python, SQL, data pipeline engineering.
- Understanding of AI/ML fundamentals and NLP models.
- Experience with Unix (preferred).
- Familiarity with Cohere models (nice to have).
- Ability to analyze logs, troubleshoot issues, and support production pipelines.