Intermediate Data Architect with Power BI, Microsoft Fabric, and Databricks experience to support one of our major banking clients.
Location Address:
Hybrid
- Toronto – 3-4 days/week (Wednesday, Thursday, Friday)
Contract Duration: ASAP to 12/31/ – approx. 9 months (possibility of extension)
Schedule/Hours:
9am-5pm Monday-Friday; standard 37.5 hrs/week
Role:
Senior Data & BI Developer / Architect (Power BI, MS Fabric, Databricks)
Story Behind the Need
Business group:
Credit Risk Data & Analytics – Business Banking Credit Risk (BBCR) – Data & Analytics. The BBCR Data & Analytics team supports analytics and reporting across four major business lines (including International, GBM Commercial, and Wealth Management) for the corporate credit portfolio. The team has 7-8 members: business intelligence managers, PMs, architects, and analytics professionals.
Project:
We are looking for a skilled Senior Data & Business Intelligence Developer/Architect to drive data initiatives in Business Banking Credit Risk (BBCR).
The role focuses on modernizing BBCR's data architecture, including its data pipelines, semantic models, and reporting workflows, and on building scalable, cloud-native BI solutions using Power BI, Microsoft Fabric, and Databricks.
Typical Day in Role:
• Design, build, and optimize data pipelines using Power BI Premium, Microsoft Fabric, and Databricks.
• Develop semantic models, star schemas, tabular models, and clean data architecture for enterprise analytics.
• Build high‑quality Power BI datasets, dashboards, and reports aligned with business requirements.
• Implement data quality controls, lineage, metadata standards, and observability metrics.
• Collaborate with business partners, data engineers, risk analysts, and architects to support analytics needs.
• Manage workspace governance, performance tuning, and capacity optimization across Power BI Premium/Fabric.
• Contribute to BBCR’s BI platform modernization, including model redesign and data warehousing improvements.
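For illustration only (this is not part of the client's codebase), the Medallion-style pipeline and data-quality work described above follows a bronze → silver → gold pattern. The sketch below shows that pattern in plain Python; all table names, column names, and cleansing rules are hypothetical:

```python
# Hypothetical Medallion-style flow: raw "bronze" records are cleansed into
# "silver" with basic quality gates, then aggregated into a "gold" reporting
# table. In practice this would run on Databricks/Fabric over Delta tables.
from collections import defaultdict

def to_silver(bronze_rows):
    """Apply data-quality controls: reject rows missing keys, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("account_id") is None or row.get("exposure") is None:
            continue  # quality gate: drop incomplete records
        silver.append({
            "account_id": str(row["account_id"]).strip(),
            "business_line": (row.get("business_line") or "UNKNOWN").upper(),
            "exposure": float(row["exposure"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate exposure by business line for downstream reporting."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["business_line"]] += row["exposure"]
    return dict(totals)

bronze = [
    {"account_id": 1, "business_line": "International", "exposure": 100.0},
    {"account_id": 2, "business_line": None, "exposure": 50.0},
    {"account_id": None, "business_line": "Wealth", "exposure": 25.0},  # rejected
]
gold = to_gold(to_silver(bronze))
```

The same separation of raw ingestion, cleansing, and aggregation is what the role's pipeline, data-quality, and semantic-model responsibilities map onto.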
Candidate Value Proposition:
The successful candidate will:
• Work in a top 5 Canadian bank with high visibility across business, risk, data, and technology functions.
• Gain experience with leading modern data stack technologies (Fabric, Databricks, Lakehouse).
• Influence data models and architecture that support enterprise credit risk analytics.
• Work at the intersection of business strategy, analytics, and data engineering, impacting decisions across the corporate portfolio.
Candidate Requirements/Must Have
Skills:
1. 5-8+ years in Data Engineering, BI Development, or Data/BI Architecture roles.
2. 5-8+ years hands-on with Power BI (DAX, M, data modeling, Premium capacities).
3. 5-8+ years of advanced SQL (query optimization, stored procedures, performance tuning).
4. 5-8+ years working with Databricks (PySpark, Delta Lake, notebooks, workflow orchestration).
5. 5-8+ years with dimensional modeling (Kimball), semantic modeling, data governance, and star schemas.
6. 5-8+ years implementing Lakehouse architecture and modern data engineering patterns.
7. 5-8+ years working with CI/CD for analytics, version control (Git), and DevOps pipelines.
Nice-To-Have
Skills:
1) Microsoft Fabric (Data Engineering, Data Factory, Lakehouse, Warehouse, KQL).
2) Data lake file formats: Parquet, Delta, ORC, DuckDB.
3) Experience in Risk Analytics, Banking, or Credit Risk domains.
4) Familiarity with PowerShell, Power Platform (Power Apps, Power Automate), and Python.
5) Experience architecting and refining Lakehouse and Medallion-based data structures using Delta/Parquet/DuckDB formats.
6) Knowledge of Security, Privacy, and Compliance standards in financial institutions.
7) Experience working in enterprise environments with structured SDLC processes.
Education:
Bachelor's degree in a technical field
Data Analytics / Data Science certifications are an asset
Best vs. Average Candidate:
Best Candidate
Strong architect-level thinker with hands‑on Power BI + Databricks expertise.
Can design future-proof models and scalable BI assets.
Communicates clearly with both technical and non‑technical teams.
Strong balance of data modeling, governance, and modern analytics engineering.
Average Candidate
Can build dashboards but struggles with semantic modeling or data architecture.
Understands Databricks but lacks depth in PySpark or Lakehouse patterns.
Needs significant guidance on governance, model design, or performance tuning.
Candidate Review & Selection – Interview Process
2 rounds – in person at 40 King Street West – up to 45 minutes each
1st – with senior managers and the Director; may include a technical test on SQL
2nd – with senior managers and the Director; further evaluation