
Senior Data & BI Developer / Architect

Job in Toronto, Ontario, M5A, Canada
Listing for: Lancesoft
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Job Description
Position: Senior Data & BI Developer / Architect
Location Address:
Hybrid: Toronto, 3-4 days/week (Wednesday, Thursday, Friday)
Subject to change: 3-4 days onsite may be required based on business needs
Contract Duration: ASAP to 12/31/2026 (approx. 9 months)
Possibility of extension (on other projects) & conversion to FTE
Schedule/Hours:
9am-5pm, Monday-Friday; standard 37.5 hrs/week
Role:
Senior Data & BI Developer / Architect (Power BI, MS Fabric, Databricks)

Story Behind the Need
Business group:
Credit Risk Data & Analytics - Business Banking Credit Risk (BBCR) Data & Analytics. The BBCR Data & Analytics team supports analytics and reporting across four major business lines (including International, GBM Commercial, and Wealth Management) for the corporate credit portfolio. The team has 7-8 members: business intelligence managers, PMs, architects, and analytics staff.
Project:
We are looking for a skilled Senior Data & Business Intelligence Developer/Architect to drive data initiatives in Business Banking Credit Risk (BBCR). The role focuses on modernizing BBCR's data architecture, pipelines, semantic models, and reporting workflows, and on developing scalable, cloud-native BI solutions using Power BI, Microsoft Fabric, and Databricks.

Candidate Value Proposition:
The successful candidate will:


• Work in a top 5 Canadian bank with high visibility across business, risk, data, and technology functions.

• Gain experience with leading modern data stack technologies (Fabric, Databricks, Lakehouse).

• Influence data models and architecture that support enterprise credit risk analytics.

• Work at the intersection of business strategy, analytics, and data engineering, impacting decisions across the corporate portfolio.

Candidate Requirements/Must Have Skills:
1. 5+ years in Data Engineering, BI Development, or Data/BI Architecture roles.
2. 5+ years hands-on with Power BI (DAX, M, data modeling, Premium capacities).
3. 5+ years of advanced SQL (query optimization, stored procedures, performance tuning).
4. 5+ years working with Databricks (PySpark, Delta Lake, notebooks, workflow orchestration).
5. 5+ years with Dimensional modeling (Kimball), semantic modeling, data governance, star schemas.
6. 5+ years implementing Lakehouse architecture and modern data engineering patterns.
7. 5+ years working with CI/CD for analytics, version control (Git), and DevOps pipelines.
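To make skill #5 concrete, here is a minimal sketch of a Kimball-style star schema rollup, using plain Python dicts in place of warehouse tables. The table and column names (dim_business_line, fact_exposure, exposure_amt) are invented for illustration and do not come from the posting; in practice this would be a DAX measure or SQL aggregate over Power BI / Databricks tables.

```python
# Dimension table: one row per business line, keyed by a surrogate key.
dim_business_line = {
    1: {"name": "International"},
    2: {"name": "GBM Commercial"},
    3: {"name": "Wealth Management"},
}

# Fact table: one row per exposure, carrying a foreign key and a measure.
fact_exposure = [
    {"business_line_key": 1, "exposure_amt": 120.0},
    {"business_line_key": 2, "exposure_amt": 300.0},
    {"business_line_key": 1, "exposure_amt": 80.0},
]

def exposure_by_business_line(facts, dim):
    """Aggregate a fact measure by a dimension attribute (a star-schema rollup)."""
    totals = {}
    for row in facts:
        name = dim[row["business_line_key"]]["name"]
        totals[name] = totals.get(name, 0.0) + row["exposure_amt"]
    return totals

print(exposure_by_business_line(fact_exposure, dim_business_line))
# {'International': 200.0, 'GBM Commercial': 300.0}
```

The same fact-joins-dimension shape underlies the semantic models and star schemas the role calls for.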

Nice-To-Have Skills:
1) Microsoft Fabric (Data Engineering, Data Factory, Lakehouse, Warehouse, KQL).
2) Data Lake file formats: Parquet, Delta, ORC, DuckDB.
3) Experience in Risk Analytics, Banking, or Credit Risk domains.
4) Familiarity with PowerShell, Power Platform (Power Apps, Power Automate), Python.
5) Experience architecting and refining Lakehouse and Medallion-based data structures using Delta/Parquet/DuckDB formats.
6) Knowledge of Security, Privacy, and Compliance standards in financial institutions.
7) Experience working in enterprise environments with structured SDLC processes.
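The Medallion pattern mentioned above can be sketched in a few lines: raw records land in a "bronze" layer, are cleaned into "silver", and aggregated into "gold". This is a toy illustration in plain Python; in a real Lakehouse these layers would be Delta or Parquet tables managed in Databricks or Fabric, and the field names here are invented.

```python
# Bronze: raw ingested records, possibly dirty.
bronze = [
    {"id": "1", "amount": "100.5"},
    {"id": "2", "amount": None},   # bad record, dropped downstream
    {"id": "3", "amount": "49.5"},
]

# Silver: validated, typed records (rows failing validation are dropped).
silver = [
    {"id": int(r["id"]), "amount": float(r["amount"])}
    for r in bronze
    if r["amount"] is not None
]

# Gold: a business-level aggregate ready for a BI semantic model.
gold = {
    "record_count": len(silver),
    "total_amount": sum(r["amount"] for r in silver),
}

print(gold)  # {'record_count': 2, 'total_amount': 150.0}
```

Each layer only reads from the one before it, which is the property that makes Medallion pipelines easy to reprocess and govern.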

Soft Skills Required:

• Strong communication with ability to translate technical details for business stakeholders.

• Proven ability to work independently in a fast-paced, multi-project environment.

• Problem solver mindset with focus on quality, accuracy, and analytical rigor.

• Collaborative team player who contributes to architectural discussions and recommendations.

Education:
Bachelor's degree in a technical field
Data Analytics / Science certifications an asset

Best vs. Average Candidate:
Best Candidate
• Strong architect-level thinker with hands-on Power BI and Databricks expertise.
• Can design future-proof models and scalable BI assets.
• Communicates clearly with both technical and non-technical teams.
• Strong balance of data modeling, governance, and modern analytics engineering.
Average Candidate
• Can build dashboards but struggles with semantic modeling or data architecture.
• Understands Databricks but lacks depth in PySpark or Lakehouse patterns.
• Needs significant guidance on governance, model design, or performance tuning.

Candidate Review & Selection - Interview Process
2 rounds, in person, at 40 King Street West; up to 45 minutes each
1st: with senior managers and Director; could include a technical test on SQL
2nd: with senior managers and Director; further evaluation

Hiring Manager’s availability to interview: ASAP

NOTE: Requests for accommodation put forward by Suppliers will be reviewed.
Position Requirements:
10+ years of work experience