
Local to WA - Azure Data Analyst/Engineer with Fabric

Job in Bellevue, King County, Washington, 98009, USA
Listing for: E-Solutions
Full Time position
Listed on 2026-02-18
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range or Industry Benchmark: 80,000 - 100,000 USD yearly
Job Description & How to Apply Below
Position: Local to WA only - Azure Data Analyst/Engineer with Fabric


Bellevue, Washington (Hybrid)

Mandatory Skills

Microsoft Fabric - Warehousing, Microsoft Fabric - Data Engineering

Requirements Gathering & Solution Design

Engage with business stakeholders to understand analytical, operational and compliance needs

Convert business requirements into functional designs, source-to-target mappings, transformation logic and technical specifications

Validate requirements against enterprise data models and recommend architecture patterns: Lakehouse, Warehouse, Real-Time Hub

Data Modeling & Fabric Semantic Layer

Design, build and govern Fabric Semantic Models across Direct Lake, Import, DirectQuery and Hybrid modes

Define enterprise-wide canonical models: shared dimensions, hierarchies, KPIs and reusable DAX measures

Optimize semantic models for performance using aggregations, incremental refresh and partitioning strategies

Enable certified datasets, semantic governance and row-level security within Fabric

ETL / ELT Engineering with Fabric Pipelines, Dataflows, Notebooks

Build ingestion and transformation processes using Data Factory pipelines, Dataflows (Gen2), Warehouse pipelines and PySpark notebooks

Maintain metadata‑driven ETL patterns and reusable frameworks for ingestion, harmonization and transformation
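The "metadata-driven ETL patterns" called out above can be sketched as follows: each source-to-target mapping is declared as data, and one generic routine applies any mapping, so onboarding a new feed means adding metadata rather than code. This is a minimal plain-Python illustration with hypothetical table and column names, not Fabric-specific code; in practice the same pattern would drive Data Factory pipelines or PySpark notebooks.

```python
# A minimal metadata-driven ETL sketch. All entity and column names
# (crm_customers, dim_customer, cust_id, ...) are hypothetical examples.
MAPPINGS = [
    {
        "source": "crm_customers",           # hypothetical source entity
        "target": "dim_customer",            # hypothetical target table
        # source column -> target column renames
        "columns": {"cust_id": "customer_key", "cust_name": "customer_name"},
        # per-target-column transformation logic
        "transforms": {"customer_name": str.strip},
    },
]

def run_mapping(mapping, source_rows):
    """Apply one source-to-target mapping: rename columns, then transform."""
    out = []
    for row in source_rows:
        new_row = {tgt: row[src] for src, tgt in mapping["columns"].items()}
        for col, fn in mapping.get("transforms", {}).items():
            new_row[col] = fn(new_row[col])
        out.append(new_row)
    return out
```

Because the mapping is plain data, the same harmonization framework can be reused across every ingestion feed and validated against the source-to-target specifications.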

Fabric Notebook Engineering

Use Fabric Notebooks to perform:

  • PySpark transformations
  • Data validation and data quality checks
  • ML feature engineering and lightweight model operations
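The data validation and data quality checks above might look like the following sketch. It is plain Python for illustration (column names are hypothetical); in a Fabric notebook the same checks would typically run as PySpark over Lakehouse tables.

```python
# Minimal data-quality checks of the kind a validation notebook runs.
# Each check returns the indices of failing rows; an empty list means pass.

def check_not_null(rows, column):
    """Completeness check: rows where the column is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_in_range(rows, column, lo, hi):
    """Validity check: rows whose value falls outside [lo, hi]."""
    return [
        i for i, row in enumerate(rows)
        if row.get(column) is not None and not (lo <= row[column] <= hi)
    ]
```

Check results can then be written back to a quality table and surfaced in monitoring dashboards, or used to gate a pipeline run before data is promoted downstream.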

Automate notebook execution via pipelines, triggers and Fabric scheduling

Integrate notebooks with Lakehouse tables, Warehouse tables and ML model outputs

Near-Real-Time (NRT) Data Processing in Fabric

Design and implement near-real-time data ingestion pipelines using:

  • Fabric Real-Time Hub
  • Event Streams
  • KQL Databases
  • Streaming Dataflows

Build streaming transformations and real‑time analytical models leveraging Kusto Query Language (KQL) and PySpark Structured Streaming

Ensure low‑latency ingestion to Lakehouse/Warehouse for downstream consumption

Optimize real‑time workloads for durability, recovery and performance under high throughput

Build dashboards and semantic models that support near‑real‑time refresh scenarios

Performance Optimization

Optimize SQL queries, Lakehouse Delta tables, semantic models, DAX expressions and Power BI datasets

Review and tune pipeline throughput, notebook execution performance and refresh schedules

Improve Direct Lake performance by optimizing storage layouts, file-size distributions and column layouts

Perform workload monitoring using Fabric capacity metrics and logs

Build high‑quality Power BI dashboards and enterprise reports integrated with centralized semantic models

Develop DAX calculations, KPIs, UX/UI standards, drillthroughs and row‑level security

Drive semantic model reuse and promote governed gold datasets

Governance, Security & Compliance (Purview Integration)

Implement data governance and cataloging using Microsoft Purview for Fabric assets

Manage lineage tracking, glossary management, classification and metadata enrichment

Define enterprise security controls: RBAC, masking, PII handling, encryption, retention policies

Ensure compliance with GDPR, CCPA, HIPAA, SOX and internal audit controls

Govern Fabric workspace structure, capacity usage, data certification processes and lifecycle management

Technical Leadership & Program Delivery

Lead data engineers, BI developers and analysts across multiple initiatives

Review designs, source-to-target mappings (STTMs), code, semantic models and performance benchmarks

Own sprint planning, estimation, milestone tracking and stakeholder communication

Promote documentation, technical standards, reusable design frameworks and automation

Required Skills & Experience

Technical Expertise

8 years of data engineering or BI experience, with 2 years in Microsoft Fabric

Strong hands‑on experience with

  • Fabric Lakehouses, Warehouses
  • Fabric Semantic Models (Direct Lake, Import, Hybrid)
  • Real Time Hub, Event Streams, KQL Databases
  • Notebooks, PySpark data transformations, optimization
  • Data Factory pipelines, Dataflows (Gen2)

Solid understanding of Delta Lake, Spark performance tuning and workload optimization

Expertise in implementing source‑to‑target mappings, transformation logic and validation rules

Preferred Skills

Knowledge of DataOps, CI/CD, Git, DevOps pipelines and unit testing

Familiarity with Fabric AI Copilot for Power BI and AI‑driven engineering accelerators

Soft Skills

Excellent communication, articulation and stakeholder management

Strong leadership skills and the ability to mentor others

Problem‑solving mindset with a focus on scalability, efficiency and accuracy
