Senior Data Engineer (Fabric)

Job in Bradford, West Yorkshire, England, UK
Listing for: FGH (Freemans Grattan Holdings)
Full Time position
Listed on 2026-01-30
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Cloud Computing, Data Analyst
Job Description
Position: Senior Data Engineer (Fabric)

FGH Business Centre, 66-70 Vicar Ln, Bradford BD1 5AJ

Hybrid flexibility: 2 office days per week

About the Role

At FGH, data is not a reporting afterthought - it is a strategic enabler of growth, efficiency, and AI-driven decision-making. As a Senior Data Engineer, you will play a pivotal role in transforming how data is designed, delivered, and consumed across a digital retail organisation, shaping and operationalising our modern data and AI platform while working at the intersection of architecture, engineering, and innovation.

This role offers the opportunity to build at scale, influence platform direction, and directly enable advanced analytics and AI use cases across the business.

You’ll work closely with our Data Architect, BI teams, and cross-functional data product owners to build a next-generation data platform using Microsoft Fabric and Lakehouse architecture.

Accountabilities

Solution Delivery
  • You will work with the FGH business to design and build end-to-end ETL data solutions using Microsoft Fabric, as well as real-time data processing.
  • You will design and build data pipelines using Dataflow Gen2 and Fabric Notebooks (Spark SQL & Python) to ingest and transform data from on-prem and third-party data solutions.
  • Design and develop Data Lakehouses using Medallion architecture standards (see the sketch after this list).
  • Implement Semantic Layers using Star Schema Modelling (Kimball) & DAX in collaboration with the BI Team/Lead.
  • Deploy versioned artifacts using DevOps CI/CD.
  • Support data product teams with reusable components and infrastructure.
  • Optimise data storage and retrieval for analytics, AI, and operational use cases.
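
For illustration only, the sketch below shows the kind of bronze-to-silver Medallion transform a Fabric Notebook (PySpark) in this role might contain; the Spark session is supplied by the Fabric notebook runtime, and the table and column names (bronze.orders_raw, order_id, order_value) are hypothetical rather than taken from this posting.

    from pyspark.sql import SparkSession, functions as F

    # In a Fabric notebook the Spark session is provided automatically;
    # getOrCreate() reuses it (or creates one when run elsewhere).
    spark = SparkSession.builder.getOrCreate()

    # Read raw orders from the bronze Lakehouse table (names are illustrative).
    bronze = spark.read.table("bronze.orders_raw")

    # Cleanse and conform on the way into silver: de-duplicate on the business key,
    # drop rows without a key, standardise types, and stamp an audit column.
    silver = (
        bronze
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
        .withColumn("order_date", F.to_date("order_date"))
        .withColumn("order_value", F.col("order_value").cast("decimal(18,2)"))
        .withColumn("_ingested_at", F.current_timestamp())
    )

    # Persist as a Delta table for downstream gold models and the semantic layer.
    silver.write.mode("overwrite").format("delta").saveAsTable("silver.orders")

In practice a transform like this would be parameterised and promoted through environments via the DevOps CI/CD pipelines mentioned above.
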
Data Governance & Compliance
  • Embed data quality, lineage, and observability into all pipelines.
  • Support metadata management and data cataloguing initiatives.
  • Ensure compliance with data protection standards (e.g., GDPR, ISO 27001).
  • Collaborate with Info Sec and Risk teams to implement secure data handling practices.
Data Integration & Management
  • Integrate data from internal and third-party sources (e.g., CRM, ERP, APIs).
  • Ensure consistency, interoperability, and performance across data flows.
  • Monitor and troubleshoot pipeline health and data reliability (see the sketch after this list).
  • Support real-time and batch processing environments.
  • Apply engineering principles that support modularity, scalability, and resilience.
  • Automate deployment, testing, and monitoring of data pipelines.
  • Contribute to platform sustainability and energy efficiency.
  • Align engineering practices with enterprise architecture and business goals.
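
As a rough, hypothetical illustration of the monitoring point above, the sketch below runs a simple row-count and null-key health check before data is published; the table and column names (silver.orders, order_id) are placeholders, not details from this posting.

    import logging

    from pyspark.sql import SparkSession, functions as F

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pipeline_checks")

    # The Fabric notebook runtime supplies a Spark session; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    def check_table_health(table_name: str, key_column: str, min_rows: int = 1) -> bool:
        """Run row-count and null-key checks against a Lakehouse table and log the outcome."""
        df = spark.read.table(table_name)
        row_count = df.count()
        null_keys = df.filter(F.col(key_column).isNull()).count()

        healthy = row_count >= min_rows and null_keys == 0
        log.info("table=%s rows=%d null_keys=%d healthy=%s",
                 table_name, row_count, null_keys, healthy)
        return healthy

    # Fail the run early if the (illustrative) silver orders table looks unhealthy,
    # so alerting can fire instead of bad data reaching downstream consumers.
    if not check_table_health("silver.orders", "order_id"):
        raise RuntimeError("Data quality check failed for silver.orders")

Checks like this are typically wired into automated testing and alerting for each pipeline rather than run by hand.
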
Relationship Management
  • Collaborate with data architects, analysts, and business stakeholders.
  • Engage with platform teams to ensure infrastructure readiness.
  • Support data product teams with technical enablement and onboarding.
  • Evaluate and manage third-party data tools and services.
Personal & Professional Development
  • Stay current with emerging data engineering tools and cloud services.
  • Pursue relevant certifications and continuous learning (Fabric Data Engineer - DP-700).
  • Contribute to knowledge sharing and mentoring within the data community.
  • Promote a culture of data reliability, automation, and innovation.
About You
  • Able to commute to Bradford City Centre
  • A relevant computing degree or Microsoft certification, e.g. DP-700:
    Implementing Data Engineering Solutions Using Microsoft Fabric
  • Evidence of formal training, certification, or several years of experience in SQL, Python, or Spark. Familiarity with data mapping frameworks.
Engineering & Technical Skills
  • Data Pipeline Development:
    Proven experience designing and building scalable ETL/ELT pipelines.
  • Data Platform:
    Direct experience working with the Microsoft Fabric platform (Dataflow Gen2, Notebooks & Semantic Models) and storage solutions (e.g., Data Lake, Delta Lake).
  • Programming & Scripting:
    Proficiency in SQL and Python for data manipulation, transformation, and automation; familiarity with Spark SQL.
  • Data Integration:
    Experience integrating structured and unstructured data from diverse sources including APIs, flat files, databases, and third-party platforms (e.g., CRM, ERP).
  • Data Observability & Quality:
    Ability to implement monitoring, logging, and alerting for data pipelines.
  • Abi…
Position Requirements
10+ years of work experience