
Data Engineer

Job in Birmingham, West Midlands, B1, England, UK
Listing for: TXP
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Cloud Computing
Salary/Wage Range: GBP 60,000 - 80,000 per year
Job Description

We are TXP. We help businesses and organisations move forward, at pace, and we believe in the transformative power of combining technology and people. By providing consulting expertise, development services and resourcing, we work closely with organisations to solve their most complex business problems.

Our work transforms organisations – and we take that responsibility seriously. We focus on success, pursue excellence and take ownership of everything we do.

We are seeking a Microsoft Fabric Data Engineer to join our technology function. You'll be joining a collaborative team, co-creating tools, supporting each other, providing governance, and building a community. The successful candidate will partner with Public and Private sector clients to deliver end‑to‑end data engineering solutions on Microsoft Fabric, from ingesting and transforming raw data to shaping curated Lakehouse layers that power analytics and reporting.

You will transform complex data estates into clean, governed and high‑performing platforms that give clients confidence in the insights they rely on.

We are proud of our culture and values, which guide everything we do:

  • Client Focus – We put our clients at the heart of every decision.
  • Adaptability – We embrace change and thrive in dynamic environments.
  • Responsibility – We take ownership and deliver with integrity.
  • Excellence in Delivery – We are committed to delivering outstanding results.
  • Success & Celebration – We celebrate achievements and learn from every experience.

As a Data Engineer you will embody these values in your leadership, decision‑making, and client interactions.

Key Responsibilities:

ETL and Data Pipeline Development

  • Design and build scalable ETL and ELT pipelines in Microsoft Fabric using PySpark in Fabric Notebooks and Dataflows Gen2 to ingest and transform data from ERP and business applications.
  • Implement robust ingestion patterns and orchestration using Fabric Data Factory capabilities, handling varied formats and refresh frequencies while writing curated outputs to Lakehouse tables.
  • Develop transformation logic that standardises, cleanses and harmonises data across business units, publishing to managed Delta tables for downstream analytics.
  • Apply incremental load and near real-time replication strategies, including Mirroring where appropriate, to optimise runtime and latency for operational reliability.
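In practice, the incremental-load strategy above often reduces to tracking a high-water mark per source table: only rows modified since the last successful run are ingested. A minimal sketch of that idea in plain Python (in a Fabric Notebook this logic would typically be expressed in PySpark against Delta tables; the record schema and field names here are illustrative, not a real Fabric API):

```python
from datetime import datetime, timezone

def incremental_rows(source_rows, watermark):
    """Return only rows modified after the last successful load,
    plus the new watermark to persist for the next run.
    `source_rows` is an iterable of dicts with a 'modified_at'
    timestamp (illustrative schema)."""
    new_rows = [r for r in source_rows if r["modified_at"] > watermark]
    new_watermark = max((r["modified_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# Example: a run picking up only changes since 2 January.
rows = [
    {"id": 1, "modified_at": datetime(2026, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified_at": datetime(2026, 1, 5, tzinfo=timezone.utc)},
]
batch, wm = incremental_rows(rows, datetime(2026, 1, 2, tzinfo=timezone.utc))
```

Persisting the returned watermark (for example in a control table) is what makes reruns idempotent and keeps pipeline runtime proportional to the change volume rather than the full table size.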

Integration and Quality

  • Integrate data from diverse sources into OneLake and the Lakehouse, enforcing data quality checks, validation rules and reconciliation steps in pipelines and notebooks to maintain accuracy and integrity.
  • Build error handling, logging and monitoring into pipelines, and document lineage using Fabric’s item lineage and the OneLake catalogue to support operational reliability and auditability.
  • Collaborate with stakeholders to translate business data requirements into transformation rules and curated models that meet reporting and analytics needs across Fabric workloads.
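The quality checks described above are commonly expressed as named rule functions applied per record before rows are promoted to curated tables, so that failures can be logged and reconciled. A hedged sketch in plain Python (in Fabric such rules would usually run inside notebooks or Dataflows; the rule names and record schema are invented for illustration):

```python
def check_rules(record, rules):
    """Apply named validation rules to one record; return the list of
    rule names that failed (an empty list means the record is clean)."""
    return [name for name, rule in rules.items() if not rule(record)]

# Illustrative rules for a hypothetical sales record.
rules = {
    "has_id": lambda r: r.get("order_id") is not None,
    "non_negative_amount": lambda r: r.get("amount", 0) >= 0,
}

good = {"order_id": "A-1", "amount": 10.0}
bad = {"order_id": None, "amount": -5.0}
```

Keeping the rules as data (a dict of callables) rather than inline conditionals makes it straightforward to log which rule failed for which record, feeding the error-handling and monitoring requirements above.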

Lakehouse Development and Performance

  • Design and maintain medallion architecture in the Microsoft Fabric Lakehouse on OneLake with bronze, silver and gold layers, promoting clear contracts and progressive data quality.
  • Optimise storage formats, partitioning and table maintenance for Delta performance, applying practices such as V‑Order and efficient file sizing to improve query speed and cost effectiveness.
  • Prepare gold layer datasets for downstream analytics with Direct Lake connected semantic models to deliver high performance BI without heavy refresh cycles.
  • Create reusable frameworks and templates for common ingestion and transformation patterns to accelerate team delivery in Fabric.
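The bronze-to-silver step in a medallion architecture is, at its core, standardise-and-cleanse: raw records land untouched in bronze, and silver applies consistent typing, trimming and casing. A minimal illustration of that harmonisation logic in plain Python (in Fabric this would be PySpark writing to managed Delta tables; the column names are hypothetical):

```python
def to_silver(bronze_row):
    """Standardise one raw bronze record into the silver-layer shape:
    trimmed strings, consistent casing, typed amounts."""
    return {
        "customer": bronze_row["customer"].strip().title(),
        "country": bronze_row["country"].strip().upper(),
        "amount": round(float(bronze_row["amount"]), 2),
    }

# A raw record as it might arrive from a source system.
raw = {"customer": "  jane doe ", "country": "gb ", "amount": "19.99"}
clean = to_silver(raw)
```

Because every silver record passes through the same function, downstream gold datasets and semantic models can rely on one agreed contract per column, which is what "clear contracts and progressive data quality" means in practice.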

Collaboration, Governance and Continuous Improvement

  • Work with data leaders to implement standards for workspace design, security and lifecycle management across Fabric capacities, using OneLake and Fabric governance features.
  • Contribute to code reviews and CI/CD practices for Fabric items and notebooks, aligning with Microsoft’s analytics engineering guidance and development lifecycle.
  • Support testing activities including unit, integration and user acceptance testing for pipelines, notebooks and data models,…