Data Engineer

Job in London, Greater London, W1B, England, UK
Listing for: Chambers & Partners
Full Time position
Listed on 2026-01-22
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 GBP yearly
Job Description
Location: Greater London

Overview

We’re looking for a mid-level Data Engineer to design, build, and deploy high-quality data solutions across Chambers’ products, platforms, and applications, ensuring they meet data engineering best practices and quality standards. In this role, you will champion engineering excellence and act as a subject matter expert for data-related projects, ensuring performance, scalability, and compliance with standards across the data engineering team.

Equal Opportunity Statement

We are committed to fostering and promoting an inclusive professional environment for all of our employees, and we are proud to be an equal opportunity employer. Diversity and inclusion are integral values of Chambers and Partners and are key to our culture. We are committed to providing equal employment opportunities for all qualified individuals regardless of age, disability, race, sex, sexual orientation, gender reassignment, religion or belief, marital status, or pregnancy and maternity.

This commitment applies across all of our employment policies and practices, from recruiting and hiring to training and career development. We support our employees through our internal INSPIRE committee, with Executive Sponsors, Chairs and Ambassadors throughout the business promoting knowledge and effecting change.

Applicants who identify as Disabled and/or Neurodiverse will be entitled to an interview if they meet the minimum criteria specified in the Job Description; additionally, we will offer reasonable adjustments to those who require them. Some examples of reasonable adjustments are extra time in assessments, video interviews to address travel-related issues, and advice on expected interview topics/questions.

Main Duties and Responsibilities
  • Write clean and testable SQL and Python code to enable our customer data products and business applications
  • Build and manage data pipelines and notebooks, deploying code in a structured, trackable and safe manner
  • Effectively create, optimise and maintain automated systems and processes across a given project or technical domain
  • Analyse, profile and plan work, aligned with project priorities
  • Perform reviews of code, refactoring where necessary
  • Document your data developments and operational procedures
  • Ensure adherence to data and software delivery standards and support effective delivery.
  • Help monitor, troubleshoot and resolve production data issues when they occur
  • Contribute to the continuous improvement of the team
  • Contribute to the team’s ability to make and deliver on their commitments
  • Innovate and experiment with technology to deliver real business benefits.
  • Regularly launch products and services based on your work and be an integral part of making these a success.
  • Guide, influence and challenge the technology team and stakeholders to understand the benefits, pros and cons of various technical options.
  • Guide and mentor less experienced developers assigned on projects.
  • Promote an innovative thinking process and encourage it in others.
  • Work within the agile framework at Chambers
Skills and Experience
  • Strong proficiency in SQL, including Spark SQL and MS SQL Server, for querying, data manipulation, and performance optimization.
  • Hands-on experience with Python and PySpark for data processing, transformation, and automation tasks.
  • Proven ability to design, build, and maintain scalable data pipelines for batch and streaming data using modern frameworks.
  • Experience working with Databricks for big data processing, Spark‑based transformations, and collaborative analytics workflows.
  • Familiarity with Azure Data Factory (ADF) for data orchestration and integration across cloud environments.
  • Skilled in version control using GitHub and implementing CI/CD pipelines in Azure DevOps for data engineering workflows.
  • Knowledge of data modeling and schema design to support efficient analytics and reporting.
  • Understanding of cloud‑based data platforms (e.g., Azure, AWS, GCP) and integration with modern data pipelines.
  • Ability to monitor, troubleshoot, and optimize pipeline performance, ensuring data quality and reliability.
  • Experience collaborating with dat…