Data Engineering Lead

Job in Greater London, W1B, England, UK
Listing for: 9fin
Full Time position
Listed on 2026-02-28
Job specializations:
  • Software Development: Data Engineer
Salary: 80,000 – 100,000 GBP per year
Job Description & How to Apply Below
Location: Greater London

About 9fin

9fin is the AI platform powering global debt markets — the world’s largest asset class at over $145 trillion.

Debt markets are vast, global, and mission-critical, yet still run on fragmented data, PDFs, and manual workflows. 9fin replaces this broken infrastructure with a single platform that centralises proprietary credit data, deep analysis, and high-value workflows across global markets.

Today, 9fin powers teams at 300+ blue‑chip institutions worldwide, including global banks, asset managers, private equity firms, law firms, and advisors. The business is scaling at exceptional speed, with rapid expansion in the US and best‑in‑class retention driven by deep workflow adoption.

We’re at a defining inflection point. With proven product‑market fit and strong, global market pull, 9fin is accelerating toward becoming the category‑defining platform for debt markets worldwide.

What you’ll work on

As our Data Engineering Lead, you’ll own the technical roadmap for our data platform and raise the bar on reliability, scalability, and engineering quality. This is a hands‑on, player/coach role.

  • Own the data platform roadmap: Set technical direction and deliver the highest‑leverage platform improvements across reliability, cost, developer experience, and scale. You’ll choose, prioritise, and deliver the next 12 months of platform work.

  • Deliver customer‑facing APIs & feeds: Support the design and delivery of API‑backed feeds and enrichment pipelines that become product features and revenue streams.

  • Level up orchestration in Dagster: Build and refine asset‑based pipelines, sensors, schedules, backfills, IO managers, and monitoring patterns that are robust, idempotent, and easy to operate. Define standard patterns for incremental jobs, full refreshes, and reverse ETL.

  • Make ingestion boring (in the best way): Improve and scale ingestion across Airbyte OSS and DLT, handling schema drift, connector health, rate limits, retries, checkpointing, and operational resilience so pipelines run without heroics.

  • Strengthen BigQuery foundations: Own and evolve our best practices, partitioning & clustering strategy, slot/cost management, and query performance guardrails.

  • Raise data quality & observability: Implement freshness SLOs, automated checks, validation, provenance, and alerting so the business and customers can trust the data. Ship runbooks, incident playbooks, and automated remediation where possible.

  • Enable customer pipelines & reverse ETL: Own correctness, availability, and SLAs for customer‑facing workflows, including schema contracts and safe rollouts.

  • Infrastructure as Code & CI: Own Terraform modules, CI/CD flows for both infra and data code, and deployment safety gates that prevent costly mistakes.

  • Coach and grow the team: Line‑manage and mentor engineers, raise standards through code reviews and testing, and build a high‑performance engineering culture that values operational excellence.
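
Several of the bullets above come back to the same property: pipelines that are idempotent, checkpointed, and retry‑safe, so a rerun or a transient failure never corrupts data. A rough stdlib‑only sketch of that pattern (not 9fin’s actual code; the function names and checkpoint scheme are invented for illustration):

```python
import json
import tempfile
import time
from pathlib import Path

# Checkpoint lives in a temp file for this sketch; production code would
# use durable storage (a database row, an object-store key, etc.).
CHECKPOINT = Path(tempfile.gettempdir()) / "ingest_checkpoint.json"

def load_checkpoint():
    # Resume from the last committed cursor so reruns are idempotent.
    if CHECKPOINT.exists():
        return json.loads(CHECKPOINT.read_text())["cursor"]
    return None

def save_checkpoint(cursor):
    # Commit progress only after a page is fully processed.
    CHECKPOINT.write_text(json.dumps({"cursor": cursor}))

def ingest(fetch_page, process, max_retries=3):
    """Pull pages from a cursor-paginated source with retries + checkpoints.

    `fetch_page(cursor)` returns {"records": [...], "next_cursor": ...},
    or None once the source is exhausted.
    """
    cursor = load_checkpoint()
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break
            except (TimeoutError, ConnectionError):
                # Exponential backoff on transient failures / rate limiting.
                time.sleep(2 ** attempt)
        else:
            raise RuntimeError("fetch failed after retries; checkpoint preserved")
        if page is None:
            return  # source exhausted; the next run resumes from the checkpoint
        process(page["records"])
        save_checkpoint(page["next_cursor"])
        cursor = page["next_cursor"]
```

The key design choice is committing the checkpoint only after a page is fully processed: a crash mid‑page means the page is re‑fetched, never skipped.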

About You

We’re looking for a technical, high‑ownership Data Engineering Lead who can ship, operate and improve production systems — and translate platform work into real product and customer outcomes.

  • Hands‑on data engineering leader: You’ve owned production data platforms end‑to‑end and can lead technical direction without stepping away from the code.

  • Dagster expertise: Demonstrable experience building asset‑based orchestration (assets, sensors, IO managers, resources) in production.

  • Warehouse & SQL depth: Advanced BigQuery skills: partitioning, clustering, materialised views, slot/cost management, and query optimisation.

  • Ingestion reality: You’ve operated Airbyte or similar, built resilient API/FTP/SFTP ingestion patterns, and handled rate limits, pagination, partial failures, and schema evolution.

  • Production API & feeds experience: You’ve shipped customer‑consumable APIs or feeds (SLAs/contracts, monitoring, provenance) and can design stable downstream contracts and rollout plans.

  • Python & engineering rigour: You write clean, testable Python (typing/pydantic where appropriate), author unit and integration tests, and enforce CI and linting standards.

  • Multi‑cloud & IaC: Comfortable with AWS (ECS/Fargate, RDS, S3) and GCP (BigQuery, GCS),…
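
The “Python & engineering rigour” bullet asks for typed, testable code that validates data at the boundary. A minimal stdlib sketch of that style (pydantic would be the natural production choice; the schema and field names here are invented for illustration):

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class BondRecord:
    """A typed record for one ingested row (illustrative schema only)."""
    isin: str
    coupon_pct: float
    maturity: date

    def __post_init__(self) -> None:
        # Validate at the boundary so bad rows fail fast, not downstream.
        if len(self.isin) != 12:
            raise ValueError(f"ISIN must be 12 characters: {self.isin!r}")
        if not 0 <= self.coupon_pct <= 100:
            raise ValueError(f"coupon out of range: {self.coupon_pct}")

def parse_row(raw: dict) -> BondRecord:
    # Explicit conversion: strings from a feed become typed values here.
    return BondRecord(
        isin=raw["isin"].strip().upper(),
        coupon_pct=float(raw["coupon"]),
        maturity=date.fromisoformat(raw["maturity"]),
    )
```

The point of the pattern is that anything downstream of `parse_row` can rely on the types and invariants, which is what makes the rest of a pipeline straightforward to unit‑test.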
