
Software Engineer, Data Engineer

Job in Greater London, W1B, England, UK
Listing for: Modo Energy Limited
Full Time position
Listed on 2026-02-28
Job specializations:
  • Software Development
    Software Engineer, Data Engineer
Job Description
Location: Greater London

At Modo Energy, we're building the global standard for benchmarking and valuing the world's electrification assets - unlocking returns for battery energy storage, solar, wind and data centres.

We're on a mission to build the information architecture for the energy transition - we want to be the only place to come to for information on the global journey to net zero. Take a look at our platform, where we provide open access to an array of content on the energy transition.

We're a dedicated and passionate team building a category-defining business, working on one of the world's most important priorities. We are looking for individuals who love product-building, want to work with pace at a mission-oriented startup, and will collaborate with us in shaping the culture of a rapidly growing team.

The role

The Data team is the foundational layer of the Modo platform. We ingest energy market data from providers across the globe, process it through our data lakehouse, and make it available to every team and product. We operate across the full data stack, primarily in Python: ELT pipelines, distributed processing, real‑time streaming, and an API that serves our Terminal application, external customers, and our AI agent.

We're looking for a Backend Engineer to help us get our data in front of users. We've built the pipeline framework that lets our research teams create their own datasets, and now our focus is shifting to how we present this data. We need to build a data‑presentation layer that will support our products and, ultimately, the broader energy industry. That means interactive plotting tools in the Modo Terminal, MCP servers, AI agents, direct data lake access, and the semantic layers that let users discover our data.

You'd be joining a small team with full ownership of the engineering stack and product decisions, which means your technical choices will matter and the standards you set will shape how the platform grows.

We are an AI‑native team. We expect everyone at Modo to make AI tools a core part of how they work, and that expectation will only grow. If you're already working this way, or genuinely excited to, you'll fit right in.

Responsibilities
  • API Design and Delivery: Design and build stable, well‑documented APIs that internal teams, paying customers, and our AI product depend on. Backwards compatibility, versioning, and developer experience matter here.
  • Data Presentation Layer: Build the query and response layers that make complex energy market data fast and accessible, thinking carefully about latency, caching, and how data needs to be shaped for different consumers.
  • Full Lifecycle Ownership: Work with the product team to decide what to build and how to measure success, scope and implement the work, deploy it, and keep it running. You own what you ship.
  • Pipeline and Data Architecture: You won't be building pipelines day‑to‑day, but you'll contribute to how our pipeline platform evolves and maintain the infrastructure it runs on.
  • Infrastructure and Dev Ops: Write and maintain Terraform, manage cloud infrastructure, and monitor platform health and API performance as a normal part of your working week.
  • Code Quality: Write well‑tested, maintainable code and contribute to engineering culture through thoughtful code review and clear documentation of design decisions.
Qualifications

Required experience:

  • 4+ years of professional software development experience with a strong track record of delivering production‑quality systems.
  • A track record of designing and shipping external‑facing APIs, with real attention to versioning, backwards compatibility, and OpenAPI/REST standards.
  • Hands‑on, day‑to‑day experience with infrastructure‑as‑code and cloud infrastructure. We build on AWS with Terraform, but equivalent experience is fine.
  • Genuine excitement about making AI tools a core part of how you work.

Experience with any part of our data stack would be beneficial:
Django REST Framework, Apache Airflow, Apache Spark, Apache Iceberg, Apache Kafka, Terraform, and AWS (EMR, ECS). We manage much of this ourselves, so experience running your own Airflow cluster or working with Spark at the infrastructure level…
