
Data Engineer

Job in London, EC1A, England, UK
Listing for: Octopus Energy Group
Full Time position
Listed on 2026-01-13
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description
Location: Greater London

About Octopus Electroverse

We’re making electric vehicle charging as smart and as simple as possible, by building the giant, virtual charging platform of the future.

In just four years, Octopus Electroverse has grown to become one of the largest eMobility players globally, with over 900,000 connected electric vehicle chargers and a customer ecosystem spanning web, iOS, Android, CarPlay, Android Auto & Automotive OS, and more.

But it’s just the start: we’re busy expanding internationally, working with more automotive and tech partners, building exciting new features at scale, and creating the integrated charging experiences of the future - all in the name of making public EV charging super simple for customers.

Electroverse is a cross‑functional team made up of product, development, commercial, operations, marketing, partnerships and more - all working together to make Octopus the go‑to name in EV charging.

We are seeking a passionate data professional to drive the electrification of transport by transforming complex information into strategic assets. You will work with a rich variety of datasets, including charge point telemetry, user behaviour, marketing campaigns, sales figures, and operational metrics. Your mission will be to uncover actionable insights that enhance efficiency and guide decision‑making across the entire business.

This role offers end‑to‑end ownership of the data pipeline. As a key member of a small, high‑growth team, you will do more than just manage data; you will be instrumental in building our data function. This includes establishing best practices, creating automated pipelines, and shaping the foundational framework for how we leverage data to succeed.

What you’ll do:
  • You’ll develop and maintain data pipelines and automated processes in Airflow and Python
  • You’ll create SQL data models with dbt to power dashboards and applications
  • You’ll integrate third‑party APIs and databases into our data flows
  • You’ll perform in‑depth analysis and data transformations with SQL, Python, and Jupyter Notebooks
  • You’ll prototype internal data applications and tools (Streamlit, Jupyter)
  • You’ll ensure data quality and reliability throughout the lifecycle
  • You’ll collaborate with product, technology, and strategy teams to deliver high‑impact insights and tools
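To give a flavour of the responsibilities above, here is a minimal, illustrative sketch of that workflow: a toy pipeline that ingests charge-session records and builds a simple aggregated SQL model, standing in for the Airflow + dbt + SQL stack described. It uses SQLite purely so the example is self-contained; all table names, column names, and sample values are invented for illustration and are not Octopus's actual schema.

```python
import sqlite3

# Toy stand-in for an extract -> load -> transform pipeline.
# All names (charge_sessions, daily_energy) and values are illustrative.

RAW_SESSIONS = [
    # (session_id, charger_id, day, kwh)
    (1, "CP-001", "2026-01-10", 12.5),
    (2, "CP-001", "2026-01-10", 7.0),
    (3, "CP-002", "2026-01-11", 30.2),
]

def load_raw(conn):
    """Load raw charge-session records (the 'ingestion' step)."""
    conn.execute(
        "CREATE TABLE charge_sessions "
        "(session_id INTEGER, charger_id TEXT, day TEXT, kwh REAL)"
    )
    conn.executemany("INSERT INTO charge_sessions VALUES (?, ?, ?, ?)", RAW_SESSIONS)

def build_daily_energy(conn):
    """Build an aggregated model from a SELECT, much as a dbt model would."""
    conn.execute(
        """
        CREATE TABLE daily_energy AS
        SELECT charger_id, day, SUM(kwh) AS total_kwh, COUNT(*) AS sessions
        FROM charge_sessions
        GROUP BY charger_id, day
        """
    )

def run_pipeline():
    """Run both steps in order, as an orchestrator like Airflow would schedule them."""
    conn = sqlite3.connect(":memory:")
    load_raw(conn)
    build_daily_energy(conn)
    return conn.execute(
        "SELECT charger_id, day, total_kwh, sessions "
        "FROM daily_energy ORDER BY charger_id"
    ).fetchall()

if __name__ == "__main__":
    for row in run_pipeline():
        print(row)
```

In the real stack, each function would be an Airflow task and the aggregation would live in a dbt model file, but the shape of the work - raw ingestion followed by SQL transformation into tables that power dashboards - is the same.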
What you’ll need:
  • Data Modelling Experience:
    Proficiency in data modelling, ideally using dbt.
  • Dashboard and Data Product Development:
    Experience in creating data dashboards and developing data products.
  • Collaborative Projects:
    Experience working on collaborative projects with business teams and familiarity with agile or similar methodologies.
  • Autonomous Problem‑Solving:
    Ability to work independently, scope problems, and deliver pragmatic solutions.
  • Data Lifecycle Expertise:
    Versatility in handling the entire data lifecycle, from ingestion to visualisation and presentation.
  • Tool Building:
    Passion for creating robust and usable tools to enhance team efficiency.
  • Data Analysis Skills:
    Proficiency in SQL, Python, pandas/NumPy, and Databricks/Jupyter Notebooks, plus experience with large datasets in production environments.
  • Curiosity and Drive:
    Self‑motivated, with a strong desire to learn and improve with minimal guidance.
  • Growth mindset:
    A fast learner, enthusiastic about picking up new technologies.
  • Time Management:
    Ability to manage multiple projects simultaneously in a fast‑paced environment.
  • Passion for Net Zero:
    You don’t need to be deeply familiar with the EV market and its products (we can teach that), but a passion for the transition to net zero is an excellent start.
What we work with:
  • dbt for data modelling
  • Databricks Delta Lake for data lake and warehouse storage and querying
  • Python as our main programming language
  • Jupyter and Jupyter Hub for notebook analytics and collaboration
  • CircleCI for continuous deployment
  • AWS cloud infrastructure
  • Kubernetes for data services and task orchestration
  • Google Analytics, Amplitude and Firebase for client applications event processing
  • Airflow for job scheduling and tracking
  • Parquet and Delta file formats on S3 for data lake storage
  • Streamlit for data applications
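Ensuring data quality and reliability (one of the responsibilities above) typically starts with simple record-level checks before data enters the pipeline. Below is a hedged, standard-library-only sketch of what such a gate might look like for charger telemetry; the field names, statuses, and thresholds are invented for illustration, not a real Electroverse spec.

```python
from dataclasses import dataclass

# Illustrative record-level quality checks for hypothetical charger telemetry.
# Field names and thresholds are invented for this sketch, not a real schema.

@dataclass
class TelemetryReading:
    charger_id: str
    power_kw: float
    status: str

VALID_STATUSES = {"available", "charging", "faulted"}

def validate(reading: TelemetryReading) -> list:
    """Return a list of data-quality issues; an empty list means the record passes."""
    issues = []
    if not reading.charger_id:
        issues.append("missing charger_id")
    if not (0 <= reading.power_kw <= 400):  # 400 kW as an assumed sanity ceiling
        issues.append(f"implausible power_kw: {reading.power_kw}")
    if reading.status not in VALID_STATUSES:
        issues.append(f"unknown status: {reading.status}")
    return issues

def partition(readings):
    """Split a batch into clean rows and rejects, as a pipeline quality gate might."""
    clean, rejects = [], []
    for r in readings:
        (rejects if validate(r) else clean).append(r)
    return clean, rejects
```

In practice checks like these would run as a pipeline step (an Airflow task or dbt test), with rejects routed to a quarantine table for review rather than silently dropped.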
Why else you’ll love it here:
  • Wondering what the salary for…