Senior Data Engineer - GCP
Listed on 2026-02-28
IT/Tech
Data Engineer, Cloud Computing, Data Science Manager, Data Analyst
Senior Data Engineer
Location: London - Weekly / Bi-Weekly Travel to Germany
Sector: Events and Live Entertainment
Work Type: Contract - 6 Months Inside IR35 - £660 per day
The Client
My client is a leading organisation within the events and live entertainment space, delivering large-scale live experiences to millions of customers each year. With a growing digital and data function, they are investing in a modern, cloud-first data platform to power commercial insight, audience analytics, operational reporting and real-time decision-making across venues and events.
The Role
As a Senior Data Engineer, you will design, build and optimise scalable data pipelines on Google Cloud Platform. You will take ownership of core data engineering patterns, evaluate and implement modern open-source frameworks, and help shape the evolution of the organisation's cloud data architecture.
This role requires a proactive, self-starting mindset and the ability to operate independently while collaborating closely with the Data Modelling, Analytics and wider technology teams.
Key Responsibilities
- Design, build and maintain scalable ETL and ELT pipelines
- Develop and optimise data processing workflows on Google Cloud Platform
- Implement robust and scalable data ingestion frameworks
- Work with structured and semi-structured data sources across multiple business domains
- Collaborate closely with Data Modelling and Analytics teams to enable reliable downstream reporting and insight
- Ensure data reliability through monitoring, observability and quality controls
- Automate deployment processes and data workflows wherever possible
- Contribute to tooling decisions, framework selection and data platform standards
Key Requirements
- Strong experience designing and building ETL and ELT pipelines in production environments
- Hands-on experience with GCP data services such as BigQuery, Cloud Storage, Dataflow or equivalent
- Advanced SQL and strong data transformation capability
- Experience with orchestration tools and automated pipeline scheduling
- Experience working within modern data architectures including warehouse and lakehouse patterns
- Demonstrated ownership mindset with strong problem-solving capability
Desirable
- Exposure to Data Vault 2.0 modelling concepts
- Experience optimising performance and cost within BigQuery
- Experience evaluating and implementing open-source data engineering frameworks
- Experience implementing CI/CD for data pipelines