Data Engineer
Listed on 2026-02-28
IT/Tech
Data Engineer, Cloud Computing
Role: Data Engineer
Location: Newcastle upon Tyne
Salary: TBC, depending on experience
Levels: Senior Analyst, Specialist
Hybrid Working: 3 days per week in our Newcastle (Cobalt Business Park) office
Please Note: Any offer of employment is subject to satisfactory BPSS and SC security clearance, which requires 5 years' continuous UK address history (typically with no periods of 30 consecutive days or more spent outside the UK) and a declaration that you are a British or EU passport holder, or hold Indefinite Leave to Remain in the UK, at the point of application.
Note: The above information relates to a specific client requirement.
Our Advanced Technology Centre is a hub of innovation where we deliver high‑quality data and technology services to clients across both the public and private sectors. You’ll join a collaborative culture that values diverse thinking, continuous learning, and opportunities for career growth within a global network of experts.
Role Overview
As a Data Engineer, you will design, build, and maintain scalable data solutions that enable analytics, AI, and operational insights. You'll work alongside client and internal teams to create robust data pipelines, ensure data reliability, and support cloud‑based architectures that power intelligent decision‑making.
Key Responsibilities
Data Pipeline Development
- Build, optimize, and maintain scalable data pipelines using Java (primary), plus exposure to Python, Flink, Kafka, or Spark.
- Develop and support real‑time streaming pipelines and event‑driven integrations.
- Integrate data from multiple sources (streaming, batch, APIs) using AWS managed services (e.g., Kinesis, MSK, Lambda, Glue).
- Contribute to data modelling, data architecture best practices, and modern patterns (e.g., medallion architecture).
- Ensure data quality, lineage, governance, and security controls are applied consistently.
- Deploy and maintain data applications using CI/CD tooling (Azure DevOps, GitHub Actions, Jenkins).
- Use Infrastructure as Code (e.g., Terraform, CloudFormation) to manage cloud environments.
- Work with container technologies such as Docker and Kubernetes‑based workloads.
- Work closely with analytics, ML/AI, and product teams to deliver clean, well‑structured datasets.
- Participate in code reviews and internal knowledge‑sharing sessions.
- Provide guidance to junior engineers where needed.
Skills & Experience
- Strong programming proficiency in Java (preferred) or Python.
- Hands‑on experience with at least one of Kafka, Flink, or Spark (Flink/Kafka preferred for streaming).
- Solid understanding of stream processing concepts (e.g., event time, state, backpressure).
- Understanding of software engineering best practices: testing, design patterns, CI/CD, Git.
- Experience building ETL/ELT or streaming data pipelines.
- Exposure to microservices and distributed system concepts.
- Experience working with cloud platforms, ideally AWS, but Azure/GCP also acceptable.
- Understanding of distributed compute, large‑scale data systems, and performance considerations.
- Experience with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins, etc.).
- Infrastructure‑as‑Code (Terraform preferred).
- Experience with containerisation (Docker) and orchestration platforms (Kubernetes/EKS).
- Exposure to enterprise data platforms (Databricks, Snowflake, BigQuery, or similar).
- Cloud certifications (AWS, Azure, GCP) are beneficial but not required.
- Minimum 3 years’ experience working on data engineering or large‑scale data solutions.
- Comfortable working in Agile delivery teams.
- Strong communication skills and ability to collaborate with technical and non‑technical stakeholders.
- Experience in client‑facing or consulting environments.
- Professional cloud or data engineering certifications.
- Experience mentoring or supporting junior engineers.
- Background in designing or operating real‑time, low‑latency systems.