Secure Data Engineer
Job in Newcastle upon Tyne, Tyne and Wear, SY7, England, UK
Listed on 2026-02-28
Listing for: Capgemini
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Systems Engineer
Job Description & How to Apply Below
Choose a partner with intimate knowledge of your industry and first-hand experience of defining its future.
Locations: Newcastle, Birmingham, Bristol, London, Manchester

# Secure Data Engineer
**The Focus of Your Role**

As a Solution Architect with an Azure and Databricks focus, you will be an integral part of our team dedicated to building scalable and secure data platforms. You will leverage your expertise in Databricks, Apache Spark, and Azure to design, develop, and implement data warehouses, data lakehouses, and AI/ML models that fuel our data-driven operations.
We are looking for a code-first Data Engineer to design and build scalable and resilient data applications for Defence customers. This role sits at the intersection of Software Engineering and Data Engineering, working on mission-critical systems where reliability, security and auditability are essential.

You will build data applications that process high-volume, high-velocity data, orchestrate complex workflows, and deploy your own solutions into secure containerised environments. You will work across the full lifecycle from development through to production, shaping architectures that support operational users, analytics, and AI-enabled capabilities.

This is a hands-on engineering role. You will write production-grade code, contribute to secure platform patterns, and ensure that your data services run predictably in tightly governed Defence environments.
- **Core Engineering:** Expert-level Python and strong foundations in software engineering, such as object-oriented design, automated testing and version control.
- **Data Engineering Stack:** Experience building pipelines using streaming frameworks, distributed processing engines and relational or analytical storage technologies. Examples include Kafka, Spark, PostgreSQL.
- **Orchestration:** Experience defining and running data workflows using modern orchestration frameworks. Examples include Airflow, Dagster or Prefect.
- **Data Quality and Lineage:** Familiarity with tools and techniques for data testing, documentation and lineage. Examples include Great Expectations or dbt.
- **AI and MLOps:** Understanding of how to operationalise machine learning models in production, including model packaging, monitoring and controlled deployment.
- **Containerisation and Kubernetes:** Confidence deploying applications in containerised environments, including defining services, pods and deployment configurations. Examples include Docker and Kubernetes.
- **DevOps Mindset:** Hands-on experience with CI/CD approaches and a belief in owning the services you build. Examples include GitLab CI, GitHub Actions or Argo.
- **Nice to have:** Experience with Infrastructure as Code or configuration management tools, such as Terraform or Ansible. Experience working in secure, restricted or air-gapped environments, including Defence networks or MODCloud-aligned platforms. Familiarity with Google Distributed Cloud (GDC) or other edge and on-premises cloud platforms used in constrained or disconnected settings.
**Your work will involve**

**Building Data Applications:** Developing modular and maintainable software components that process, transform and expose data for analytical, operational or AI-driven use cases. You will follow strong engineering practices, with testing and observability built in from the start.
**Streaming and Real-Time Architecture:** Designing and implementing data ingestion and event-driven patterns that support real-time or near-real-time flows, ensuring they remain reliable even under demanding operational conditions.
**Workflow Orchestration:** Defining data workflows programmatically and managing complex dependencies, scheduling and error-recovery behaviours within secure and assured environments.
**Deployment and Ownership:** Containerising your own services and deploying them into secure Kubernetes or cloud environments, using CI/CD principles adapted for Defence delivery. You will own your applications in production and contribute to secure patterns…