Solutions Architect - Big Data and DevOps
Listed on 2026-01-12
IT/Tech
Data Engineer, Cloud Computing, Systems Engineer
Location: Germany
Stackable - who are we?
Stackable is an innovative technology company focused on delivering cutting-edge, open-source Big Data solutions for data lakehouses, data mesh, event processing, and AI. We specialise in designing, deploying, and managing scalable data infrastructures that empower businesses to harness the power of data effectively. Our core product, the Stackable Data Platform, enables companies of all sizes to easily integrate and manage their sovereign data platforms within their respective IT infrastructure.
Your mission at Stackable:
We are looking for a skilled Solutions Architect with a strong background in Big Data to join our team. The ideal candidate will have expertise in open-source Big Data technologies, Kubernetes, and DevOps practices. This role will be instrumental in designing, implementing, and optimizing cloud-based and on-premise data solutions for our clients.
- You will architect and implement Stackable Data Platform solutions, ensuring scalability, reliability, and security.
- You will develop and maintain Kubernetes-based infrastructures to support data processing and analytics workloads (see the sketch after this list).
- You will design and deploy CI/CD pipelines, infrastructure automation, and monitoring solutions.
- You will work closely with clients and internal teams to translate business requirements into technical solutions.
- You will ensure best practices in DevOps, infrastructure as code, and security.
- You will collaborate with cross-functional teams, including data engineers, software developers, and enterprise architects.
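To give a flavour of the Kubernetes-based work: the Stackable Data Platform manages data services through Kubernetes operators and custom resources. The following is a minimal sketch using the official Kubernetes Python client; the resource group, version, kind, and spec layout are illustrative assumptions, not the platform's exact API, so check the Stackable documentation for the real CRD definitions.

```python
# Minimal sketch: applying a (hypothetical) Stackable custom resource with
# the official Kubernetes Python client. Group/version/plural and the spec
# layout below are illustrative assumptions, not the actual Stackable CRDs.
from kubernetes import client, config


def deploy_kafka_cluster(namespace: str = "data-platform") -> None:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    api = client.CustomObjectsApi()

    # Hypothetical manifest for a Kafka cluster managed by a Stackable operator.
    manifest = {
        "apiVersion": "kafka.stackable.tech/v1alpha1",  # assumption
        "kind": "KafkaCluster",                         # assumption
        "metadata": {"name": "demo-kafka", "namespace": namespace},
        "spec": {
            "image": {"productVersion": "3.7.1"},       # assumption
            "brokers": {"roleGroups": {"default": {"replicas": 3}}},
        },
    }

    # create_namespaced_custom_object is the standard client call for CRDs.
    api.create_namespaced_custom_object(
        group="kafka.stackable.tech",  # assumption
        version="v1alpha1",
        namespace=namespace,
        plural="kafkaclusters",
        body=manifest,
    )


if __name__ == "__main__":
    deploy_kafka_cluster()
```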
What we offer:
- Collaboration with an international team based in the UK and Germany
- Working either remotely or onsite with our customers from the UK
- A technology-focused start-up culture
- Choice of development tools
- The chance to shape a sustainable and future-oriented open-source product
- Our engineering mindset: we aim to make architectures and software available in the most automated, maintainable, and robust way possible!
- Diverse training opportunities and social benefits (e.g. a UK pension scheme)
What you bring:
- Strong hands-on experience with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, and OpenSearch (a small illustration follows below)
- Proficiency in cloud-native technologies such as containerization and Kubernetes
- Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.)
- Proficiency in software development using Rust (ideally), Java, or Python
- Experience with solution architecture and designing enterprise-grade data platforms
- Understanding of networking, security, and access controls in on-prem environments
- Fluent English skills
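As a small illustration of the kind of workloads such a platform runs, here is a minimal PySpark job of the sort a Solutions Architect might prototype for a client. The input path, schema, and column names are hypothetical placeholders, not taken from any real project.

```python
# Minimal PySpark sketch: a toy aggregation over event data.
# Input path and column names ("timestamp", "event_type") are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-counts").getOrCreate()

# Read JSON event records, e.g. as they might arrive from Kafka via NiFi.
events = spark.read.json("s3a://example-bucket/events/")  # hypothetical path

# Count events per type per day.
daily_counts = (
    events
    .withColumn("day", F.to_date("timestamp"))
    .groupBy("day", "event_type")
    .count()
)

daily_counts.write.mode("overwrite").parquet("s3a://example-bucket/event-counts/")
spark.stop()
```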