Data Engineer
Listed on 2026-03-01
IT/Tech
Data Engineer, Cloud Computing
About NEOZO
At our workplace, you'll find a supportive and friendly atmosphere where IT professionals enjoy sharing their expertise and exchanging ideas. Our top priority is maintaining a positive mindset, coupled with ideal working conditions, to help you grow technologically and reach your full potential. We take pride in providing a secure environment where you can thrive and succeed.
What you can expect from us
- Home‑office
- Free public transport ticket
- Free parking
- Mentoring programme for employees
You design and develop high‑performance, scalable, and robust data platforms that serve as the backbone for data‑driven decision‑making in enterprise environments. In doing so, you integrate complex data sources, implement sustainable data architectures, and ensure consistent, high‑quality data through intelligent transformations and validations.
You work independently while collaborating closely with your colleagues from Data Science, Analytics, and DevOps – and you can always rely on mutual support. You actively contribute to greenfield projects and the modernization of existing data infrastructures, helping to shape our Lakehouse architecture based on modern open‑source and cloud technologies.
Our clients come from regulated industries, logistics, e‑commerce, and international trade – sectors where data quality, governance, and real‑time capabilities are critical. To meet these demands, we rely on technologies such as Apache Spark, dbt, Delta Lake, Apache Airflow, Kafka, Snowflake, Python, SQL, and Terraform. We operate our platforms in cloud environments like AWS, Azure, or GCP, ensuring seamless integration with existing enterprise systems.
If you want to learn about technology and use it effectively in a non‑hierarchical group setting, you're in the right place.
We offer professional IT engineers high‑end hardware with admin rights and state‑of‑the‑art equipment and tools of their choice, giving them the space and freedom to work efficiently and successfully.
- Data‑driven mindset & knowledge of technologies (Spark, BigQuery, Object Storage, Hadoop)
- Experience with at least one Lakehouse platform (Databricks, Amazon Redshift, Snowflake)
- Experience with at least one programming language (Python, Scala, Java)
Then apply via our application form or send us your documents to (Use the "Apply for this Job" box below). If you have any questions, please don't hesitate to reach out to us via WhatsApp.
We are excited to receive your application!