Data Engineer
Remote / Online - Candidates ideally in
Zürich, 8058, Zurich, Kanton Zürich, Switzerland
Listed on 2026-02-28
Listing for:
SwissPeak Partners
Part Time, Remote/Work from Home position
Job specializations:
- Software Development: Software Engineer, Python, Data Engineer
Job Description
For a major transformation project in the insurance industry, we are looking for an experienced Data Engineer.
Start date: 1st of March
In this role, you will actively contribute to the design and implementation of modern data solutions, take real ownership of critical components, and operate confidently within a complex, agile, and multidisciplinary project environment.
Responsibilities:
- You take ownership of the implementation of production-ready data pipelines using PySpark on Databricks, based on specifications provided by business analysts.
- You collaborate closely with other data engineers and solution architects to meet both functional and non-functional requirements.
- You ensure high standards of quality, performance, security, and maintainability across data solutions.
- You apply clean code principles consistently within large and complex codebases.
- You contribute proactively within agile teams and support continuous improvement of data engineering practices.
- You communicate clearly and effectively to align technical implementation with business needs.
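As a purely illustrative sketch of the kind of pipeline work described above, here is a simplified validate-and-transform step in plain Python. All names (`Policy`, `validate`, `transform`) are hypothetical; in the actual role this logic would be written against PySpark DataFrames on Databricks.

```python
# Illustrative sketch only: a minimal "pipeline step" with a data-quality
# gate and a simple business-rule transformation, in plain Python.
# A real implementation would use PySpark on Databricks.
from dataclasses import dataclass


@dataclass
class Policy:
    policy_id: str
    premium: float


def validate(records):
    """Drop records failing basic quality rules: non-empty id, positive premium."""
    return [r for r in records if r.policy_id and r.premium > 0]


def transform(records):
    """Apply a simple business rule: round premiums to two decimals."""
    return [Policy(r.policy_id, round(r.premium, 2)) for r in records]


raw = [Policy("P-001", 120.456), Policy("", 99.0), Policy("P-002", -5.0)]
clean = transform(validate(raw))
print([(p.policy_id, p.premium) for p in clean])
```

In a PySpark setting, `validate` would become a `filter` on DataFrame columns and `transform` a `withColumn` expression, with the same separation between quality gates and business rules.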
Requirements:
- You hold a bachelor’s or master’s degree in computer science or a related field.
- You have at least 5 years of experience developing complex software systems as a software engineer.
- You bring strong software engineering skills, including the application of design patterns and engineering best practices.
- You have solid experience in data engineering using high-level programming languages such as Python, Java, or C#, with a strong focus on Python.
- You have at least 3 years of hands-on experience with Apache Spark (PySpark), working with IDEs such as VS Code or PyCharm.
- You have experience working with Delta Lake and optimizing Spark workloads.
- You have a strong understanding of relational data models and SQL.
- You are highly proficient in English, both written and spoken.
Team & Work Model:
- You will join a team of 10–12 professionals, primarily based in Switzerland and Italy.
- 3 days per week on site, 2 days per week remote.