
Data Engineer

Job in Geneva, Switzerland
Listing for: Proton
Full Time position
Listed on 2026-01-15
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 30,000 – 80,000 CHF yearly
Job Description & How to Apply Below
Location: Geneva

Join Proton and build a better internet where privacy is the default

At Proton, we believe that privacy is a fundamental human right and the cornerstone of democracy. Since our founding in 2014 by a team of scientists from CERN, we have dedicated ourselves to providing free and open-source technology to millions worldwide, ensuring access to privacy, security, and freedom online.

Our journey began with Proton Mail, the largest secure email service globally, and has since expanded to include Proton VPN, Proton Calendar, Proton Drive, and Proton Pass. These tools empower individuals and organizations to take control of their personal data, break away from Big Tech's invasive practices, and defeat censorship. Our work impacts hundreds of millions of lives, from activists on the front lines defending freedom to leaders in governments protecting sensitive information. In some cases, Proton's services have even been instrumental in saving lives by enabling secure and private communication in high-risk situations.

Proton is a profitable company that does not rely on VC funding, supporting over 100 million user accounts with a growing team of over 500 people from more than 50 countries, drawn from the world's top companies and universities. We value intelligence, learning potential, and ambition in our hiring process. Adaptability is key as we navigate uncharted territories and redefine how business is conducted online.

Hiring at Proton is highly selective, with less than 1% of candidates hired. We believe smaller teams of exceptional talent will always prevail over larger teams with lower talent density. You will have the opportunity to work with many of the world's top minds in their fields, ranging from former international math and science olympiad winners to chess champions.

We have a global mindset and big ambitions but remain a start-up: we value empowerment and flexibility, and we keep our structure flat to keep moving fast and avoid unnecessary politics. Tired of blending into the crowd? Join us and do work you can truly be proud of. Check out our open-source projects here!

Purpose of the role

The Proton Data Platform team is responsible for everything that enables the company to make data-driven decisions, with our on-premise custom data platform at the center. We are looking for a Data Engineer to join the team and help build and maintain reliable batch and streaming data pipelines and tools, learn best practices for data modeling and quality, and support analytics/ML use cases.

You'll work with mentors on the team and grow into owning pipelines end-to-end.

What you will do

Implement and maintain ingestion & transformation jobs (Kafka, Spark) with guidance.

Write clean, testable code in Scala and Python for data pipelines and utilities.

Query and optimize datasets in ClickHouse SQL (partitioning, basic tuning).

Add data quality checks and monitoring; help triage pipeline issues.

Contribute to CI/CD and containerized jobs (Docker, GitLab CI; exposure to Kubernetes).

Collaborate with analysts / ML engineers to turn requirements into well-scoped tasks.

Document workflows and share learnings in code reviews and short design notes.

Job requirements

1–3 years in data engineering or backend engineering with data-heavy systems (internships or projects count).

Solid software engineering foundations: version control, testing, code reviews, readable code.

Working knowledge of one of:
Kafka or another streaming system;
Spark or another distributed compute engine.

Proficiency in Python; willingness to learn Scala (or vice versa).

Comfortable with SQL; curiosity to learn ClickHouse specifics.

Clear communication, an ownership mindset, and eagerness to learn.

Bonus points for:

Exposure to ClickHouse (engines, table layouts, partitioning).

Python data stack: pandas, matplotlib, Dask.

Data quality/governance tools (e.g. Great Expectations, dbt tests) and basic lineage/metadata.

Infra & DevOps basics:
Docker, Kubernetes, GitLab CI; observability (Prometheus/Grafana).

Orchestration tools (Airflow, Dagster, Argo) and feature/ML pipelines.

Experience with payment providers' APIs and Chargebee.

What We Offer
  • Office First:
    Collaboration is easier and more effective in…