
Sr Specialist Scientific Data Engineering

Job in Lausanne (1001), Canton de Vaud, Switzerland
Listing for: Nestlé
Full Time position
Listed on 2026-01-29
Job specializations:
  • IT/Tech
    Data Engineer, Data Scientist
Salary/Wage Range or Industry Benchmark: CHF 80,000 – 100,000 per year
Job Description & How to Apply Below

Strategic Research Operations

Welcome to Nestlé Research, a global organization with most of our passionate team located in Lausanne, Switzerland! Our work is powered by five specialized Nestlé Institutes and key strategic units. Here, we embrace fresh thinking and collaborate to create amazing solutions. How about taking a step inside our organization and discovering the exciting research activities and expertise that drive us forward?

Want to learn more?

At Nestlé Research, we place people at the core of everything we do, driven by a passion for innovation that inspires us to embrace change.

Our commitment to staying at the forefront of science allows us to make a meaningful impact on our world, fostering a culture of creativity and growth. Join us in our mission to inspire people and ignite innovation as we work together to shape a better future.

Position Snapshot:
Your opportunity to #breakthrough with us!

Location:

Nestlé Research, Lausanne, Switzerland

Business Unit:
Strategic Research Operations, Computational Science Department

Company:
Société des Produits Nestlé S.A.
Act. Rate:
Full-Time, 100%
Type of contract:
Permanent

What we offer at Nestlé

Exciting opportunities to develop your career your way.

Flexible working arrangements - facilitating creativity and collaboration.

A culture of respect, with diversity, equity and inclusion at its core.

A dynamic international environment empowering you to learn, develop and grow.

Don’t hesitate to connect with us during the recruitment process to learn more.

Position Summary

Within our Computational Science Department, as a Senior Specialist in Scientific Data Engineering, you will be responsible for delivering advanced data engineering capabilities to support Nestlé Research projects. Your main objective will be to design, build, and optimize robust data pipelines and architectures that enable efficient data access, integration, and analysis across diverse scientific domains. You will play a key role in enabling data-driven research by ensuring high-quality, scalable, and secure data infrastructure.

A Day in the Life of a Senior Specialist Scientific Data Engineering
  • Design and implement end-to-end scientific data pipelines (e.g., for bioinformatics, clinical, or omics data), including data ingestion, transformation, storage, and analytics layers, tailored to scientific research use cases.
  • Develop and deploy scalable data architectures on premise (Linux) or cloud platforms, ensuring performance, reliability, and compliance with data governance standards.
  • Onboard and guide external developers, and team up with internal data specialists (data architects, data scientists, AI engineers, software developers…) to accelerate delivery in priority projects.
  • Work in cross-functional project teams in a research environment to define project objectives and deliverables, evaluate needs, and identify the right technical approaches to solve business needs.
  • Partner with other teams to gather functional requirements, and improve data quality, metadata management, and data discoverability.
What Will Make You Successful
  • Bachelor’s, Master’s, or PhD in Bioinformatics, in Computer Science combined with life sciences, or in a related field
  • Significant professional experience (7+ years) in designing and implementing data pipelines and architectures in a research or scientific context, ideally in the food or pharma industry
  • Familiarity with software engineering practices and development frameworks (Scrum, Agile, DevOps).
  • You have the following expertise and technical skills:
    • Solid foundations in data modeling, ETL/ELT processes, and distributed data systems.
    • Proficiency in Python and SQL for data manipulation and pipeline development.
    • Experience with DevOps tool stacks (e.g., Git, CI/CD).
    • Experience with cloud platforms (e.g., Azure, AWS) and orchestration tools (e.g., Airflow, Azure Data Factory).
    • Experience working with data lake and data warehouse technologies (e.g., Snowflake, Databricks).
    • Experience working with Linux and container technologies such as OpenShift, Docker, or Podman.
    • Experience with Bioinformatics or Clinical or Omics data pipelines
    • Familiarity with data analysis and…