Data Engineer – Snowflake & dbt
Listed on 2026-01-12
IT/Tech
Data Engineer, Cloud Computing
Company Description
We're Nagarro. We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical.
We're looking for great new colleagues. That's where you come in! By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates proceed in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further to succeed at an even higher level?
Yes? You may be ready to join us.
Job Description
We are seeking a highly skilled Data Engineer with strong expertise in Snowflake, ETL/ELT concepts, and dbt to design, build, and optimize scalable data pipelines. The ideal candidate will have advanced SQL skills, experience with cloud-based data platforms, and a strong understanding of data warehousing best practices.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Snowflake and dbt
- Write and optimize advanced SQL queries for performance and reliability
- Implement ETL/ELT processes to ingest and transform data from multiple sources
- Develop Python scripts for automation, data processing, and API integrations
- Build and manage data workflows using AWS services such as Glue, Lambda, S3, and CloudFormation (see the sketch after this list)
- Design and maintain data warehouse models, schemas, and transformations
- Collaborate with cross-functional teams to understand data requirements and deliver analytical solutions
- Implement and maintain version control, CI/CD pipelines, and best development practices
- Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency
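To make the day-to-day concrete, here is a minimal, illustrative sketch of one such ELT step in Python: pull records from a source API, stage them in S3, and copy them into Snowflake. The endpoint, bucket, stage name (RAW_STAGE), target table (raw_orders), and environment-variable names are hypothetical placeholders, not details from this posting.

```python
"""Illustrative ELT sketch: API -> S3 -> Snowflake. All names are placeholders."""
import json
import os

import boto3                # AWS SDK, used here for S3 staging
import requests             # pull records from a source API
import snowflake.connector  # load into Snowflake

API_URL = "https://example.com/api/orders"  # hypothetical source endpoint
BUCKET = "example-raw-zone"                 # hypothetical S3 bucket
KEY = "orders/orders.json"

# Extract: fetch raw records from the source API.
records = requests.get(API_URL, timeout=30).json()

# Land: stage the raw payload in S3 (the pipeline's raw zone).
boto3.client("s3").put_object(
    Bucket=BUCKET, Key=KEY, Body=json.dumps(records).encode("utf-8")
)

# Load: copy the staged file into a Snowflake table via an external stage.
# Assumes a stage named RAW_STAGE already points at the bucket above.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW",
    schema="ORDERS",
)
try:
    conn.cursor().execute(
        "COPY INTO raw_orders FROM @RAW_STAGE/orders/ "
        "FILE_FORMAT = (TYPE = 'JSON')"
    )
finally:
    conn.close()
```

In a production pipeline, logic like this would typically live in a Glue job or Lambda function, be parameterized rather than hard-coded, and feed downstream dbt transformations.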
Qualifications
- Strong hands-on experience with Snowflake
- Advanced SQL proficiency
- Strong understanding of ETL/ELT concepts and data pipelines
- Hands-on experience with dbt (see the dbt sketch after this list)
- Solid knowledge of data warehousing concepts, including schema design and data modeling
- Proficiency in Python for scripting and automation
- Experience with AWS services (Glue, Lambda, S3, CloudFormation)
- Familiarity with Git and CI/CD practices
- Understanding of APIs and CRUD operations
- Exposure to cloud-native data architectures
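As a small illustration of the dbt side, the following sketch triggers a dbt run from Python using the programmatic-invocation API available in dbt-core 1.5+; the model selector stg_orders+ is a hypothetical example, not a model from any real project.

```python
"""Illustrative sketch: triggering a dbt run from Python (dbt-core 1.5+)."""
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Equivalent to `dbt run --select stg_orders+` on the command line:
# build the staging model and everything downstream of it.
result: dbtRunnerResult = runner.invoke(["run", "--select", "stg_orders+"])

if not result.success:
    raise RuntimeError(f"dbt run failed: {result.exception}")
```

In practice, invocations like this usually run inside a CI/CD pipeline or an orchestrator rather than ad hoc, which is where the Git and CI/CD experience listed above comes in.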