
Data Engineer – Snowflake & dbt

Job in Coos Bay, Coos County, Oregon, 97458, USA
Listing for: Nagarro
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Company Description

We're Nagarro. We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums — and our people are everywhere in the world (18,000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical.

We're looking for great new colleagues. That's where you come in! By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates move in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further?

Yes? You may be ready to join us.

Job Description

We are seeking a highly skilled Data Engineer with strong expertise in Snowflake, ETL/ELT concepts, and dbt to design, build, and optimize scalable data pipelines. The ideal candidate will have advanced SQL skills, experience with cloud-based data platforms, and a strong understanding of data warehousing best practices.

Key Responsibilities
  • Design, develop, and maintain scalable data pipelines using Snowflake and dbt
  • Write and optimize advanced SQL queries for performance and reliability
  • Implement ETL/ELT processes to ingest and transform data from multiple sources
  • Develop Python scripts for automation, data processing, and API integrations
  • Build and manage data workflows using AWS services such as Glue, Lambda, S3, and CloudFormation
  • Design and maintain data warehouse models, schemas, and transformations
  • Collaborate with cross-functional teams to understand data requirements and deliver analytical solutions
  • Implement and maintain version control, CI/CD pipelines, and best development practices
  • Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency
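To give a flavor of the ELT-style transformation work described above, here is a minimal Python sketch of a staging-layer cleanup step. All field names and the record shape are hypothetical illustrations, not part of this role's actual stack:

```python
# Minimal ELT transform sketch: normalize raw order records before
# loading them into a warehouse staging table.
# Field names ("order_id", "customer", "amount") are hypothetical.

def transform_orders(raw_rows):
    """Clean and reshape raw order dicts for a staging table."""
    cleaned = []
    for row in raw_rows:
        cleaned.append({
            "order_id": int(row["order_id"]),           # cast string IDs to int
            "customer": row["customer"].strip().lower(),  # normalize names
            "amount_usd": round(float(row["amount"]), 2), # standardize currency
        })
    return cleaned

raw = [{"order_id": "42", "customer": " Alice ", "amount": "19.999"}]
print(transform_orders(raw))
```

In a dbt project, logic like this would typically live as SQL in a staging model rather than in Python; the sketch only illustrates the kind of normalization involved.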
Qualifications

Required Skills
  • Strong hands-on experience with Snowflake
  • Advanced SQL proficiency
  • Strong understanding of ETL/ELT concepts and data pipelines
  • Hands-on experience with dbt
  • Solid knowledge of data warehousing concepts, including schema design and data modeling
  • Proficiency in Python for scripting and automation
Good to Have Skills
  • Experience with AWS services (Glue, Lambda, S3, CloudFormation)
  • Familiarity with Git and CI/CD practices
  • Understanding of APIs and CRUD operations
  • Exposure to cloud-native data architectures
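As a small illustration of the API/CRUD familiarity mentioned above, the following Python sketch builds a create-style (POST) request using only the standard library. The endpoint URL and payload are placeholders, not a real service:

```python
# Hypothetical sketch of a CRUD "create" call; URL and payload are placeholders.
import json
from urllib.request import Request

payload = {"name": "example-dataset", "rows": 100}
req = Request(
    "https://api.example.com/datasets",       # placeholder endpoint
    data=json.dumps(payload).encode("utf-8"),  # JSON body for the create
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.method, req.full_url)
```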