We are seeking a Data Engineer with strong expertise in Snowflake (AWS), Python, and ETL/ELT pipeline development to build scalable, high-performance data solutions supporting enterprise analytics. This role requires experience with Kimball dimensional modeling, query optimization, and cost-efficient data warehouse design, along with the ability to leverage AI tools daily to improve productivity, quality, and delivery speed.
What You'll Do
• Design, develop, and maintain ETL/ELT pipelines using Python and enterprise ETL tools
• Build and optimize Snowflake data warehouse objects and data models
• Write and tune complex SQL for large-scale datasets
• Implement performance and cost optimization strategies in Snowflake
• Integrate data from multiple sources including AWS, APIs, and databases
• Use AI copilots and automation tools to accelerate development, debugging, testing, and documentation
• Support CI/CD, code reviews, and production pipeline monitoring
What We're Looking For
• Strong hands-on experience with Snowflake, SQL, and Python
• Experience with ETL tools (Matillion, Informatica, Talend, AWS Glue, or similar)
• Solid understanding of Kimball dimensional modeling and SCDs
• Experience using AI coding assistants or automation tools in engineering workflows
• Strong focus on performance, scalability, and data quality
Position Requirements
• 10+ years of work experience