Job Description
Key Responsibilities:
Design, develop, and maintain data pipelines and ETL/ELT workflows using GCP-native tools and services.
Build and optimize data warehouses using Snowflake.
Write complex and efficient SQL queries for data transformation, analysis, and reporting.
Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions.
Implement data governance, security, and monitoring best practices across GCP projects.
Tune queries and optimize performance of large-scale datasets.
Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.
Required Skills & Qualifications:
3+ years of experience in a data engineering or data platform role.
Strong hands-on experience with Snowflake data warehousing.
Expert-level SQL skills: able to write optimized, scalable, and complex queries.
Experience with data modeling (star/snowflake schema), partitioning, clustering, and performance tuning in a data warehouse.
Familiarity with modern ELT tools such as dbt, Fivetran, or Cloud Data Fusion.
Experience with Python or a similar scripting language for data engineering tasks.
Understanding of data governance and privacy, and of Google Cloud Platform services, especially BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer.