Key Responsibilities:
Design, develop, and optimize data pipelines on Google Cloud Platform
Build and manage ETL/ELT workflows for large-scale data processing
Work with structured and unstructured data from multiple sources
Ensure data quality, reliability, and performance across pipelines
Collaborate with analytics, BI, and application teams
Implement best practices for data security, governance, and cost optimization
Mandatory Skills:
5+ years of experience in Data Engineering
Strong hands-on experience with Google Cloud Platform (GCP)
Expertise in BigQuery
Experience with Cloud Dataflow / Dataproc
Proficiency in SQL and Python
Experience with Cloud Storage and Pub/Sub
Strong understanding of data modeling and data warehousing concepts
Good to Have:
Experience with Apache Airflow / Cloud Composer
Knowledge of dbt
Exposure to CI/CD for data pipelines
Experience with real-time streaming data