Data Engineering (GCP Big Data) | 3-12 Years | Bangalore / Gurgaon
Roles & Responsibilities:
Design, develop, and manage scalable data pipelines and ETL/ELT workflows using BigQuery, Dataflow, Dataproc, and Cloud Composer (Airflow).
Work extensively on Big Data Analytics solutions leveraging Hadoop, Hive, Spark, and GCP services.
Build and optimize data models and SQL queries for performance and scalability.
Collaborate with cross-functional teams to enable data integration, transformation, and reporting solutions.
Implement and manage data pipelines using Airflow/Cloud Composer, Kafka, Git, Jenkins, and CI/CD frameworks.
Troubleshoot, monitor, and improve system performance and ensure high availability of data platforms.
Contribute to Agile/Scrum-based development cycles, providing inputs during sprint planning and reviews.
Drive continuous improvement and innovation in data engineering practices.
Key Skills Required:
Google Cloud Platform (GCP): BigQuery, Dataflow, Dataproc, Cloud Composer (Airflow)
Programming & Processing: PySpark, Python
BI/Reporting Tools: Tableau or MicroStrategy (MSTR)
Big Data Ecosystem: Hadoop, Hive, Spark
Databases & SQL: Advanced SQL, data model optimization
Data Engineering Tools: Airflow, Kafka, Git, Jenkins, CI/CD pipelines
Note:
Immediate joiners or candidates with a notice period of up to 15 days only.