Job Description & How to Apply Below
10+ years of experience in the design, architecture, implementation, and optimization of data engineering solutions over large volumes of data (TB to PB scale).
Expertise in designing and implementing end-to-end data architectures on Google Cloud Dataproc, including data ingestion pipelines, transformation logic, and data warehousing strategies for large-scale batch and real-time data processing.
Proven expertise in GCP services including Dataproc, Dataflow, Cloud Storage, BigQuery, Cloud Composer, and Cloud Functions; experience building scalable data lakes and pipelines.
Strong hands-on experience processing large volumes of data; proficiency in PySpark, Python, and Spark SQL, and in automating workflows.
Good exposure to implementing robust data governance and security measures using Dataplex.
Proficiency in requirements analysis, solution design, development, testing, deployment, and ongoing support, including cloud migration projects for large-scale data platforms.
Location:
Noida / Gurgaon / Indore / Bangalore / Hyderabad / Pune
Notice:
Immediate to 30 days