Data Engineer

Job in Gurgaon, Haryana, India
Listing for: Talentmatics
Full-Time position
Listed on 2026-02-08
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing
Job Description & How to Apply Below
Job Title:

GCP Data Engineer / Specialist – GCP Data Architecture

Location:

Gurgaon, India (Onsite / Hybrid as per business needs)

Experience:

5–8 years

Job Summary:

We are looking for an experienced GCP Data Engineer / Specialist – GCP Data Architecture with strong hands-on expertise in Google Cloud Platform (GCP), particularly BigQuery, along with PySpark and scalable data architectures.

The ideal candidate will be responsible for designing, developing, and optimizing cloud-based data solutions, handling large-scale datasets, and building robust data pipelines to support analytics and business intelligence use cases.
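To give a flavour of the pipeline work described above, below is a minimal PySpark sketch of a BigQuery-to-BigQuery batch transformation, assuming the spark-bigquery connector is available on the cluster; all project, dataset, table, and bucket names are placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of a batch transformation, assuming the spark-bigquery
# connector is on the classpath. Identifiers below are placeholders.
spark = SparkSession.builder.appName("bq-daily-aggregation").getOrCreate()

# Read a source table from BigQuery (placeholder table name).
events = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw_dataset.events")
    .load()
)

# Example transformation: daily event counts per user.
daily_counts = (
    events.withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "user_id")
    .agg(F.count("*").alias("event_count"))
)

# Write results back to BigQuery via a temporary GCS staging bucket.
(
    daily_counts.write.format("bigquery")
    .option("table", "my-project.analytics_dataset.daily_event_counts")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```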

Key Responsibilities:

- Design, develop, and optimize scalable data architectures using GCP, with a strong focus on BigQuery
- Build, maintain, and optimize high-performance data pipelines and ETL/ELT workflows
- Develop and optimize advanced SQL queries and BigQuery transformations
- Implement and manage data solutions across the GCP ecosystem
- Develop data processing solutions using PySpark and Scala
- Ensure data quality, governance, security, and compliance with organizational standards
- Monitor, troubleshoot, and improve performance, scalability, and reliability of existing data platforms
- Collaborate with cross-functional teams to translate business requirements into technical solutions
- Mentor junior engineers and provide technical leadership
- Stay updated with emerging trends and best practices in GCP data and analytics services

Required Skills & Qualifications:

Mandatory Skills:

- Strong hands-on experience with Google Cloud Platform (GCP)
- Expertise in GCP BigQuery
- Advanced SQL skills for large-scale data processing
- Strong programming experience in PySpark and Scala
- Experience building and optimizing data pipelines and ETL processes
- Experience working with large-scale data environments

Good to Have:

- Working knowledge of other GCP services (e.g., Cloud Storage, Dataflow, Dataproc, Pub/Sub)
- Experience in designing cloud-native data architectures
- Strong problem-solving and analytical skills
- Good communication and stakeholder management skills

Interested candidates can share their resume at:

shivani.