
Data Engineer

Job in Bengaluru 560001, Karnataka, India
Listing for: UST
Full Time position
Listed on 2026-03-03
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below
Location: Bengaluru

About the Role

We are seeking a highly skilled Senior Data Engineer with strong expertise in Python and Google Cloud Platform (GCP) to design, build, and maintain scalable, high-performance data pipelines and integration solutions.

The ideal candidate is a hands-on engineer with deep knowledge of data architecture, ETL/ELT development, and real-time/batch data processing. You will collaborate closely with analytics, development, DevOps, and business teams to ensure secure, reliable, and efficient data delivery across the organization.

Responsibilities

Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using Python and GCP services.
Build batch and real-time data ingestion pipelines using APIs, CDC tools, and orchestration frameworks.
Develop and optimize data models, schemas, and cloud-based data architectures.
Implement data transformation and data quality validation frameworks.
Work with analytics and business teams to deliver high-quality, reliable datasets.
Monitor, troubleshoot, and optimize data workflows for performance and scalability.
Implement CI/CD pipelines for data engineering workflows.
Ensure compliance, governance, and security best practices across cloud data systems.
Perform root cause analysis and system performance tuning.
Stay updated with emerging data engineering and cloud technologies.

Qualifications

8+ years of experience in Data Engineering or related roles.
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.

Required Skills

Strong Python programming skills with hands-on experience in data pipeline development.
Proven experience with Google Cloud Platform (GCP) services, including:

BigQuery
Dataflow
Pub/Sub
Dataproc
Cloud Composer
Cloud Functions
Cloud Scheduler
Datastream (CDC)
Google Cloud Storage (GCS)

Experience with Apache Beam or Apache Spark for distributed data processing.
Strong SQL skills and solid understanding of relational and cloud-native databases.
Experience building REST API-based ingestion pipelines and handling JSON-based integrations.
Strong understanding of data warehousing concepts, data modeling, and ETL/ELT principles.

Experience with CI/CD and infrastructure tooling such as GitHub, Terraform, and Cloud Build.
Knowledge of data security, access control, and governance in cloud environments.
Experience working in large-scale, cloud-based, or enterprise environments.

Preferred Skills

Professional certifications in Google Cloud Platform (GCP) or Big Data Engineering.

Experience with Change Data Capture (CDC) architectures.

Experience with performance tuning and system optimization in distributed environments.
Knowledge of Shell or Perl scripting.
Exposure to DevOps collaboration in globally distributed teams.
Experience designing real-time streaming architectures.
Strong documentation and communication skills.