Senior Data Engineer
• Location: Mumbai, Pune, Bengaluru (on-site)
• Experience: 4-5 years
• Notice Period: Immediate to 15 days
• Budget: INR 1.10 lakh per month (open to both full-time and contract)
Role Overview:
We are hiring a Senior Data Engineer to design and build a cloud-native data lakehouse platform on GCP.
This is a hands-on individual contributor role focused on implementation and delivery.
The engineer will work on a greenfield project, building scalable data pipelines that process 4 to 5 million records per day using PySpark, BigQuery, Databricks, DBT, Airflow, Delta Lake, and Apache Iceberg.
Key Requirements:
• Build end-to-end batch data pipelines from Excel files, third-party APIs, and SQL databases.
• Develop scalable transformations using PySpark, BigQuery, and Databricks.
• Implement analytics-ready data models using DBT.
• Orchestrate workflows using Apache Airflow.
• Work with Delta Lake / Apache Iceberg tables for scalable lakehouse storage.
• Optimize data pipelines for performance, reliability, and cost.
• Write clean, testable, and well-documented production code.
Required Experience:
• 5+ years of data engineering experience.
• At least 3 years in big data (Spark / distributed processing).
• Strong hands-on experience with:
o GCP, BigQuery & Databricks
o PySpark
o DBT
o Apache Airflow
• Strong SQL and Python skills
• Experience working on early-stage / greenfield data platforms