
Data Engineer

Job in 400001, Mumbai, Maharashtra, India
Listing for: Simple Logic IT Private Limited
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing, Data Science Manager
Job Description
Hiring for GCP Data Engineer role with Simple Logic IT Pvt Ltd at Mumbai/Bangalore

Position: GCP Data Engineer
Department: IT Data Engineering Team
Location: Mumbai/Bangalore
Work Mode: Work From Office

Skill Set: GCP, PySpark, Python, SQL, BigQuery

Note: Immediate joiners preferred for the Mumbai & Bangalore locations.

What are we looking for?
We are looking for a passionate and experienced Data Engineer to join our team and help build
scalable, reliable, and efficient data pipelines, primarily on Google Cloud Platform (GCP) and
secondarily on Amazon Web Services (AWS). You will work with cutting-edge technologies to
process structured and unstructured data, enabling data-driven decision-making across the
organization.

What does the job entail?

● Design, develop, and maintain robust data pipelines and ETL/ELT workflows
using PySpark, Python, and SQL.

● Build and manage data ingestion and transformation processes from various sources
including Hive, Kafka, and cloud-native services.

● Orchestrate workflows using Apache Airflow and ensure timely and reliable data delivery.

● Work with large-scale big data systems to process structured and unstructured datasets.

● Implement data quality checks, monitoring, and alerting mechanisms.

● Collaborate with cross-functional teams including data scientists, analysts, and product
managers to understand data requirements.

● Optimize data processing for performance, scalability, and cost-efficiency.

● Ensure compliance with data governance, security, and privacy standards.

● Demonstrate a strong understanding of the relevant functional areas.
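To illustrate the data-quality-check responsibility above, here is a minimal sketch in plain Python of the kind of row-level validation step a pipeline might run before loading data. The column names and rules (`user_id`, `amount`) are hypothetical examples, not taken from this posting:

```python
def check_row(row):
    """Return a list of data-quality violations found in one record.

    Rules here are illustrative: a record must carry a non-empty
    user_id and a non-negative amount.
    """
    issues = []
    if not row.get("user_id"):
        issues.append("missing user_id")
    amount = row.get("amount")
    if amount is None or amount < 0:
        issues.append("invalid amount")
    return issues


def run_quality_checks(rows):
    """Split records into clean rows and (row, issues) rejects.

    Clean rows would continue to the load step; rejects would feed
    a monitoring/alerting mechanism, as the responsibilities describe.
    """
    clean, rejects = [], []
    for row in rows:
        issues = check_row(row)
        if issues:
            rejects.append((row, issues))
        else:
            clean.append(row)
    return clean, rejects
```

In a real PySpark pipeline the same idea would typically be expressed as DataFrame filters or a dedicated validation library, with reject counts pushed to the alerting system.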

Educational Qualification: BTech/BE
Work Experience: 4-5 years

Preferred Skills:

Technical Skills:

● 4-5 years of experience in data engineering or related roles.

● Strong programming skills in Python and PySpark.

● Proficiency in SQL and experience with Hive.

● Hands-on experience with Apache Airflow for workflow orchestration.

● Experience with Kafka for real-time data streaming.

● Solid understanding of big data ecosystems and distributed computing.

● Experience with GCP services (BigQuery, Dataflow, Dataproc).

● Ability to work with both structured (e.g., relational databases) and unstructured (e.g., logs,
images, documents) data.

● Strong experience in data warehousing and a sound understanding of data modeling.

● Familiarity with CI/CD tools and version control systems (e.g., Git).

● Knowledge of containerization (Docker) and orchestration (Kubernetes).

● Exposure to data cataloging and governance tools (e.g., AWS Lake Formation, Google Data
Catalog).

Soft Skills:

● Understanding of data modeling and architecture principles.

● Strong analytical and problem-solving abilities.

● Excellent communication and collaboration skills.

● Ability to work in Agile/Scrum environments.

● Ownership mindset and attention to detail.