
Data Engineer

Job in 400601, Thane, Maharashtra, India
Listing for: DMart - Avenue Supermarts Ltd
Full Time position
Listed on 2026-02-26
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Cloud Computing, Data Analyst
Job Description
We are looking for a skilled Snowflake/data warehouse Data Engineer to design, build, and optimize our data warehouse architecture. In this role, you will be responsible for developing scalable data pipelines, managing our Snowflake data warehouse, and ensuring that our data is accurate, secure, and highly available for analysts and business stakeholders.
Pipeline Development:  Design, construct, and maintain scalable data pipelines (ETL/ELT) to ingest data from various source systems (APIs, relational databases, flat files, event streams) into Snowflake.
Snowflake Architecture & Optimization:  Build and manage Snowflake environments. Optimize virtual warehouses, scale compute resources efficiently to manage costs, and handle performance tuning for complex SQL queries.
Data Modelling:  Design robust logical and physical data models (e.g., Star Schema, Data Vault) optimized for analytical workloads and reporting.
Advanced Snowflake Features:  Implement and manage Snowflake-native features such as Snowpipe for continuous data ingestion, Streams and Tasks for automated workflows, Time Travel, Zero-Copy Cloning, and Secure Data Sharing.
Orchestration & Automation:  Schedule and orchestrate data workflows using tools like Apache Airflow, Prefect, or Dagster.
Data Quality & Governance:  Implement data validation checks, monitor pipeline health, and ensure compliance with data security and privacy standards (e.g., GDPR, CCPA) using Snowflake's role-based access control (RBAC) and dynamic data masking.

Required Skills & Qualifications:

Experience:
4+ years of experience in Data Engineering, with at least 1-2 years of hands-on, dedicated experience managing a Snowflake environment.

Languages:

Advanced proficiency in SQL (complex joins, analytical functions, query optimization) and strong programming skills in Python (or Scala/Java).
Cloud Platforms:  Hands-on experience with at least one major cloud provider (AWS, GCP, or Azure) and their native data services (e.g., AWS S3, Google Cloud Storage, Azure Blob).
Modern Data Stack:  Practical experience with modern data transformation and orchestration tools (Airflow, Fivetran, Snowflake tasks/streams).
ERP:  Knowledge of the SAP ERP system would be an added advantage.
Problem Solving:  Strong analytical skills with the ability to troubleshoot complex data pipeline failures and data discrepancies.
Streaming Data:  Experience working with real-time data streaming technologies like Apache Kafka, Amazon Kinesis, or Snowpipe Streaming.

Work experience in the Retail industry would be an added advantage.