
Data Engineer ETL

Remote / Online - Candidates ideally in
201301, Noida, Uttar Pradesh, India
Listing for: Vistec Partners
Remote/Work from Home position
Listed on 2026-02-26
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager
Job Description
Position: Data Engineer
Experience: 5–6 Years
Location: Work From Home (WFH)
Office Requirement: Once a week – Noida
Time Overlap: Mandatory overlap with US EST

Role Summary
We are seeking an experienced Data Engineer with 5–6 years of industry experience, including a minimum of 2 years of hands-on expertise in Databricks and Azure Data Factory (ADF). The role involves designing, building, and optimizing scalable data pipelines and analytics solutions on Azure. Collaboration with US-based stakeholders requires daily overlap with the EST time zone.

Key Responsibilities
Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
Build scalable batch and streaming data processing workflows.
Develop Databricks notebooks, jobs, and Delta Lake tables.
Perform performance tuning and cost optimization of data workloads.
Implement robust data quality checks, validations, and monitoring.
Develop Python-based data transformation and automation scripts.
Write and optimize complex SQL queries for analytics and reporting.
Collaborate with Analytics, BI, and Product teams.
Document technical designs, workflows, and operational procedures.

Required Skills & Experience
Databricks: Minimum 2 years of hands-on experience with Spark, notebooks, workflows, and Delta Lake.
Azure Data Factory (ADF): Minimum 2 years of experience building production-grade pipelines.
Python: Strong scripting and transformation capabilities.
SQL: Advanced querying, joins, window functions, and optimization.
Data Modeling: Knowledge of data warehousing concepts (star/snowflake schemas).
Azure Cloud: Familiarity with Azure data services and architecture.
Version Control: Experience with Git and DevOps workflows.
Debugging: Strong troubleshooting and problem-solving skills.
Communication: Ability to collaborate effectively with US-based stakeholders.

Preferred / Good-to-Have

Experience with streaming technologies (Kafka / Event Hub).
CI/CD pipelines using Azure DevOps.
Expertise in Data Lake / Delta Lake architecture.
Exposure to BI tools such as Power BI.
Experience in performance and cost optimization initiatives.

Work Conditions
Primarily Work From Home (WFH).
Mandatory once-a-week office visit in Noida.
Mandatory daily overlap with the US EST time zone.
Flexibility to work evening hours as required.

#DataEngineer #Databricks #AzureDataFactory #Python #SQL #DataEngineering #Hiring #NoidaJobs #WFH #Azure #ETL #BigData