
Data Engineer

Job in Gurgaon, Haryana, India
Listing for: EXL
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Cloud Computing, Data Analyst
Job Description & How to Apply Below
Role: Azure Data Engineer
Experience: 4+ years

Location: Pune / Gurugram / Bangalore / Hyderabad (Hybrid)

About the Role:

We are looking for an Azure Data Engineer with insurance domain knowledge.

Key Responsibilities:

Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
Build and maintain data integration workflows from various data sources to Snowflake.
Write efficient and optimized SQL queries for data extraction and transformation.
Work with stakeholders to understand business requirements, especially within insurance processes such as policy, claims, underwriting, billing, and customer data, and translate them into technical solutions.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Maintain and enforce data quality, governance, and documentation standards.
Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
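The data-quality responsibility above can be sketched as a minimal pre-load validation step of the kind that might run inside a pipeline stage. This is an illustrative sketch only: the field names (`policy_number`, `claim_amount`, `claim_status`), the status vocabulary, and the rules are assumptions, not part of this role's actual systems.

```python
# Minimal sketch of a pre-load data-quality check for insurance claim
# records. All field names and validation rules here are illustrative
# assumptions, not taken from the employer's actual data model.
from dataclasses import dataclass, field

REQUIRED_FIELDS = ("policy_number", "claim_amount", "claim_status")
VALID_STATUSES = {"OPEN", "APPROVED", "DENIED", "CLOSED"}


@dataclass
class QualityReport:
    passed: list = field(default_factory=list)  # records safe to load
    failed: list = field(default_factory=list)  # (record, error list) pairs


def validate_claims(records):
    """Split claim records into passing and failing sets before loading."""
    report = QualityReport()
    for rec in records:
        errors = [f"missing:{f}" for f in REQUIRED_FIELDS if rec.get(f) is None]
        amount = rec.get("claim_amount")
        if amount is not None and amount < 0:
            errors.append("negative:claim_amount")
        status = rec.get("claim_status")
        if status is not None and status not in VALID_STATUSES:
            errors.append("invalid:claim_status")
        if errors:
            report.failed.append((rec, errors))
        else:
            report.passed.append(rec)
    return report
```

In practice a check like this would quarantine the failing rows and emit metrics for pipeline monitoring, rather than silently dropping bad data.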
Must-Have Skills:

Strong experience with Azure Cloud Platform services.
Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
Proficiency in SQL for data analysis and transformation.
Hands-on experience with Snowflake and SnowSQL for data warehousing.
Practical knowledge of dbt (data build tool) for transforming data in the warehouse.
Experience working in cloud-based data environments with large-scale datasets.

Mandatory:  Strong insurance domain knowledge, including understanding of policy administration, claims processing, underwriting workflows, actuarial data, and regulatory/compliance standards (e.g., IRDAI, HIPAA where applicable).

Good-to-Have Skills:

Experience with IBM DataStage, Netezza, Azure Data Lake, Azure Synapse, or Azure Functions.
Familiarity with Python or PySpark for custom data transformations.
Understanding of CI/CD pipelines and DevOps practices for data workflows.
Exposure to data governance, metadata management, or data catalog tools.
Knowledge of business intelligence tools (e.g., Power BI, Tableau).

Qualifications:

Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
4+ years of experience in data engineering roles using Azure and Snowflake.
Strong problem-solving, communication, and collaboration skills.