
Senior Data Engineer

Job in Bengaluru, 560001, Bangalore, Karnataka, India
Listing for: DigiKey Global Capability Center
Full Time position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Cloud Computing, Data Analyst
Job Description
Location: Bengaluru

About the Role
We are seeking a highly experienced and strategic Senior Data Engineer to lead the development and optimization of our modern cloud data platform infrastructure. This role is ideal for someone who thrives in a fast-paced environment, is passionate about data architecture, and has a deep understanding of data transformation, modeling, and orchestration using modern tools like dbt-core, Snowflake, and Python.

Key Responsibilities
Design and implement scalable data pipelines using dbt-core, Python, and SQL to support analytics, reporting, and data science initiatives (see the sketch after this list).
Design and optimize data models in Snowflake to support efficient querying and storage.
Develop and maintain our data warehouse, ensuring data quality, governance, and performance.
Collaborate with cross-functional teams including data analysts, data architects, data scientists, and business stakeholders to understand data needs and deliver robust solutions.
Establish and enforce best practices for version control (Git), CI/CD pipelines, and data pipeline monitoring.
Mentor and guide junior data engineers, fostering a culture of technical excellence and continuous improvement.
Evaluate and recommend new tools and technologies to enhance the data platform.
Provide ongoing support for existing ELT/ETL processes and procedures.
Identify tools and technologies to be used in the project, as well as reusable objects that can be customized for it.
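
For context, a pipeline step of the kind described above might look like the minimal Python sketch below, which stages a local extract and bulk-loads it into Snowflake using the snowflake-connector-python library. All identifiers (account, warehouse, database, schema, table, stage, and file path) are hypothetical placeholders; in practice a step like this would run inside an orchestrated job, with downstream transformations maintained as dbt-core models under Git.

    # Minimal ELT load sketch: stage a local CSV and COPY it into Snowflake.
    # All identifiers below are hypothetical placeholders.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="RAW",             # hypothetical database
        schema="SALES",             # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Upload the file to a named internal stage, then bulk-load it.
        cur.execute("PUT file:///tmp/orders.csv @orders_stage OVERWRITE = TRUE")
        cur.execute(
            "COPY INTO orders_raw FROM @orders_stage "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()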

Required Qualifications
Bachelor's degree in Computer Science or a related field (16 years of formal education related to engineering).
6+ years of experience in data engineering or a related field.
Expert-level proficiency in SQL and Python for data transformation and automation.
Experience with dbt-core for data modeling and transformation.
Strong hands-on experience with cloud platforms (Microsoft Azure) and cloud data platforms (Snowflake).
Proficiency with Git and collaborative development workflows.
Familiarity with Microsoft VS Code or similar IDEs.
Knowledge of Azure DevOps or GitLab development operations and job scheduling tools.
Solid understanding of modern data warehousing architecture, dimensional modeling, ELT/ETL frameworks, and data modeling techniques.
Excellent communication skills and the ability to translate complex technical concepts for non-technical stakeholders.
Proven expertise in designing and implementing batch and streaming data pipelines to support near real-time and large-scale data processing needs (a minimal streaming sketch follows this list).
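
As a rough illustration of the streaming side of this qualification, the sketch below consumes events from a Kafka topic with the confluent-kafka Python client. The broker address, topic, and consumer group are hypothetical placeholders, and landing the records in Snowflake is indicated only as a comment.

    # Minimal streaming-consumption sketch using the confluent-kafka client.
    # Broker, topic, and group id are hypothetical placeholders.
    from confluent_kafka import Consumer, KafkaError

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",  # hypothetical broker
        "group.id": "orders-pipeline",       # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])           # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # wait up to 1s for a record
            if msg is None:
                continue
            if msg.error():
                # Ignore benign end-of-partition events; surface real errors.
                if msg.error().code() != KafkaError._PARTITION_EOF:
                    raise RuntimeError(msg.error())
                continue
            # A real pipeline would validate each record and land it in
            # Snowflake (e.g., via Snowpipe) for near real-time analytics.
            print(msg.key(), msg.value())
    finally:
        consumer.close()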

Preferred Qualifications
Experience working in a cloud-native environment (AWS, Azure, or GCP).
Familiarity with data governance, security, and compliance standards.
Prior experience with Apache Kafka (Confluent).
Artificial Intelligence (AI) experience is a plus.
Hands-on experience with orchestration tools (e.g., Airflow, Prefect) is a plus; a minimal example follows this list.
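
By way of illustration, a minimal Airflow orchestration of a nightly dbt-core build might look like the Python sketch below. The DAG id, schedule, and project path are hypothetical placeholders, and Prefect or another scheduler could fill the same role.

    # Minimal orchestration sketch: a daily Airflow DAG that runs dbt build.
    # DAG id, schedule, and project path are hypothetical placeholders.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",
        start_date=datetime(2026, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        run_dbt = BashOperator(
            task_id="dbt_build",
            # Runs all dbt-core models and tests in the project directory.
            bash_command="cd /opt/airflow/dbt_project && dbt build",
        )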
Position Requirements
10+ years of work experience