
Data Engineer (Analyst)

Job in 110006, Delhi, Delhi, India
Listing for: Canopus Infosystems - A CMMI Level 3 Company
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Warehousing, Data Science Manager
Job Description & How to Apply Below
Job Title:

Data Engineer (Analyst)

Experience:

2.5 to 5 Years

Location:

PAN India (Remote/On-site as applicable)

About the Role:

We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end to end.

Key Responsibilities:

Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL + Python
Integrate data from databases, APIs, SaaS tools, event data, and flat files
Design analytics-ready data models (star schemas/marts) for self-serve reporting
Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
Partner with stakeholders to define KPIs, metric logic, and reporting requirements
Maintain dashboards and reporting outputs in tools like Power BI, Tableau, Looker, or Sigma
Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
Tune performance and cost (incremental loads, partitioning, query optimization, file formats)
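The core of the responsibilities above, an incremental, idempotent load with a basic data-quality gate, can be sketched in a few lines of SQL + Python. This is an illustrative sketch only: SQLite stands in for the warehouse, and the `raw_events`/`mart_events` tables, columns, and watermark logic are invented for the example, not taken from the posting.

```python
import sqlite3

# SQLite as a stand-in warehouse; all table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events  (id INTEGER PRIMARY KEY, loaded_at TEXT, amount REAL);
    CREATE TABLE mart_events (id INTEGER PRIMARY KEY, loaded_at TEXT, amount REAL);
    INSERT INTO raw_events VALUES
        (1, '2024-01-01', 10.0),
        (2, '2024-01-02', 20.0),
        (3, '2024-01-03', 30.0);
""")

def incremental_load(conn):
    """Copy only rows newer than the mart's watermark; upsert makes re-runs safe."""
    (watermark,) = conn.execute(
        "SELECT COALESCE(MAX(loaded_at), '') FROM mart_events"
    ).fetchone()
    conn.execute(
        """
        INSERT INTO mart_events
        SELECT id, loaded_at, amount FROM raw_events WHERE loaded_at > ?
        ON CONFLICT(id) DO UPDATE SET
            loaded_at = excluded.loaded_at, amount = excluded.amount
        """,
        (watermark,),
    )
    conn.commit()

def quality_check(conn):
    """Minimal data-quality gate: no NULL amounts in the reporting table."""
    (nulls,) = conn.execute(
        "SELECT COUNT(*) FROM mart_events WHERE amount IS NULL"
    ).fetchone()
    assert nulls == 0, "NULL amounts found in mart_events"

incremental_load(conn)
incremental_load(conn)  # second run finds nothing past the watermark: idempotent
quality_check(conn)
print(conn.execute("SELECT COUNT(*) FROM mart_events").fetchone()[0])  # 3
```

Running the load twice loads each source row exactly once, which is the idempotency property the role's ETL/ELT patterns call for.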

Required Skills:

Strong SQL skills (CTEs, window functions, joins, aggregations, optimization)
Strong Python skills for transformations and automation
Hands-on experience with at least one cloud platform: AWS, Azure, or GCP
Experience with a modern data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
Comfort with data modeling for analytics and BI reporting
Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)
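The SQL skills listed above (CTEs and window functions in particular) are the kind used for "latest record per entity" reporting queries. A small runnable illustration, using SQLite via Python's standard library; the `orders` table and its data are invented for the example.

```python
import sqlite3

# Toy orders table; names and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 50.0),
        ('alice', '2024-02-01', 70.0),
        ('bob',   '2024-01-15', 40.0);
""")

# The CTE ranks each customer's orders by recency with a window function;
# the outer query keeps only the most recent order per customer.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer, order_date, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY order_date DESC
               ) AS rn
        FROM orders
    )
    SELECT customer, order_date, amount FROM ranked WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(rows)  # latest order per customer
```

The same pattern (rank within a partition, filter on the rank) transfers directly to Snowflake, BigQuery, Redshift, Synapse, and Databricks SQL.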

Nice to have:

Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
Streaming/event data: Kafka, Kinesis, Pub/Sub
Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
CI/CD + Git-based workflows for data pipelines