Data Engineer
Job in 500016, Prakāshamnagar, Telangana, India
Listed on 2026-02-07
Listing for: ValueLabs
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Warehousing
Job Description & How to Apply Below
ValueLabs is hiring for a Data Engineer
Experience: 7 Years
Location: Hyderabad
Mode of Work: Hybrid
Shift Timings: 2 PM to 11 PM
Mandatory skill set: SQL Server, SSIS, SSRS, AWS Glue, Snowflake, dbt
Summary:
Design, build, and optimize end‑to‑end data pipelines and analytics solutions across on‑prem SQL Server and modern cloud platforms. Develop reliable ETL/ELT processes, curate data models, and deliver high‑quality reports and datasets to enable data-driven decision-making.
Key Responsibilities:
- Design and develop SQL Server database objects (schemas, tables, views, stored procedures), optimize queries and indexes, and ensure data quality, performance, and security.
- Build and orchestrate ETL workflows using SSIS (on-prem) and AWS Glue (serverless PySpark jobs, Crawlers, Data Catalog) to ingest, transform, and integrate data from diverse sources.
- Implement ELT transformations using dbt (models, tests, documentation, CI/CD), following modular, version-controlled best practices.
- Develop and maintain SSRS paginated reports and dashboards; translate business requirements into performant datasets and visualizations.
- Engineer scalable solutions on Snowflake (virtual warehouses, micro-partitioning, Time Travel, Snowpipe) and tune workloads for cost and performance.
- Automate jobs, monitor data pipelines, handle errors and retries, and ensure SLAs and data reliability (logging, alerting, and observability).
- Collaborate with product, analytics, and BI teams to define data contracts, semantic layers, and governance standards.
Qualifications:
- Strong SQL expertise and performance tuning in SQL Server; proficiency with SSIS packages and SSRS reporting.
- Hands-on experience with AWS Glue (PySpark), dbt (Core/Cloud), and Snowflake (SnowSQL, roles, warehouses, streams/tasks, Snowpipe).
- Solid understanding of data modeling (3NF, star, and snowflake schemas), ELT vs. ETL patterns, and CI/CD for data.
- Familiarity with Git, job orchestration, and cost/performance optimization in the cloud.
- Excellent communication and stakeholder management for requirements gathering and documentation.
- Power BI for reporting.