Location: Hyderabad
Work Mode: Work from Office
Client: Singapore-based
Job Description:
Required Qualifications:
- Experience as a Data Engineer: 6+ years
- Skills: AWS (S3, Glue), Snowflake, Airflow, SQL, Python/PySpark
- Bachelor’s degree in Computer Science, Engineering, or a STEM-related field
- 4+ years of hands-on experience working with AWS data services
- Understanding of data warehousing, ETL concepts, and data quality checks.
- Familiarity with AWS IAM, logging, and monitoring (CloudWatch).
- Experience with CI/CD pipelines and Git is a plus
Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines using AWS Glue (PySpark / Python)
- Build and orchestrate workflows using AWS MWAA (Managed Workflows for Apache Airflow) with Python (see the DAG sketch after this list).
- Manage and optimize data storage and access in Amazon S3.
- Work with AWS DMS to support data migration and replication from source systems.
- Develop and maintain CloudFormation templates for infrastructure provisioning.
- Manage and monitor EC2 instances, including performance tuning and troubleshooting.
- Write shell scripts for automation, monitoring, and operational tasks.
- Perform data ingestion, transformation, and validation from Oracle and PostgreSQL databases, including Oracle PL/SQL.
- Work with Snowflake and Snowpipe, including integration with AWS.
- Write and optimize SQL queries for data extraction, reconciliation, and reporting.
- Monitor job executions, handle failures, and implement retry and alerting mechanisms.
- Collaborate with cross-functional teams to understand data requirements and deliver reliable data solutions.
- Ensure data security, compliance, and best practices across AWS services
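
Taken together, the orchestration, retry, and alerting responsibilities above might look something like the minimal MWAA/Airflow DAG below. This is only an illustrative sketch, not part of the posting: the DAG id, Glue job name, region, and SNS topic ARN are hypothetical placeholders.

```python
# Minimal sketch of a daily Glue job run with retries and failure alerting.
# All names (DAG id, job name, region, topic ARN) are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.amazon.aws.operators.sns import SnsPublishOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # retry a failed task twice
    "retry_delay": timedelta(minutes=5),   # wait between retries
}

with DAG(
    dag_id="daily_orders_etl",             # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    run_glue_job = GlueJobOperator(
        task_id="run_orders_glue_job",
        job_name="orders-etl",             # hypothetical Glue job
        region_name="ap-southeast-1",
    )

    alert_on_failure = SnsPublishOperator(
        task_id="alert_on_failure",
        target_arn="arn:aws:sns:...:data-alerts",  # hypothetical topic
        message="daily_orders_etl failed",
        trigger_rule="one_failed",         # fire only if the Glue task failed
    )

    run_glue_job >> alert_on_failure
```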
Good to Have:
- Experience with CI/CD pipelines for data workloads.
- Knowledge of partitioning, performance tuning, and cost optimization in AWS.
- Exposure to data governance, metadata management, or auditing frameworks.
- Prior experience in large-scale data migration or modernization projects
Key Skills:
- Strong AWS knowledge in designing, supporting, and optimizing data architecture (Amazon S3, AWS Glue, AWS RDS, AWS DMS, MWAA/Airflow, EC2, IAM)
- Hands-on experience with AWS Glue scripting (Python/PySpark) for ETL processing (see the sketch after this list)
- Good knowledge of Apache Airflow and AWS MWAA for job orchestration and monitoring
- Experience in AWS DMS for data replication and migration use cases
- Working knowledge of Amazon RDS and database integrations
- Experience managing and troubleshooting EC2 instances in production environments
- Hands-on experience in CloudFormation templates for infrastructure provisioning
- Strong knowledge in Python, SQL, PySpark, and UNIX shell scripting
- Basic to intermediate SQL knowledge in Oracle and PostgreSQL.
- Working experience with Oracle PL/SQL.
- Working experience with Snowflake.
- Good understanding of ETL concepts, data pipelines, data ingestion, and transformations
- Knowledge of cloud optimization techniques for performance, reliability, and cost
- Ability to document processes, workflows, and operational runbooks
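
As a rough illustration of the Glue scripting skill called out above, a pared-down PySpark ETL job might take the ingest-transform-load shape below. All database, table, column, and bucket names here are invented for the example.

```python
# Minimal sketch of a Glue ETL script: read from the Glue Data Catalog,
# apply a basic quality check and transform, write partitioned Parquet to S3.
# Database, table, column, and bucket names are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: read a source table registered in the Glue Data Catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw",                        # hypothetical catalog database
    table_name="orders",                   # hypothetical table
).toDF()

# Transform + validate: drop rows failing a basic data-quality check.
clean = (
    orders.filter(F.col("order_id").isNotNull())
          .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet back to S3 for downstream consumers.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/orders/"))  # hypothetical bucket

job.commit()
```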
Position Requirements:
10+ years of work experience