Cloud Data QA Engineer
Job in Bay City, Bay County, Michigan, 48706, USA
Listed on 2026-03-04
Listing for: Go Digital Technology Consulting LLP
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, IT QA Tester / Automation, Data Warehousing
Job Description
Role Overview
We are seeking an ETL QA Analyst with strong hands‑on experience in validating data pipelines within AWS‑based cloud environments. The ideal candidate will have expertise in SQL‑based data validation, exposure to PySpark transformations, and experience ensuring data integrity across source, staging, and reporting layers. This role focuses on maintaining data accuracy, completeness, and consistency across enterprise data platforms and collaborating closely with data engineering teams in an Agile environment.
Key Responsibilities

ETL & Data Validation
- Perform ETL testing for AWS-based data pipelines.
- Validate data movement across source, staging, transformation, and reporting layers.
- Write complex SQL queries to validate:
  - Data transformations and business logic
  - Aggregations, joins, filters, and calculations
  - Record counts and reconciliation checks
- Perform source-to-target validation and data reconciliation.
- Test full loads, incremental loads, and data refresh processes.
- Identify data anomalies, integrity issues, and transformation errors.
- Validate data transformations executed using PySpark.
- Support testing of distributed data processing workflows in AWS environments.
- Ensure correctness and consistency of large datasets processed in cloud platforms.
- Use Python to automate repetitive data validation and reconciliation tasks.
- Maintain regression test cases as pipelines evolve.
- Work within modern data engineering tool chains (e.g., version control, CI/CD, orchestration tools) to support automated validation workflows.
- Work closely with data engineers and business stakeholders to review transformation logic and requirements.
- Log, track, and verify resolution of data defects.
- Participate in Agile/Scrum ceremonies and provide QA updates.
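The source-to-target reconciliation work described above boils down to running the same aggregate queries against both layers and comparing results. A minimal, self-contained sketch using Python's built-in sqlite3 in place of a warehouse connection (table and column names are hypothetical; in practice these queries would run against e.g. Redshift through a database driver):

```python
import sqlite3

# In-memory stand-in for staging and reporting tables. Illustrative only:
# real pipelines would query the actual source and target systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE rpt_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
    INSERT INTO rpt_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

def reconcile(conn, source, target):
    """Source-to-target check: row counts and summed amounts must match."""
    checks = {}
    queries = {
        "row_count": "SELECT COUNT(*) FROM {t}",
        "amount_sum": "SELECT ROUND(SUM(amount), 2) FROM {t}",
    }
    for name, sql in queries.items():
        src = conn.execute(sql.format(t=source)).fetchone()[0]
        tgt = conn.execute(sql.format(t=target)).fetchone()[0]
        checks[name] = (src, tgt, src == tgt)
    return checks

results = reconcile(conn, "stg_orders", "rpt_orders")
for check, (src, tgt, ok) in results.items():
    print(f"{check}: source={src} target={tgt} {'PASS' if ok else 'FAIL'}")
```

The same pattern extends to per-column checksums or key-level diffs; the point is that each check compares one aggregate computed identically on both sides.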
Required Skills & Experience
- Strong hands‑on ETL testing experience.
- Advanced SQL skills for data validation and reconciliation.
- Experience validating PySpark‑based transformations.
- Working experience in AWS‑based data environments (e.g., S3, Redshift, Glue, EMR, etc.).
- Solid understanding of ETL concepts and data warehousing fundamentals.
- Proficiency in Python for automation.
- Experience working in Agile delivery environments.
- Experience designing or contributing to automated data validation frameworks on AWS.
- Exposure to modern data tool chains and cloud‑native workflows.
- Basic understanding of data modeling concepts.
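Maintaining regression test cases as pipelines evolve, as the responsibilities above require, is often done by keeping data-quality checks in a registry that is re-run after every pipeline change. A hedged sketch in plain Python (check names, fields, and rules are illustrative, not from this posting):

```python
# Tiny registry of regression checks, re-runnable as pipelines evolve.
CHECKS = []

def regression_check(fn):
    """Decorator that registers a check function for later execution."""
    CHECKS.append(fn)
    return fn

@regression_check
def no_null_keys(rows):
    assert all(r["order_id"] is not None for r in rows), "null key found"

@regression_check
def amounts_non_negative(rows):
    assert all(r["amount"] >= 0 for r in rows), "negative amount"

def run_checks(rows):
    """Run every registered check; return (name, message) for each failure."""
    failures = []
    for check in CHECKS:
        try:
            check(rows)
        except AssertionError as exc:
            failures.append((check.__name__, str(exc)))
    return failures

sample = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": 20.5}]
print(run_checks(sample))
```

In a real framework the registry would typically be a pytest suite wired into CI/CD, but the structure is the same: named, independent checks whose failures are collected rather than aborting the run.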