ETL Developer
Job in
Austin, Travis County, Texas, 78716, USA
Listed on 2026-03-01
Listing for:
TechDigital Group
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Data Warehousing
Job Description
Required Skills
Strong ETL and SQL expertise, with working knowledge of Python.
- ETL concepts
- Data Warehousing concepts
- Advanced SQL Concepts
- Data Validation/Data Quality Check
- CI/CD techniques
- Programming languages: Java, Python
- Cloud platform: GCP
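As a flavor of the data validation/data quality work listed above, here is a minimal sketch in Python. It is illustrative only; the column names (`order_id`, `amount`) and the specific rules are hypothetical, not taken from the employer's actual pipelines.

```python
# Minimal data-quality check sketch: detect null keys, duplicate keys,
# and negative amounts in a batch of row dicts. Column names are hypothetical.

def run_quality_checks(rows):
    """Return counts of failed checks for a batch of row dicts."""
    failures = {"null_order_id": 0, "duplicate_order_id": 0, "negative_amount": 0}
    seen = set()
    for row in rows:
        oid = row.get("order_id")
        if oid is None:
            failures["null_order_id"] += 1
            continue  # remaining checks assume a usable key
        if oid in seen:
            failures["duplicate_order_id"] += 1
        seen.add(oid)
        if row.get("amount", 0) < 0:
            failures["negative_amount"] += 1
    return failures

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 5.0},     # duplicate key
    {"order_id": None, "amount": 3.0},  # missing key
    {"order_id": 2, "amount": -1.0},    # negative amount
]
print(run_quality_checks(batch))
```

In a production pipeline these checks would typically run as a gate before loading to the warehouse, with failures logged or routed to a quarantine table.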
Roles & Responsibilities
- Manage the end-to-end lifecycle of data pipelines, including extraction, transformation, and loading into the Google Cloud (GCP) data warehouse.
- Conduct in-depth data analysis and apply statistical methods to derive insights.
- Execute code development within the designated DEV environment.
- Perform comprehensive validation and testing prior to deploying code to UAT or other staging environments.
- Document and maintain records of all test outcomes.
- Facilitate the code submission process by preparing Change Lists for peer review.
- Oversee the final deployment of code across multiple environments, including UAT, PREPROD, and PROD.
- Develop and manage ETL data pipelines to populate the data warehouse using various custom and third-party systems.
- Create, deploy, and refine comprehensive full-stack Data and BI solutions, covering everything from extraction and storage to transformation and visualization.
- Utilize SQL and Python to build and maintain robust data analysis scripts.
- Provide ongoing support and development for dashboards and reports via Google PLx and Looker Studio.
- Enhance existing business intelligence tools and create new dashboards to drive organizational growth.
- Conduct detailed data examinations and apply statistical analysis techniques.
- Monitor performance and implement necessary infrastructure optimizations.
- Demonstrate excellent collaboration, interpersonal, and written communication skills, with the ability to work in a team environment.
- Candidates must possess 6-8 years of professional experience.
- This is a fast-paced project, so candidates with extensive hands-on experience will be most effective.
- Design, develop, and maintain scalable and robust ETL/ELT processes and data pipelines using various tools and technologies.
- Build and optimize data warehouses, data lakes, and other data storage solutions to support analytical and operational needs.
- Implement data quality checks and monitoring to ensure the accuracy, completeness, and consistency of data.
- Work with large datasets, performing data modeling, schema design, and performance tuning.
- Create data models that BI tools can easily consume for building dashboards.
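The extract, transform, load lifecycle described in the responsibilities above can be sketched in Python. This is a toy illustration under stated assumptions: the source and "warehouse" are in-memory stand-ins, whereas in this role they would be GCP services; the field names are hypothetical.

```python
# Minimal ETL sketch: extract raw CSV-like records, transform (cast and
# filter malformed rows), and load into a stand-in warehouse (a list).

def extract(source):
    """Pull raw records from a source (here: CSV-like strings)."""
    for line in source:
        yield line.strip().split(",")

def transform(records):
    """Cast the quantity to int and drop rows that fail the cast."""
    for name, qty in records:
        if qty.isdigit():
            yield {"name": name, "qty": int(qty)}

def load(rows, warehouse):
    """Append cleaned rows to the warehouse; return the new row count."""
    warehouse.extend(rows)
    return len(warehouse)

warehouse = []
raw = ["widget,3", "gadget,oops", "gizmo,7"]
count = load(transform(extract(raw)), warehouse)
print(count, warehouse)
```

Keeping the three stages as separate generator functions mirrors how real pipelines isolate extraction, transformation, and loading so each stage can be tested and swapped independently.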