Data Engineer
Job in Tucson, Pima County, Arizona, 85718, USA
Listing for: Holley Performance
Full Time position
Listed on 2026-01-12
Job specializations:
- IT/Tech: Data Engineer, Data Warehousing
Job Description
Overview:
This role focuses on backend development and integration work to build and maintain enterprise data warehouses and data lakes. The ideal candidate will possess a deep understanding of data architecture, ETL pipelines, and integration technologies, ensuring seamless data flow and accessibility across the organization.
Responsibilities:
- Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives.
- Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories.
- Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms.
- Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions.
- Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks.
- Monitor and troubleshoot data pipelines and systems to ensure high availability and performance.
- Stay up-to-date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations.
- Document technical designs, processes, and standards for the team and stakeholders.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field; equivalent experience considered.
- Proven experience (5+ years) as a Data Engineer or in a similar backend development role.
- Strong proficiency in programming languages such as Python, Java, or Scala.
- Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica).
- Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB).
- Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake).
- Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.
- Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML).
- Solid understanding of data governance, security, and compliance standards.
- Strong analytical and problem‑solving skills with attention to detail.
- Excellent communication and collaboration abilities.
Preferred Qualifications:
- Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.).
- Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka).
- Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics.
- Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins).
Please note:
Relocation assistance will not be available for this position.