Senior Data Engineer
Listed on 2026-03-05
IT/Tech
Data Engineer
Overview
Great Day Improvements - Senior Data Engineer (Twinsburg, OH / Hybrid)
Since its founding 13 years ago, Great Day Improvements, LLC has grown rapidly toward its vision of becoming one of the largest home improvement companies in the U.S. Headquartered in Twinsburg, Ohio, Great Day Improvements is a $1.5 billion, vertically integrated, direct-to-consumer provider of premium home improvement products.
The company’s family of brands includes Patio Enclosures®, Champion Windows and Home Exteriors®, Universal Windows Direct®, Apex Energy Solutions®, Stanek Windows®, Hartshorn Custom Contracting, Your Home Improvement Company, K Designers, Leafguard®, Englert®, and The Bath Authority.
With an expanding workforce of over 4,800 employees across 130 metropolitan markets throughout the U.S. and Canada, Great Day Improvements continues to rank among the top home improvement companies nationwide and is one of the fastest growing private companies in America.
Job Summary
The Senior Data Engineer spearheads digital transformation initiatives at Great Day Improvements, liaising with system architects and integrating data from various sources. This role is centered on the Databricks Lakehouse platform and requires deep expertise in Unity Catalog for data governance and access control, Delta Live Tables (DLT) for pipeline orchestration, and metadata-driven development patterns that maximize reuse, configurability, and maintainability across the data estate.
The ideal candidate will have proven experience in constructing and managing RDBMS and NoSQL ETL pipelines, migrations, and data management, with a focus on developing optimized data architecture. The Senior Data Engineer provides critical support to software developers, data analysts, and data scientists in data-related initiatives, ensuring that the data delivery architecture remains optimal across all ongoing projects.
This role requires an individual who is self-motivated, embraces AI-assisted development workflows, and is adept at addressing the data needs of multiple teams, systems, and products. The ideal candidate will be enthusiastic about contributing to the design and enhancement of the data infrastructure to support the continuous growth of Great Day’s portfolio of brands.
Location: Twinsburg, OH (Hybrid)
Pay: $160,000 per year
Responsibilities
Data Pipeline Design & Development
- Design, develop, and maintain scalable and reliable data pipelines that integrate data from multiple sources (CRM, ERP, etc.) into a cohesive data ecosystem
- Collaborate with stakeholders to understand data requirements and deliver comprehensive data models that support business needs
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
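As an illustrative sketch (not part of the role description itself), the multi-source integration described above often comes down to mapping each system's fields onto one shared schema. All field and source names below are hypothetical examples:

```python
# Illustrative sketch only: normalize records from two hypothetical
# source extracts (a CRM and an ERP) into one unified customer schema.
# Field names here are assumptions for the example.

def unify_customer(record: dict, source: str) -> dict:
    """Map a source-specific record onto a shared target schema."""
    # Per-source field mappings; a production pipeline would typically
    # drive these from external configuration rather than hardcoding.
    mappings = {
        "crm": {"cust_id": "customer_id", "full_name": "name"},
        "erp": {"account_no": "customer_id", "account_name": "name"},
    }
    mapping = mappings[source]
    unified = {target: record[src] for src, target in mapping.items()}
    unified["source_system"] = source  # keep lineage of where it came from
    return unified

crm_row = {"cust_id": "C-100", "full_name": "Pat Doe"}
erp_row = {"account_no": "A-77", "account_name": "Pat Doe"}

print(unify_customer(crm_row, "crm"))
print(unify_customer(erp_row, "erp"))
```

Keeping the per-source mappings in one place is what lets a pipeline absorb a new source system without rewriting downstream consumers.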
Architecture & Standards
- Analyze and improve existing data architectures to enhance performance and scalability within the Databricks Lakehouse platform
- Build and maintain metadata-driven pipeline frameworks that use external configuration (tables, JSON, YAML) to control pipeline behavior, schema mappings, transformations, and data flow—minimizing hardcoded logic and maximizing reusability
- Contribute to developing and documenting internal and external standards for pipeline configurations, naming conventions, partitioning strategies, and more
- Stay abreast of industry trends and technologies to drive innovation within the data management space
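A minimal sketch of the metadata-driven pattern described above: an external JSON configuration document (all keys, table names, and rules below are hypothetical) controls column mappings and a data-quality rule, so pipeline behavior changes without code changes:

```python
import json

# Minimal metadata-driven sketch: pipeline behavior is controlled by an
# external configuration document rather than hardcoded logic. The config
# keys and table names here are hypothetical examples.
CONFIG_JSON = """
{
  "pipeline": "crm_to_lakehouse",
  "source_table": "raw.crm_contacts",
  "target_table": "silver.contacts",
  "column_map": {"cust_id": "customer_id", "full_name": "name"},
  "drop_nulls_on": ["cust_id"]
}
"""

def apply_config(rows: list, config: dict) -> list:
    """Rename columns and filter rows according to the config document."""
    col_map = config["column_map"]
    required = config["drop_nulls_on"]
    out = []
    for row in rows:
        if any(row.get(col) is None for col in required):
            continue  # data-quality rule driven by config, not code
        out.append({col_map.get(k, k): v for k, v in row.items()})
    return out

config = json.loads(CONFIG_JSON)
rows = [
    {"cust_id": "C-1", "full_name": "Ada"},
    {"cust_id": None, "full_name": "Ghost"},  # dropped by the null rule
]
print(apply_config(rows, config))
```

In a Databricks setting the same idea scales up: configuration rows in a control table (or JSON/YAML files) parameterize a single generic notebook or DLT pipeline instead of one bespoke job per source.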
Data Governance & Quality
- Develop and enforce data governance policies and procedures to ensure data integrity and security
- Configure and maintain Unity Catalog securables (catalogs, schemas, tables, volumes) with appropriate grants and privilege hierarchies to enforce least-privilege access
- Ensure high operational efficiency and quality of data platform datasets for project reliability and accuracy through DLT expectations, data quality checks, and monitoring
- Implement and manage Master Data Management (MDM) strategies and solutions to ensure data accuracy, completeness, and consistency across the organization
- Leverage Unity Catalog’s data lineage and audit logging capabilities to support compliance, impact analysis, and operational transparency
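To illustrate the least-privilege grants mentioned above, here is a small sketch that renders Unity Catalog GRANT statements from an access config. The group names and securables are hypothetical; in practice the generated SQL would be applied through Databricks SQL:

```python
# Illustrative sketch: generate Unity Catalog GRANT statements from a
# least-privilege access config. Group names and securables below are
# hypothetical examples, not real workspace objects.

ACCESS_CONFIG = {
    # group: list of (securable type, fully qualified name, privileges)
    "analysts": [("SCHEMA", "main.silver", ["USE SCHEMA", "SELECT"])],
    "engineers": [("CATALOG", "main", ["USE CATALOG", "CREATE SCHEMA"])],
}

def build_grants(config: dict) -> list:
    """Render GRANT <privs> ON <type> <name> TO `<group>`; statements."""
    statements = []
    for group, entries in config.items():
        for sec_type, name, privileges in entries:
            privs = ", ".join(privileges)
            statements.append(
                f"GRANT {privs} ON {sec_type} {name} TO `{group}`;"
            )
    return statements

for stmt in build_grants(ACCESS_CONFIG):
    print(stmt)
```

Expressing grants as data rather than ad hoc SQL makes the privilege model reviewable and repeatable, which is the point of the governance duties listed above.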