
ETL Team Lead

Job in Greenville, Greenville County, South Carolina, 29610, USA
Listing for: CanalConnect
Full Time position
Listed on 2026-01-15
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below

Canal Insurance Company specializes in insurance for commercial trucking and specialty transportation operations. Canal was founded in 1939 and is located in Greenville, South Carolina. At Canal, we recognize that our success would not be possible without the hard work and dedication of our employees. We know that happiness and productivity go hand in hand, and to that end, we consciously cultivate a culture that enables us to recruit and retain the very best talent in the business.

Culture With YOU in Mind
  • Located in beautiful downtown Greenville, SC
  • Employee referral program
  • Casual dress code
  • Innovation-focused & customer-centric
  • 80+ years of industry expertise
  • Committed to giving back to our community
  • Unquestioned integrity and commitment
Benefits at Canal
  • Basic & Voluntary Life Insurance Plans
  • Short Term & Long Term Disability
  • 401(k) plan with company match up to 6%
  • Flexible Spending Accounts
  • Employee Assistance Programs
  • Generous PTO Plan
Major Accountabilities
  • Production Support, Operations & Reliability
    • The ETL Team Lead owns end-to-end operational support for Canal's existing data stack:
      • Monitor daily ETL loads across SQL jobs, DHIC (GW Data Hub and Info Center) and legacy SSIS packages.
      • Work with AMS team where necessary to troubleshoot pipeline failures, performance issues, schema mismatches, permissions issues, and cloud resource failures.
      • Work with AMS team where necessary to perform root-cause analysis and to implement permanent fixes.
      • Ensure SLA adherence and on-time delivery of critical reporting datasets for scheduled ETL jobs.
      • Provide direction for both AMS and ETL Developers for Legacy & Current ETL maintenance.
      • Refactor or retire outdated or redundant ETL processes.
  • Maintain and improve existing pipelines that utilize the following technologies:
    • Microsoft SQL Server Database programming
    • T‑SQL Scripting
    • SQL Server Integration Services
    • Microsoft PowerShell
    • Guidewire Data Hub and Info Center
    • Oracle Database programming
    • Oracle PL‑SQL Scripting
    • SAP BODS (SAP BusinessObjects Data Services)
    • PostgreSQL Scripting
  • Operational Excellence
    • Work with AMS team to assist with the creation and/or enhancement of operational runbooks, SOPs, monitoring dashboards, and incident response workflows.
    • Partner with other IT operational segments, business SMEs, and AMS team to minimize downtime and to ensure that business SLAs are met.
    • Improve existing proactive monitoring of daily processing and implement new monitoring where gaps exist.
  • Business Continuity
    • Work with AMS team to ensure development support coverage for critical data pipelines (rotation-based).
    • Support month‑end and quarter‑end financial reporting cycles.
    • Coordinate production releases and validate deployments.
    • The ETL Team Lead will become the steady‑state technical owner of the entire data operations layer during the Canal modernization journey.
  • Technical Leadership & Collaboration
    • Serve as technical lead guiding onshore/offshore developers.
    • Review code, enforce best practices, and mentor junior engineers.
  • Partner with Scrum Masters, Project Managers, Enterprise Architecture, QA Automation, Change Management, and AMS support teams.
  • Data Ingestion, ETL/ELT Development & Optimization
    • Develop reusable ingestion patterns for Guidewire Data Hub and Info Center, HubSpot, telematics, and other facets of the business.
    • Work with Canal Architects to modernize existing ETL workloads using Delta Lake, Medallion Architecture, and Fabric Lakehouse.
    • Build scalable data ingestion pipelines using potential upcoming technology (Azure Data Factory, MS Fabric, Databricks, Synapse Pipelines, etc.).
    • Work to bring internal and external integration data into the platform.
  • Apply proven experience in designing and implementing real-time data pipelines using Event Hub, Fabric Real-Time Analytics, Databricks Structured Streaming, and KQL‑based event processing.
  • Develop and enable real-time operational insights and automation capabilities through event‑driven architectures and streaming analytics.
  • Lead the strategy, design, and engineering of Canal’s modern Azure data ecosystem using next‑generation tools and Medallion Architecture, including:
    • Implementing Medallion Architecture (Bronze/Silver/Gold) across Fabric Lakehouse, Warehouse,…