Lead Data Engineer – Qlik​/Snowflake​/DBT

Job in Raleigh, Wake County, North Carolina, 27601, USA
Listing for: First Citizens Bank
Full Time position
Listed on 2026-02-12
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range: 60,000 to 80,000 USD per year
Job Description & How to Apply Below
Position: Lead Data Engineer – Qlik/Snowflake/DBT

Overview

This is a remote role that may only be hired in the following locations: NC, TX, AZ

This position is responsible for architecting, designing, implementing, and optimizing complex data solutions. The role leads the design and development of sophisticated data pipelines, warehouses, and analytics platforms using modern, cloud‑based technologies in an agile environment, mentors junior team members, and collaborates with stakeholders to drive data‑driven decision‑making while enforcing data governance standards.

Responsibilities
  • Data Architecture and Strategy - Design and implement scalable, efficient data architectures. Lead the development of data strategy aligned with business objectives. Evaluate and integrate new technologies to enhance data capabilities
  • Hands‑on Data Pipeline Development – As a technical lead, implement complex data pipelines for real‑time and batch processing. Optimize data flows for high‑volume, high‑velocity data environments. Develop advanced ETL processes for diverse data sources
  • Data Governance and Quality Management – Establish and enforce data governance policies and best practices. Implement data quality frameworks and monitoring systems. Ensure compliance with data regulations and standards
  • Performance Optimization and Troubleshooting – Analyze and optimize system performance for large‑scale data operations. Troubleshoot complex data issues and implement robust solutions
  • Mentorship and Knowledge Sharing – Mentor junior data engineers and provide technical guidance. Stay current on data technologies and recommend best practices and standards. Collaborate with cross‑functional teams and leadership to drive data literacy
  • Testing & Automation – Write and enforce unit test cases, validate and review data integrity and consistency results, and drive automated data pipelines using GitLab, GitHub, and CI/CD tools
  • Code Deployment & Release Management – Review and approve code promotions, and enforce release management procedures for promoting code to environments including production and disaster recovery, along with related support activities.
  • Collaborate with business stakeholders to understand data requirements and translate them into an efficient architecture solution.
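The Testing & Automation responsibility above calls for unit tests that validate data integrity and consistency between source and target. A minimal sketch of such a check in plain Python; the function name and row structure here are illustrative, not part of the bank's actual test framework:

```python
# Illustrative source-to-target data integrity check, the kind of unit
# test the Testing & Automation responsibility describes. Row sets are
# represented as lists of dicts; in practice they would come from
# source and target table queries.

def integrity_report(source_rows, target_rows, key):
    """Compare two row sets on a key column.

    Returns a summary of row-count parity plus any keys present on
    one side but missing from the other.
    """
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "counts_match": len(source_rows) == len(target_rows),
    }


if __name__ == "__main__":
    source = [{"id": 1}, {"id": 2}, {"id": 3}]
    target = [{"id": 1}, {"id": 3}]
    report = integrity_report(source, target, key="id")
    print(report["counts_match"])        # False: one row was dropped
    print(report["missing_in_target"])   # [2]
```

A real pipeline would run checks like this in CI (GitLab/GitHub) after each load and fail the job when the report shows a mismatch.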
Qualifications

Bachelor's degree and 6 years of experience in advanced data engineering, enterprise architecture, and project leadership; OR a high school diploma or GED and 10 years of experience in the same areas

Preferred Technical/Business Skills
  • Deep expertise in designing and building robust, metadata‑driven, automated data pipeline solutions leveraging modern cloud‑based data technologies and tools for large data warehouse and database platforms.
  • Deep experience applying data security and governance methodologies to meet data compliance requirements.
  • Strong hands‑on experience designing and building medallion‑architecture ELT pipelines, Snowpipe, and streaming frameworks using Qlik Replicate, DBT Cloud transformations, Snowflake, and GitLab with CI/CD.
  • Strong experience designing and building data integrity solutions across multiple data sources and targets such as SQL Server, Oracle, Mainframe DB2, flat files, and Snowflake.
  • Strong design and development experience in Python/PySpark and advanced SQL for ingestion frameworks and automation.
  • Strong experience architecting and building solutions using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS.
  • Strong experience working with structured and semi‑structured data files: CSV, fixed‑width, JSON, XML, Excel, and mainframe VSAM.
  • Strong orchestration experience using DBT Cloud and Astronomer Airflow.
  • Experience designing and implementing logging, monitoring, alerting, and observability using tools like Dynatrace.
  • Strong experience in problem solving and performance tuning.
  • Experience designing and implementing schema drift detection and schema evolution patterns.
  • Strong experience designing and implementing sensitive data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role‑based masking rules.
  • Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems.
  • Strong experience in enforcing, adopting…
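One of the preferred skills above is schema drift detection and schema evolution. A minimal sketch of the detection half in plain Python, comparing a table's current columns against a registered baseline; the column names, types, and function name are hypothetical, not taken from the posting:

```python
# Illustrative schema drift detection: compare a table's current
# {column_name: declared_type} mapping against a stored baseline and
# classify the drift into the three cases an evolution policy
# typically handles (auto-add, fail, or cast).

def detect_drift(baseline: dict, current: dict) -> dict:
    """Return added columns, removed columns, and type changes."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(
        col for col in set(baseline) & set(current)
        if baseline[col] != current[col]
    )
    return {"added": added, "removed": removed, "type_changed": changed}


if __name__ == "__main__":
    baseline = {"id": "NUMBER", "name": "VARCHAR", "amount": "NUMBER"}
    current = {
        "id": "NUMBER",
        "name": "VARCHAR",
        "amount": "VARCHAR",      # type changed upstream
        "loaded_at": "TIMESTAMP", # new column appeared
    }
    print(detect_drift(baseline, current))
    # {'added': ['loaded_at'], 'removed': [], 'type_changed': ['amount']}
```

In a metadata‑driven pipeline, a check like this would run before each load, auto‑adding benign new columns and failing fast on type changes or dropped columns.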