Data Engineer II
Raleigh, Wake County, North Carolina, 27601, USA
Listed on 2026-01-16
IT/Tech
Data Engineer, Database Administrator, Data Analyst, Data Warehousing
Overview
This is a remote role that may only be hired in the following locations: NC, AZ, TX
The enterprise data warehouse supports several critical business functions for the bank, including Regulatory Reporting, Finance, Risk Steering, and Customer 360. This role is vital for building and maintaining the enterprise data platform and data processes, and for supporting business objectives. It also provides production system support, resolving issues and ensuring ongoing functionality. The role may oversee the work of less experienced analysts or assist in special projects as needed.
Responsibilities
- Responsible for designing, building, and maintaining a data platform that supports data integrations for the Enterprise Data Warehouse, Operational Data Store, or Data Marts with appropriate data access, security, privacy, and governance.
- Establish enterprise‑scale data integration procedures, pipelines, and frameworks across the data development life cycle. Suggest and implement appropriate technologies to deliver resilient, scalable, and future‑proof data solutions.
- Create data ingestion pipelines in data warehouses and other large‑scale data platforms.
- Create scheduled as well as trigger‑based ingestion patterns using scheduling tools.
- Create performance‑optimized DDLs for any row‑based or columnar databases such as Oracle, Postgres, and Netezza based on the logical data model.
- Performance tune complex data pipelines and SQL queries.
- Perform impact analysis of proposed changes on existing architecture, capabilities, system priorities, and technology solutions.
- Manage deliverables of developers, perform design reviews, and coordinate release‑management activities.
- Estimate and provide timelines for project activities; identify, document, and communicate technical risks, issues, and alternative solutions discovered during the project.
- Drive automation, identify inefficiencies, optimize processes and data flows, and recommend improvements.
- Use agile engineering practices and various data development technologies to rapidly develop and implement efficient data products.
- Collaborate with Product Owners on PI goals, PI planning, requirement clarification, and delivery coordination.
- Provide technical support for production incidents and failures.
- Work with global technology teams across different time zones (primarily US) to deliver timely business value.
Qualifications
Bachelor's Degree and 2 years of experience in data engineering, database management, or a related field OR High School Diploma or GED and 6 years of experience in data engineering, database management, or a related field.
Preferred:
Functional Skills:
- Team Player: Support peers, team, and department management.
- Communication: Excellent verbal, written, and interpersonal communication skills.
- Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality.
- Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
- Attention to Detail: Ensure accuracy and thoroughness in all tasks.
Technical/Business Skills:
- Data Engineering:
- Experience designing and building data warehouses and data lakes. Good knowledge of data warehouse principles and concepts.
- Technical expertise working in large‑scale data‑warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server.
- Experience with public cloud‑based data platforms, especially Snowflake and AWS.
- Data integration skills:
- Expertise in designing and developing complex data pipelines and solutions using industry-leading ETL tools such as SAP BusinessObjects Data Services, Informatica Cloud Data Integration Services, and IBM DataStage.
- Experience with ELT tools such as DBT, Fivetran, and AWS Glue.
- Expert in SQL, with development experience in at least one scripting language (Python, etc.), and adept at tracing and resolving data integrity issues.
- Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, AWS Redshift, Google BigQuery).
- Data Model: Expertise in logical and physical data modeling using relational or dimensional modeling practices for high-volume ETL/ELT processes.
- Performance Tuning: Performance tuning of data pipelines and database objects to deliver optimal performance.
- Experience with GitLab version control and CI/CD processes.
- Experience working in the financial industry is a plus.
Benefits are an integral part of total rewards, and First Citizens Bank is committed to providing a competitive, thoughtfully designed, and quality benefits program to meet the needs of our associates. More information can be found at