
Senior Data Engineering Specialist

Job in Charlotte, Mecklenburg County, North Carolina, 28245, USA
Listing for: U.S. Bank
Full Time position
Listed on 2026-01-24
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description

At U.S. Bank, we’re on a journey to do our best: helping the customers and businesses we serve to make better and smarter financial decisions, and enabling the communities we support to grow and succeed. We believe it takes all of us to bring our shared ambition to life, and each person is unique in their potential. A career with U.S. Bank gives you a wide, ever-growing range of opportunities to discover what makes you thrive at every stage of your career.

Try new things, learn new skills and discover what you excel at—all from Day One.

Data Lake Architecture and Experience
  • Guide the team to migrate from on‑prem Cloudera to an Azure cloud environment.
  • Design and implement scalable data lake solutions using Snowflake and Databricks; develop and optimize data pipelines for ingestion, transformation, and storage (a pipeline sketch follows this list).
  • Manage data governance, quality, and security across cloud environments; implement performance tuning, automation, and CI/CD for data workflows.
  • Collaborate with cross‑functional teams to support cloud migration activities.
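
To make the pipeline work concrete, here is a minimal PySpark sketch of one ingestion step of the kind this role owns: raw files landed in ADLS are normalized and appended to a partitioned Delta table. It assumes a Databricks/Delta runtime, and the storage account, container, and column handling are hypothetical examples rather than details from the posting.

```python
# Minimal PySpark ingestion sketch; paths and names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-curated-ingest").getOrCreate()

# Read raw CSV files landed in an ADLS container (assumed example path).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/transactions/"))

# Normalize column names and stamp each row with its load date.
curated = (raw
           .select([F.col(c).alias(c.strip().lower().replace(" ", "_"))
                    for c in raw.columns])
           .withColumn("load_date", F.current_date()))

# Append into a Delta table in the curated zone, partitioned by load date.
(curated.write
 .format("delta")
 .mode("append")
 .partitionBy("load_date")
 .save("abfss://curated@examplelake.dfs.core.windows.net/transactions/"))
```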
Cloudera Cluster Management
  • Install, configure, manage, and monitor Cloudera Hadoop clusters, ensuring high availability, performance, and security (a monitoring sketch follows this list).
  • Manage HDFS, YARN, and other ecosystem components.
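
As one illustration of that monitoring work, below is a hedged Python sketch of a basic HDFS/YARN liveness probe built on the standard Hadoop and YARN CLIs; the choice of checks and the exit handling are assumptions, not a prescribed tool.

```python
# Lightweight HDFS/YARN health probe (illustrative, not exhaustive).
import subprocess
import sys

CHECKS = {
    "hdfs": ["hdfs", "dfsadmin", "-report"],    # datanode capacity and liveness
    "yarn": ["yarn", "node", "-list", "-all"],  # nodemanager states
}

for name, cmd in CHECKS.items():
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"{name} check failed:\n{result.stderr}", file=sys.stderr)
        sys.exit(1)
    print(f"{name} OK")
```

A real probe would parse the report output for dead nodes and feed an alerting system; this sketch only verifies that the services answer.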
Performance Optimization
  • Tune Hadoop, Hive, and Spark jobs and configurations for optimal performance, efficiency, and resource utilization.
  • Optimize queries, manage partitions, and leverage in‑memory capabilities (a tuning sketch follows this list).
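
For a concrete flavor of that tuning, here is a minimal PySpark sketch that sets common shuffle and executor knobs, enables adaptive query execution, caches a reused dimension table, and broadcasts it into a join. The table names and the specific values are placeholder assumptions to be sized against a real cluster, not recommendations.

```python
# Illustrative Spark tuning for a heavy join/aggregation job.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (SparkSession.builder
         .appName("tuned-aggregation")
         .config("spark.sql.shuffle.partitions", "400")  # match shuffle width to data volume
         .config("spark.sql.adaptive.enabled", "true")   # let AQE coalesce skewed partitions
         .config("spark.executor.memory", "8g")          # placeholder sizing
         .config("spark.executor.cores", "4")
         .getOrCreate())

facts = spark.table("sales.transactions")   # hypothetical large Hive table
dims = spark.table("sales.stores").cache()  # small, reused dimension: keep in memory

# Broadcasting the dimension avoids shuffling the large fact table.
joined = facts.join(broadcast(dims), "store_id")
joined.groupBy("region").count().show()
```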
Troubleshooting and Support
  • Diagnose and resolve issues related to Linux servers, networks, cluster health, job failures, and performance bottlenecks.
  • Provide on‑call support and collaborate with other teams to ensure smooth operations.
Security, Governance, and Secrets Management
  • Implement and manage security measures within the Cloudera environment, including Kerberos, Apache Ranger, and Atlas, to ensure data governance and compliance.
  • Set up and manage HashiCorp Vault for secure key and secret management (a Vault sketch follows this list).
  • Use CyberArk for privileged access management and secure administrative tasks on the cluster.
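
As an example of the secrets‑management pattern, this is a minimal sketch of fetching a credential from HashiCorp Vault with the hvac Python client. The Vault address, mount point, and secret path are hypothetical, and token auth stands in for whatever auth method (Kerberos, LDAP, AppRole) the environment actually uses.

```python
# Minimal HashiCorp Vault read via hvac (KV v2); all names are placeholders.
import hvac

client = hvac.Client(url="https://vault.example.com:8200")
client.token = "s.exampletoken"  # placeholder; never hard-code real tokens

secret = client.secrets.kv.v2.read_secret_version(
    mount_point="secret",
    path="cloudera/hive-metastore",
)
db_password = secret["data"]["data"]["password"]  # KV v2 nests the payload under data.data
```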
Data and Application Migration
  • Move Hadoop, Hive, and Spark data and applications to Azure cloud services such as Azure Synapse Analytics, Azure Databricks, or Snowflake.
  • Ensure data integrity through validation and post‑migration performance tuning (a validation sketch follows this list).
  • Develop automation scripts (e.g., shell, Ansible, Python) for administrative tasks, deployments, and monitoring.
  • Work with users to develop, debug, and optimize Hive, Spark, and Python programs that connect to the Cloudera environment.
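
To ground the validation step named above, here is a hedged PySpark sketch that compares row counts and a simple numeric checksum between a source Hive table and its migrated Azure copy; the table name, ADLS path, and checksum column are assumed examples.

```python
# Post-migration validation sketch: row counts plus a cheap column checksum.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("migration-validation").getOrCreate()

source = spark.table("warehouse.orders")  # on-prem Hive table (hypothetical)
target = spark.read.format("delta").load(
    "abfss://curated@examplelake.dfs.core.windows.net/orders")  # migrated copy

def profile(df):
    # Row count and a sum over one numeric column as an integrity signal.
    return df.agg(F.count("*").alias("rows"),
                  F.sum("order_total").alias("total")).first()

src, tgt = profile(source), profile(target)
assert src.rows == tgt.rows, f"row count mismatch: {src.rows} vs {tgt.rows}"
assert src.total == tgt.total, "checksum mismatch on order_total"  # use a tolerance for floats
print("validation passed:", src.rows, "rows")
```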
Documentation
  • Create and maintain documentation for system configurations, operational procedures, and troubleshooting knowledge bases.
Vendor Collaboration
  • Work closely with the Cloudera vendor to stay current with releases, perform upgrades, and address vulnerabilities.
Basic Qualifications
  • Advanced degree in Computer Science, Engineering, or a related field.
  • Deep expertise in Data Engineering and Management technologies, synthetic data, automation, and advanced analytics.
  • 10+ years of hands‑on experience in data engineering, cloud platform management, and performance optimization.
  • Hands‑on experience with Hadoop, Hive, Spark, and migration of Big Data into Azure cloud services.
  • Experience with Databricks and Snowflake for data integration and lake architectures.
  • Working knowledge of Microsoft Azure cloud and big data migration to cloud platforms.
  • Experience with HashiCorp Vault and CyberArk for secrets and privileged access management.
  • Proficiency in Linux, clustering, and distributed systems.
  • Expertise in Hive, Spark, Hadoop ecosystem components (HDFS, YARN, Sqoop).
  • Proficiency in shell, Ansible, C/C++, Java, Python, and PySpark for automating workflows, deployments, and monitoring.
  • Expertise in Linux, networking, Python scripting, DNS, Kerberos, LDAP/AD, MySQL, PostgreSQL, and JupyterHub.
  • Experience in creating and maintaining documentation for system configurations, operational procedures, and troubleshooting knowledge bases.
  • Strong problem‑solving skills and the ability to diagnose and resolve system failures and performance bottlenecks.
  • Excellent communication and collaboration skills to work effectively with…
Position Requirements
10+ Years work experience