Data Lake Architect Auburn Hills, MI
Job in Auburn Hills, Oakland County, Michigan, 48326, USA
Listing for: ESR Healthcare
Full Time position
Listed on 2026-01-06
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Data Lake Architect - Auburn Hills, MI - Day 1 onsite (local candidates only)
Duration: Long Term
Experience level: 12+ Years
Job Description:
- Minimum of 10 years' experience with advanced technologies, including at least 5 years as a data lake administrator/architect
- Manage and maintain Data Lake cluster infrastructure on premises and in the cloud: installation, configuration, performance tuning, and monitoring of Hadoop clusters
- BS degree, preferably in Computer Science or equivalent
- Good communication skills and the right attitude to blend in with the team
- Minimum of 5 years' work experience with Hadoop ecosystems (Hortonworks HDP or Cloudera CDP)
- Strong concepts in Unix/Linux, Windows OS, cloud platforms (AWS, GCP), Kubernetes, OpenShift, and Docker
- Good exposure to Cloudera Manager, Cloudera Navigator, or a similar cluster management tool
- Collaborate with and assist developers in the successful implementation of their code; monitor and fine-tune their processes for optimum resource utilization on the cluster; ability to automate runtime processes
- Good knowledge of HDFS, Ranger/Sentry, Hive, Impala, Spark, HBase, Kudu, Kafka, Kafka Connect, Schema Registry, NiFi, Sqoop, and other Hadoop-related services
- Exposure to data science collaboration tools such as Data Science Workbench, CML, Anaconda, etc.
- Strong networking concepts: topology, proxy, F5, firewall
- Strong security concepts: Active Directory, Kerberos, LDAP, SAML, SSL, data encryption at rest
- Programming language concepts: Java, Perl, Python, PySpark, and Unix shell scripting
- Experience in cluster management: performing cluster upgrades, migration, and testing
- Periodic updates to the cluster, keeping the stack current
- Ability to expand clusters by adding new nodes and rebalancing cluster storage
- Manage application databases, application integration, users, roles, and permissions within the cluster
- Collaborate with OpenShift, Unix, network, database, and security teams on cluster-related matters
- Monitor the cluster for maximum uptime; ability to research cluster issues via logs and collaborate with support proactively
Technical Experience:
- Solid experience with Cloudera data lake environments, both on premises and in the cloud
- Solid experience in administration and setup, including security topics related to a data lake
- Strong experience architecting and designing solutions for new business needs
- Thorough understanding of, and hands-on experience with, implementing robust logging and tracing for end-to-end system traceability
- Familiarity with Cloudera's BDR tool to perform and monitor backups of critical data, and ability to restore data when needed
- Willing and ready to get hands-on with code development alongside the dev team for development and troubleshooting, and to build quick proofs of concept for exploring new solutions, products, etc.
- Experienced in working with technical teams to discuss, analyze, understand, and negotiate business requirements; able to explain to architects the technical considerations and their implications for the user journey, experience, and requirements
- Experience in tuning and optimizing the Hadoop environment, keeping clusters healthy and available for end users and applications with maximum cluster uptime as defined in the SLA
- Deep knowledge of, and experience with, Hadoop and its ecosystem components, i.e. HDFS, YARN, Hive, MapReduce, Pig, Sqoop, Oozie, Kafka, Spark, Presto, and other Hadoop components
Position Requirements: 5+ years' work experience