
Hadoop Developer

Job in Riverwoods, Lake County, Illinois, USA
Listing for: Info-Ways
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Big Data
Salary/Wage Range: 80,000 – 100,000 USD yearly
Job Description & How to Apply Below

Background verification (BGV) will be conducted for selected candidates.

Job Duties
  • The Senior/Lead Hadoop Developer is responsible for designing, developing, testing, tuning and building a large‑scale data processing system for data ingestion and data products, allowing the client to improve the quality, velocity and monetization of its data assets for both operational applications and analytical needs.
  • Design, develop, validate and deploy the ETL processes.
  • Must have used HADOOP (PIG, HIVE, SQOOP) on HORTONWORKS Distribution.
  • Responsible for the documentation of all Extract, Transform and Load (ETL) processes.
  • Maintain and enhance ETL code, work with the QA and DBA team to fix performance issues.
  • Collaborate with the Application team to design and develop required ETL processes, performance tune ETL programs/scripts.
  • Work with business partners to develop business rules and business rule execution.
  • Perform process improvement and re‑engineering with an understanding of technical problems and solutions as they relate to the current and future business environment.
  • Design and develop innovative solutions for demanding business situations.
  • Help drive cross‑team design and development through technical leadership and mentoring; work with an offshore team of developers.
  • Analyze complex distributed production deployments, and make recommendations to optimize performance.
Essential Skills
  • Minimum 3 years of ETL experience with RDBMS and Big Data strongly preferred; experience with Informatica or DataStage may be considered as an alternative.
  • 2+ years of experience in creating reports using TABLEAU.
  • Proficiency with HORTONWORKS Hadoop distribution components and custom packages.
  • Proven understanding and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume and/or MapReduce.
  • Excellent RDBMS (Oracle, SQL Server) knowledge for development using SQL/PL SQL.
  • 6+ years’ experience in UNIX OS and Shell Scripting.
  • 3+ years’ experience with job scheduling tools such as AutoSys.
  • 3+ years’ experience in Pig and Hive Queries.
  • 3+ years’ hands‑on experience with Oozie.
  • 3+ years’ experience importing and exporting data using Sqoop between HDFS and relational database systems, Teradata and mainframes.
  • Must have 2+ years’ experience working with Spark for data manipulation, preparation and cleansing.
Additional Information

All your information will be kept confidential according to EEO guidelines.
