
Software Engineering III

Job in Charlotte, Mecklenburg County, North Carolina, 28245, USA
Listing for: Equitable Advisors
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: 90,000 - 120,000 USD per year
Job Description & How to Apply Below

Job Description - Software Engineering III

  • Provide big data solutions for data lake design, data ingestion, and processing on a Hadoop cluster using Spark, MapReduce, Hive/HBase, Flume, Kafka, and Syncsort, along with programming languages such as Python and Scala.
  • Work with the Customer Data Product Owner to gather and translate business requirements, assessing their priority and criticality to advise on an order of deliverables that keeps the focus on key customer deliverables.
  • Work on the Databricks platform to execute and optimize data pipelines end to end.
  • Responsible for data gathering and analysis; systems design and implementation; logical design; detailed design; ensuring data security in the design; and system evaluation, integration, vetting, modification, troubleshooting, and optimization.
  • Serve as subject matter expert (SME) for Data Lake infrastructure and services.
  • Maintain current Data Lake applications and develop procedures, where necessary, to improve the environment as required. Comply with all security and audit standards.
  • Provide technical expertise for the development and implementation of Data Lake solutions.
  • Liaise with business unit customers and vendors depending on assignment and interact with IT Senior Executives.
  • Responsible for design specifications of one or more large or critical applications or systems.
  • Provide technical, functional and systems design for all work related to a system development project.
  • Lead the process of compiling, analyzing, designing, testing and prioritizing system design components and implementation.
  • Assist with technical testing, ensuring that system and unit tests are performed, and review the test results.
  • Provide production support for new/existing systems of high complexity and scope.
  • Use Linux, Hadoop, Sqoop, Hive, Impala, Tableau, Python, and Databricks to carry out job duties.
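
As an illustrative sketch only (not part of the posting), the ingest-and-validate step behind duties like "data ingestion and processing" might look like the following in plain Python. The field names (`event_id`, `ts`, `amount`) and the landing path are hypothetical stand-ins for a real schema and data contract; in practice this logic would run inside a Spark or Databricks job rather than a script:

```python
import json
from datetime import datetime, timezone
from typing import Optional

def validate_record(raw: str) -> Optional[dict]:
    """Parse one JSON event (e.g. consumed from a Kafka topic) and drop malformed rows."""
    try:
        rec = json.loads(raw)
    except json.JSONDecodeError:
        return None
    # Hypothetical required fields; a real schema would come from the data contract.
    if not all(k in rec for k in ("event_id", "ts", "amount")):
        return None
    return rec

def partition_path(rec: dict, base: str = "/datalake/raw/events") -> str:
    """Derive a date-partitioned landing path in the Hive-style dt=YYYY-MM-DD layout."""
    day = datetime.fromtimestamp(rec["ts"], tz=timezone.utc).strftime("%Y-%m-%d")
    return f"{base}/dt={day}/{rec['event_id']}.json"

batch = [
    '{"event_id": "e1", "ts": 1700000000, "amount": 12.5}',
    'not-json',                                  # malformed -> dropped
    '{"event_id": "e2", "ts": 1700086400}',      # missing "amount" -> dropped
]
clean = [r for r in (validate_record(x) for x in batch) if r is not None]
print(len(clean))                 # 1
print(partition_path(clean[0]))   # /datalake/raw/events/dt=2023-11-14/e1.json
```

The Hive-style `dt=` partition key shown here is what lets downstream Hive/Impala queries prune by date instead of scanning the full lake.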

Experience must include:
  • Sourcing & ETL development support to build multiple data products for analytical and actuarial purposes;
  • Hadoop technologies (Sqoop, Python, Databricks);
  • Azure data platform handling;
  • Building predictive models using sentiment scores to forecast market trends and assess the correlation between sentiment and market movements; and
  • HDFS, Hive, Impala, Sqoop, Spark, Python, Azure; ETL, Vertica, MapReduce, Spark, Kafka, Hive, Impala, Flume, Storm, ZooKeeper, Java, PL/SQL, Oracle, Teradata, Scala, MySQL, and Eclipse.
