
ETL Developer (Remote)

Remote / Online - Candidates ideally in
Springfield, Sangamon County, Illinois, 62777, USA
Listing for: Conexess Group
Remote/Work from Home position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Cloud Computing
Salary/Wage Range: 80,000 – 100,000 USD yearly
Job Description & How to Apply Below
Position: ETL Developer (Remote)

Conexess Group is aiding a large healthcare client in their search for an ETL Developer in a remote capacity. This is a long‑term opportunity with a competitive compensation package.

Please note we are unable to provide sponsorship or work corp‑to‑corp (C2C) for this role.

This position is responsible for collaborating with Claims and Enrollment, Payment Integrity, and Quality Auditing engineering, infrastructure, and business teams to design, build, and maintain solutions for these business areas. The role will be an integral part of one to two Scrum teams, handling software engineering needs under the leadership of a delivery manager. The ideal candidate will be expected to develop subject‑matter‑expert (SME) knowledge in the areas their teams own.

This role is central to ensuring the reliability, efficiency, quality, and compliance of the data systems that support critical business operations and decisions.

Responsibilities
  • Design, develop, and maintain scalable and reliable ETL processes using SQL Server Integration Services (SSIS)
  • Create and optimize complex SQL queries, stored procedures, and functions to support data transformation and business logic
  • Modify and enhance existing ETL processes to improve performance, scalability, and reliability
  • Perform data manipulation and ensure data quality and integrity across all systems
  • Work with large‑scale data warehousing environments, including Teradata
  • Collaborate with cross‑functional teams in an agile (Scrum) environment to deliver data‑related projects and initiatives
  • Use Jira for task management, tracking progress, and reporting on development activities
  • Manage and version‑control code in Git repositories, following best practices for branching, merging, and code reviews
  • Troubleshoot and resolve data‑related issues in a timely manner
  • Provide production support for existing data pipelines and processes, ensuring high availability and performance
  • Monitor, modify, and maintain ETL processes within the AWS ecosystem, leveraging services such as EC2, Lambda, and CloudWatch
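The SSIS responsibilities above follow the standard extract–transform–load pattern. As an illustrative sketch only (plain Python rather than an SSIS package, with hypothetical record and field names, not the client's actual pipeline), the flow looks like this:

```python
# Minimal ETL sketch: extract rows, enforce data-quality rules, load the clean set.
# All field names ("claim_id", "amount") are hypothetical; a real SSIS package
# would express each step as a data-flow component instead.

def extract(rows):
    """Extract step: in practice this would read from a source system."""
    return list(rows)

def transform(rows):
    """Transform step: validate and reshape each record."""
    clean = []
    for row in rows:
        # Data-quality check: reject records missing a claim ID or amount.
        if not row.get("claim_id") or row.get("amount") is None:
            continue
        clean.append({
            "claim_id": str(row["claim_id"]).strip(),
            "amount_usd": round(float(row["amount"]), 2),
        })
    return clean

def load(rows, target):
    """Load step: append validated rows to the target store."""
    target.extend(rows)
    return len(rows)

source = [
    {"claim_id": " C-100 ", "amount": "125.5"},
    {"claim_id": None, "amount": "10"},      # rejected: missing claim ID
    {"claim_id": "C-101", "amount": None},   # rejected: missing amount
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse)  # -> 1 [{'claim_id': 'C-100', 'amount_usd': 125.5}]
```

The separation into three functions mirrors how SSIS isolates sources, transformations, and destinations, which is what makes packages maintainable and testable.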
Requirements
  • Bachelor's degree and 4 years of Information Technology or relevant experience, OR a technical certification and/or college courses and 6 years of Information Technology experience, OR 8 years of Information Technology experience.
  • ETL expertise: proven experience with SSIS, including designing and building complex ETL packages from the ground up.
  • SQL proficiency: strong command of SQL, with the ability to write and optimize complex stored procedures, triggers, and queries.
  • Data manipulation: demonstrated ability to manipulate and transform large datasets accurately and efficiently.
  • AWS knowledge: foundational knowledge of the AWS ecosystem and hands‑on experience with EC2, Lambda, and CloudWatch for monitoring and managing data processes.
  • Agile experience: experience working in an agile development environment, with a solid understanding of Scrum principles and practices.
  • Version control: proficiency with Git repositories for code management.
  • Collaboration tools: familiarity with Jira for project and issue tracking.
  • TriZetto experience: experience with TriZetto products is highly desirable.
  • Business acumen: understanding of business priorities, industry trends, and market dynamics.
  • Process improvement: ability to simplify and standardize complex concepts and processes.
  • Communication: excellent oral and written communication skills.
  • Decision‑making: ability to prioritize and make trade‑off decisions to drive cross‑functional execution.
  • Adaptability: ability to introduce and manage change effectively.
  • Teamwork: a strong sense of teamwork and collaboration.
  • Analytical skills: strong analytical and problem‑solving abilities, with a proven track record of resolving complex data‑related challenges.
  • Organizational skills: highly organized and detail‑oriented, with a commitment to delivering high‑quality, accurate solutions.
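The SQL-proficiency requirement above centers on aggregate queries of the following shape. A minimal, self-contained illustration using Python's built-in sqlite3 module (the `claims` table and its columns are hypothetical, not the client's schema):

```python
import sqlite3

# Hypothetical claims table, used only to illustrate the kind of aggregate
# query this role writes and tunes; the client's real schema will differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, member_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C-1", "M-1", 100.0), ("C-2", "M-1", 250.0), ("C-3", "M-2", 75.0)],
)

# Total spend per member, keeping only members over a threshold.
rows = conn.execute(
    """
    SELECT member_id, SUM(amount) AS total
    FROM claims
    GROUP BY member_id
    HAVING total > 100
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # -> [('M-1', 350.0)]
```

On a production platform (SQL Server, Teradata), tuning such a query would also involve checking the execution plan and supporting indexes, which is the optimization work the bullet points describe.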
Preferred Qualifications
  • Bachelor's degree in Computer Science, MIS, or a related field.
  • Teradata experience: hands‑on experience with the Teradata database platform is highly desirable.
  • Cloud platforms: experience with cloud data platforms such as Azure Data Factory, AWS Glue, or similar services.
  • Data modeling: understanding of data modeling principles and best practices.
  • Communication skills: excellent verbal and written communication skills, with the ability to articulate technical concepts to both technical and non‑technical audiences.