Technical Lead, IT/Tech, Data Engineer
Job in Eugene, Lane County, Oregon, 97403, USA
Listed on 2026-02-28
Listing for: Maximus
Full Time position
Job specializations:
- IT/Tech
- Data Engineer
Job Description
Essential Duties and Responsibilities
- Design and build high-level architecture documentation.
- Collaborate with other teams across the organization to define the supporting infrastructure and tools needed.
- Evaluate new technologies and industry best practices to identify their suitability for adoption by the organization.
- Articulate and present the implications of design/architectural decisions, issues, and plans to leadership.
- Initiate, plan, document, manage and maintain technical projects.
- Lead and guide the work of technical staff and serve as the liaison between business and technical aspects of projects.
- Create, update, and manage high-quality project documentation including executive briefings and reports.
- Develop and maintain project schedule(s).
- Coach, mentor, motivate and supervise technical project team members and contractors, and influence them to take positive action and accountability for their assigned work.
- Lead the team in production and pre-production troubleshooting sessions to identify issues, performance bottlenecks and formulate a strategy to remediate as needed.
- Ensure team delivers all project artifacts as discussed and agreed upon with the IRS client.
- Maintain direct responsibility and ownership in developing and deploying components or complete application functionality using Databricks, Informatica, SQL and other tools.
- Design software components from business requirements in collaboration with other team members.
- Support high-level architecture design.
- Provide technical leadership for developing highly performant, robust, and reliable solutions that scale.
- Position is remote within the US. Ideal candidates will be located near Lanham, MD; Farmers Branch, TX; or Austin, TX. Local candidates may occasionally attend meetings at the IRS facility.
Minimum Requirements
- Bachelor's degree in a related field.
- 7 years of relevant professional experience required.
- Equivalent combination of education and experience considered in lieu of degree.
- Computer Professional Job Profile
- Bachelor's Degree from an accredited college or university required; an additional four (4) years of related work experience can substitute for a degree.
- Minimum of four (4) years of experience leading a technical team in a production environment.
- At least ten (10) years of hands‑on experience working with databases / ETL applications building data pipelines.
- Must possess an active IRS MBI.
- Minimum of two (2) years' experience working with IRS systems and data such as IRTF, IRMF, CADE2, IMF, BMF, BRTF and others.
- Experience integrating a range of technologies in a large federal IT environment such as the IRS.
- Knowledge of cloud-based technologies and experience with on-premises-to-cloud migrations.
- Implementation knowledge of AWS, Java, Python, REST APIs, and various data platforms such as Oracle, Databricks, Redshift, and others.
- Experience with performance optimization of data pipelines.
- Expertise in building CLI-based solutions using a variety of available tools.
- Knowledge of Databricks and ability to write SQL queries and Python scripts to query and manipulate data.
- Knowledge of ETL and BI tools and the ability to understand and support the application's ETL and BI functionality.
- Strong problem-solving skills, with architectural, design, and hands-on development experience.
- Experience working in, and a general understanding of, Red Hat Linux or other Unix-like operating systems.
- Experience with Shell scripting.
Skills and Qualifications
- Experience building data pipelines moving data from on‑premises to Databricks running on AWS cloud in the IRS environment.
- Expertise with AWS DMS and AWS CLI tools.
- Experience building complex data pipelines using Control‑M as the orchestration tool.
- Hands‑on implementation experience with data migration, data wrangling and data manipulation.
- Proven experience using COTS products and custom scripting to build ETL pipelines.
- Prior experience implementing and developing Bash scripts.
- Excellent Python programming skills in a Linux environment.
- Understanding of application and transport layer security.
- Knowledge of SQL and performance…