Data Engineer III
Job in
Columbia, Howard County, Maryland, 21046, USA
Listed on 2026-01-12
Listing for:
eSimplicity
Full Time
Job specializations:
- IT/Tech: Data Engineer, Data Analyst
Job Description
eSimplicity is a modern digital services company that partners with government agencies to improve the lives and protect the well‑being of all Americans, from veterans and service members to children, families, and seniors. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip federal agencies with solutions to courageously transform today for a better tomorrow.
Responsibilities
- Develop, expand, and optimize our data and data pipeline architecture, and optimize data flow and collection for cross-functional teams.
- Support software developers, database architects, data analysts, and data scientists on data initiatives, and ensure that an optimal data delivery architecture is consistent across ongoing projects.
- Create new pipelines and maintain existing ones, update Extract, Transform, Load (ETL) processes, add new ETL features, and build proofs of concept (PoCs) with Redshift Spectrum, Databricks, AWS EMR, SageMaker, etc.
- Implement, with support from project data specialists, large-dataset engineering: data augmentation, data quality analysis, data analytics (anomalies and trends), data profiling, data algorithms, and data maturity models (measurement and development); develop data strategy recommendations.
- Operate large-scale data processing pipelines and resolve business and technical issues pertaining to processing and data quality.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing data infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies.
- Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
- Work with stakeholders, including data, design, product, and government stakeholders, and assist them with data-related technical issues.
- Write unit and integration tests for all data processing code.
- Work with DevOps engineers on continuous integration (CI), continuous delivery (CD), and infrastructure as code (IaC).
- Read specifications and translate them into code and design documents.
- Perform code reviews and develop processes for improving code quality.
- Perform other duties as assigned.
Required Qualifications
- All candidates must pass a public trust clearance through the U.S. Federal Government. This requires candidates either to be U.S. citizens or to pass clearance through the Foreign National Government System, which requires that candidates have lived in the United States for at least 3 of the previous 5 years, hold a valid, non-expired passport from their country of birth, and have appropriate visa/work-permit documentation.
- Minimum of 8 years of previous Data Engineer or hands-on software development experience, with at least 4 of those years using Python, Java, and cloud technologies for data pipelining.
- A Bachelor’s degree in Computer Science, Information Systems, Engineering, Business, or other related scientific or technical discipline. With ten years of general information technology experience and at least eight years of specialized experience, a degree is NOT required.
- Expert data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
- Self-sufficient and comfortable supporting the data needs of multiple teams, systems, and products.
- Experienced in designing data architecture for shared services, scalability, and performance.
- Experienced in designing data services, including APIs, metadata, and data catalogs.
- Experienced in data governance processes to ingest (batch, stream), curate, and share data with upstream and downstream data users.
- Ability to build and optimize data sets, "big data" pipelines, and architectures.
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions.
- Excellent analytical skills…