Senior Data Engineer / Remote (Security Clearance)
Remote / Work from Home position; candidates ideally in Reston, Fairfax County, Virginia, 20190, USA
Listed on 2026-01-08
Listing for: ICF
Job specializations:
* Software Development: Data Engineer, Software Engineer
Job Description & How to Apply Below
Description

The company: ICF is a mission-driven company filled with people who care deeply about improving the lives of others and making the world a better place. Our core values include Embracing Difference; we seek candidates who are passionate about building a culture that encourages, embraces, and hires dimensions of difference.

The team: Our Health Engineering Systems (HES) team works side by side with customers to articulate a vision for success, and then make it happen. We know success doesn't happen by accident. It takes the right team of people, working together on the right solutions for the customer. We are looking for a seasoned Senior Data Engineer who will be a key driver to make this happen.
Responsibilities:
* Design, develop, and maintain scalable data pipelines using Spark, Hive, and Airflow
* Develop and deploy data processing workflows on the Databricks platform
* Develop API services to facilitate data access and integration
* Create interactive data visualizations and reports using Amazon QuickSight
* Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies
* Monitor and optimize the performance of data infrastructure and processes
* Develop data quality and validation jobs
* Assemble large, complex data sets that meet functional and non-functional business requirements
* Write unit and integration tests for all data processing code
* Work with DevOps engineers on CI/CD and infrastructure as code (IaC)
* Read specs and translate them into code and design documents
* Perform code reviews and develop processes for improving code quality
* Improve data availability and timeliness by implementing more frequent refreshes, tiered data storage, and optimizations of existing datasets
* Maintain security and privacy for data at rest and in transit
* Other duties as assigned
Minimum Qualifications:
* Bachelor's degree in computer science, engineering, or a related field
* 7+ years of hands-on software or data development experience
* 4+ years of data pipeline experience using Python, PySpark and cloud technologies
* 2+ years working with Spark and Hive or similar large-scale data environments
* Candidate must be able to obtain and maintain a Public Trust clearance
* Candidate must reside in the US, be authorized to work in the US, and work must be performed in the US
* Must have lived in the US 3 full years out of the last 5 years
* Up to quarterly domestic (US) travel is required
Preferred Qualifications:
* U.S. Citizenship or Green Card is highly prioritized due to federal contract requirements
* Experience building job workflows with the Databricks platform (Strongly Preferred)
* Strong understanding of AWS products including S3, Redshift, RDS, EMR, AWS Glue, AWS Glue DataBrew, Jupyter Notebooks, Athena, QuickSight, and Amazon SNS
* Familiar with building processes that support data transformation, workload management, data structures, dependency management, and metadata
* Experienced in data governance processes to ingest (batch, stream), curate, and share data with upstream and downstream data users
* An experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up
* Demonstrated understanding of software and tools including relational SQL and NoSQL databases such as Postgres and Cassandra; workflow management and pipeline tools such as Airflow, Luigi, and Azkaban; stream-processing systems such as Spark Streaming and Storm; and object-oriented/functional scripting languages including Scala, C++, Java, and Python
* Familiar with DevOps methodologies, including CI/CD pipelines (GitHub Actions) and IaC (Terraform)
* Experience with Agile methodology, using test-driven development.
Job Location:
This position requires that the job be performed in the United States. If you accept this position, note that ICF monitors employee work locations, blocks access from foreign locations and foreign IP addresses, and prohibits personal VPN connections.

Working at ICF
ICF is a global advisory and technology services provider, but we're not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future. We can only solve the world's toughest challenges by building a workplace that allows everyone to thrive. We are an equal opportunity employer. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals.
For more information, please read our EEO policy. We will consider for employment qualified applicants with arrest and conviction records. Reasonable Accommodations are available, including, but not limited to, for disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment…
Position Requirements:
10+ years work experience