Data Engineer; Data Scientist
Listed on 2026-03-01
IT/Tech
Cybersecurity, Data Engineer, Data Security, Data Analyst
Location: Fairfax, Virginia, United States
Requisition Number: 27366
Required Travel: 0 - 10%
Employment Type: Full Time/Salaried/Exempt
Anticipated Salary Range: $ - $
Security Clearance: TS/SCI
Level of Experience: Mid
This opportunity resides with Warfare Systems (WS), a business group within HII’s Mission Technologies division. Warfare Systems comprises cyber and mission IT; electronic warfare; and C5ISR systems.
HII works within our nation’s intelligence and cyber operations communities to defend our interests in cyberspace and anticipate emerging threats. Our capabilities in cybersecurity, network architecture, reverse engineering, software and hardware development uniquely enable us to support sensitive missions for the U.S. military and federal agency partners.
Meet HII’s Mission Technologies Division
Our team of more than 7,000 professionals worldwide delivers all-domain expertise and advanced technologies in service of mission partners across the globe. Mission Technologies is leading the next evolution of national defense – the data evolution – by accelerating a breadth of national security solutions for government and commercial customers. Our capabilities range from C5ISR, AI and Big Data, cyber operations and synthetic training environments to fleet sustainment, environmental remediation and the largest family of unmanned underwater vehicles in every class. Find the role that’s right for you. Apply today. We look forward to meeting you.
HII - Mission Technologies is currently seeking an Intermediate-level Data Engineer to work out of Fairfax, VA, in support of the DoD/DoW Advana War Data Platform, a data integration and analytics environment designed to aggregate operational, intelligence, logistics, and sensor data from multiple domains; enable Joint All-Domain Command and Control (JADC2) by providing a common data fabric; and support AI/ML applications for predictive analytics, targeting, and mission planning.
This position is contingent on contract award.
Responsibilities:
- Builds and maintains War Data Platform (WDP) Core Integration ingestion, transformation, and storage pipelines supporting operational analytics across Unclassified (NIPR), Secret (SIPR), and Top Secret (JWICS) enclaves.
- Develops automated processing flows using Apache Airflow, Spark, AWS Glue, and Kafka to support reliable data movement into medallion-architecture zones.
- Implements data connections, schema mappings, and structured interfaces to support enterprise data marketplaces and downstream tooling teams.
- Applies data quality controls using rule engines, profiling utilities, and anomaly-detection checks to validate accuracy, completeness, and consistency at the source and lake layers.
- Configures encryption, access policies, and data-handling rules to meet DoD cybersecurity requirements for mission environments.
- Executes Tier-2 and Tier-3 support actions, performing incident diagnosis, SLA tracking, and pipeline restoration tasks.
- Builds utilities for standardized inbound and outbound data acquisition, supporting analytics teams, application developers, and operational mission users.
- Documents data pipelines, workflows, storage models, and dependency chains in repositories such as Confluence, GitLab, and SharePoint.
- Produces operational metrics capturing throughput, failure rates, restart patterns, and quality scores to guide engineering improvements.
- Collaborates with platform, infrastructure, and data governance teams to incorporate architectural updates, security changes, and tool enhancements into ongoing operations, strengthening stability and mission value across all War Data Platform (WDP) Core Integration enclaves.
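To illustrate the kind of rule-based data quality control described in the responsibilities above (validating accuracy and completeness at the source and lake layers), here is a minimal, hypothetical Python sketch. The field names and rules are illustrative assumptions only and are not part of the actual War Data Platform tooling:

```python
# Hypothetical sketch of rule-based data quality profiling.
# Record fields ("source_id", "ts") and rules are illustrative
# assumptions, not actual WDP schemas or controls.

def profile_records(records, rules):
    """Apply named validation rules to each record and return a
    per-rule report of failure counts and pass rates."""
    failures = {name: 0 for name in rules}
    for rec in records:
        for name, rule in rules.items():
            if not rule(rec):
                failures[name] += 1
    total = len(records)
    return {
        name: {
            "failed": count,
            "pass_rate": (total - count) / total if total else 1.0,
        }
        for name, count in failures.items()
    }

# Illustrative completeness rules: required identifier present,
# timestamp non-empty.
rules = {
    "has_source_id": lambda r: r.get("source_id") is not None,
    "has_timestamp": lambda r: bool(r.get("ts")),
}

records = [
    {"source_id": "sensor-1", "ts": "2026-03-01T00:00:00Z"},
    {"source_id": None, "ts": "2026-03-01T00:05:00Z"},
]
report = profile_records(records, rules)
```

In practice such checks would run inside the orchestration layer (e.g. as tasks in an Airflow DAG or Spark job), with failing records quarantined before promotion to the next medallion zone.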
Requirements:
- 5 years relevant experience with a Bachelor's degree in a related field; 3 years relevant experience with a Master's degree in a related field; 0 years experience with a PhD or Juris Doctorate in a related field; or a High School Diploma or equivalent and 9 years relevant experience.
- Familiarity with automated processing flow tools such as Apache Airflow, Spark, AWS Glue, and Kafka, and with documenting data…