Data Engineer REMOTE
King of Prussia, Montgomery County, Pennsylvania, 19406, USA
Listed on 2026-01-19
Software Development
Data Engineer, AI Engineer
Basic Qualifications
- 6+ years professional data‑engineering experience; ≥3 years designing and operating AWS‑native data lakes for mission‑critical or high‑volume workloads.
- B.S. Computer Science, Data Science, Information Systems, or related field (M.S. preferred).
- Extensive knowledge of AWS; AWS Certified Data Analytics – Specialty and Solutions Architect certifications preferred.
- Proven experience in SAFe/Agile environments, working with cross‑functional teams (AI/ML, UX, Systems Engineering).
E9814: Data Engineer Stf
We are committed to work‑life balance by promoting this REMOTE telework option. This arrangement allows the employee to work their entire schedule somewhere other than a Lockheed Martin designated office or job site.
What We’re Doing
Do you want to be part of a culture that inspires employees to think big, innovate, perform with excellence, and build incredible products? If you have the passion, drive, and courage to dream big, then we want to build a better tomorrow with you. Come and join our team!
Lockheed Martin’s Rotary and Mission Systems’ C4ISR team is looking for a proven and experienced Data Engineer to support a one‑of‑a‑kind Cross Business Area campaign that exemplifies our commitment to our OneLM Strategy.
The Work
Overview – We are delivering a production‑grade, AWS‑based Data Lake, Digital Framework, and seamless data integration for a non‑DoD customer. The Data Engineer will design, build, and operate the data pipelines that ingest, transform, catalog, and serve Lockheed Martin and third‑party data to AI/ML, Unified HMI, and command‑and‑control (C2) applications. This role works closely with the AWS Infrastructure Architect, AI/ML Engineers, Software Factory, MBSE team (Cameo → DOORS NEXT), and the Advisory Board to ensure data quality, security, and performance across the end‑to‑end mission workflow (Detection → Prediction → Response → Recovery).
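As a rough illustration of the ingest side of that workflow (not the program’s actual topology), the sketch below publishes a hypothetical detection event to a Kinesis stream with Boto3; the stream name, event fields, and partition‑key choice are all assumptions made for illustration.

```python
import json
import boto3

# Hypothetical stream name -- the real ingestion topology is program-specific.
STREAM_NAME = "mission-detection-events"

kinesis = boto3.client("kinesis")

def publish_detection(event: dict) -> str:
    """Push one detection event into the raw ingestion stream.

    Partitioning by sensor_id keeps each sensor's events ordered
    within a shard; this is an illustrative choice, not a mandate.
    """
    resp = kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["sensor_id"],
    )
    return resp["SequenceNumber"]

if __name__ == "__main__":
    seq = publish_detection(
        {"sensor_id": "radar-07", "kind": "detection", "ts": "2026-01-19T12:00:00Z"}
    )
    print(f"accepted with sequence number {seq}")
```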
Data Lake Architecture – Design the AWS Data Lake to store raw, curated, and analytics‑ready data. Define data‑zone concepts (raw, landing, trusted, analytics) and enforce lifecycle policies (retention, archival, deletion).
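A minimal sketch of how such lifecycle policies might be enforced with Boto3, assuming hypothetical bucket and zone‑prefix names; real retention windows and storage classes would come from the program’s data‑governance plan.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and zone prefixes for the data zones described above.
BUCKET = "example-mission-data-lake"

s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                # Raw zone: archive to Glacier after 90 days, delete after ~7 years.
                "ID": "raw-zone-retention",
                "Filter": {"Prefix": "raw/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 2555},
            },
            {
                # Landing zone: transient staging area, purged after 30 days.
                "ID": "landing-zone-cleanup",
                "Filter": {"Prefix": "landing/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            },
        ]
    },
)
```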
Ingestion & Streaming – Build high‑throughput ingestion pipelines and implement schema‑on‑write and schema‑on‑read strategies.
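To make the two schema strategies concrete, here is a hedged PySpark sketch: schema‑on‑write enforces a declared contract as data lands in the trusted zone, while schema‑on‑read defers structure to query time over the raw zone. All paths and field names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, StringType, TimestampType, DoubleType,
)

spark = SparkSession.builder.appName("schema-strategies-sketch").getOrCreate()

# Schema-on-write: enforce a declared contract before data lands in the trusted zone.
detection_schema = StructType([
    StructField("sensor_id", StringType(), nullable=False),
    StructField("observed_at", TimestampType(), nullable=False),
    StructField("confidence", DoubleType(), nullable=True),
])

landing = spark.read.schema(detection_schema).json(
    "s3://example-mission-data-lake/landing/detections/"  # placeholder path
)
landing.write.mode("append").parquet(
    "s3://example-mission-data-lake/trusted/detections/"  # placeholder path
)

# Schema-on-read: let Spark sample the raw zone and infer structure at query time,
# which suits exploratory access where no contract has been agreed yet.
exploratory = spark.read.json("s3://example-mission-data-lake/raw/detections/")
exploratory.printSchema()
```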
Other Functions – Data transformation, catalog & metadata management, and security & governance.
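On the catalog side, one plausible Boto3 sketch (hypothetical database, table, and S3 location) registers a curated dataset in the Glue Data Catalog so Athena and downstream AI/ML consumers can discover it:

```python
import boto3

glue = boto3.client("glue")

# Hypothetical database/table names; the real catalog layout is program-defined.
glue.create_table(
    DatabaseName="mission_trusted",
    TableInput={
        "Name": "detections",
        "TableType": "EXTERNAL_TABLE",
        "PartitionKeys": [{"Name": "ingest_date", "Type": "string"}],
        "StorageDescriptor": {
            "Columns": [
                {"Name": "sensor_id", "Type": "string"},
                {"Name": "observed_at", "Type": "timestamp"},
                {"Name": "confidence", "Type": "double"},
            ],
            "Location": "s3://example-mission-data-lake/trusted/detections/",
            "InputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat",
            "OutputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat",
            "SerdeInfo": {
                "SerializationLibrary": "org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"
            },
        },
    },
)
```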
Who We Are
The DSO Team within our Digital Production Environment is building the next‑generation command‑and‑control operator workspace that fuses AI/ML insights, real‑time mission data, and a seamless AWS‑based data‑lake backend. We are seeking a Data Engineer who can turn complex sensor and analytics streams into trusted, mission‑ready data products for a non‑DoD customer. You will work closely with system architects, software engineers, AI/ML specialists, and the Advisory Board to ensure the data platform supports the full Detection → Prediction → Response → Recovery workflow.
Who You Are
- A self‑starter
- An experienced DSO engineer
- An engineer committed to delivering high‑quality, cutting‑edge technology to be used by our customers and allies across the country and around the world
Joining our team offers you the opportunity to support a company and a team where your contributions are valued and you can develop your skills and expertise.
Our team also puts a high value on work‑life balance. Striking a healthy balance between your personal and professional life is crucial to your happiness and success here, which is why we aren't focused on how many hours you spend at work or online.
Instead, we're happy to offer a flexible schedule so you can have a more productive and well‑balanced life both in and outside of work, along with competitive pay, and comprehensive benefits.
Desired Skills
- Deep knowledge of S3, Glue, Lake Formation, Kinesis, MSK, Redshift, Athena, QuickSight, Lambda, Step Functions, IAM, KMS.
- Hands‑on experience with AWS Glue, Spark, dbt, and Airflow (or Amazon Managed Workflows for Apache Airflow).
- Proficient in Python (PySpark, Boto3), Scala, SQL, and Shell/Bash.
- Experience integrating MBSE data from Cameo → DOORS NEXT into data‑lake pipelines.
- Knowledge of AI/ML data…