
Data Engineering Lead

Job in Suitland, Prince George's County, Maryland, 20746, USA
Listing for: Ignite IT, LLC
Full Time position
Listed on 2026-03-02
Job specializations:
  • IT/Tech
    Data Engineer
  • Engineering
    Data Engineer
Salary/Wage Range or Industry Benchmark: USD 80,000 – 100,000 per year
Job Description & How to Apply Below

Position Overview

The Data Engineering Lead is responsible for designing and implementing modern, scalable data architectures to support migration of legacy, file-based analytical systems to AWS Cloud Native environments.

This role leads the transformation of legacy SAS-based data storage models—including flat files, batch outputs, and subsystem‑specific data artifacts—into structured, governed, and scalable data models optimized for cloud‑native processing.

The Data Engineering Lead will ensure data integrity, performance, and visibility across a system‑of‑systems modernization initiative, while providing technical leadership for data modeling, ingestion patterns, validation frameworks, and transparency reporting.

Expert‑level proficiency in Python and strong experience designing AWS‑based data architectures are required.

Key Responsibilities

Legacy Data Discovery & Data Model Transformation
  • Participate in structured system inventory efforts to document:
    • Legacy file‑based storage structures
    • SAS dataset dependencies
    • Subsystem data flows
    • Manual gating and handoff processes
  • Analyze legacy storage models and design target‑state data models aligned to AWS Cloud Native architecture.
  • Replace file‑driven batch dependencies with:
    • API‑based ingestion
    • Event‑driven workflows
    • Database‑backed storage (e.g., Aurora/Postgres)
  • Define canonical data schemas and transformation standards.
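As a rough illustration of the responsibilities above, the sketch below shows one way a file‑driven batch handoff might be replaced by an event‑triggered, schema‑driven transform. The fixed‑width layout, field names, and handler shape are all hypothetical; in production the entry point would be an S3‑triggered Lambda reading the object via boto3 and writing rows to Aurora/Postgres, with the I/O stubbed out here so the transform stays testable.

```python
# Hypothetical fixed-width layout of a legacy SAS flat-file record
# (column names and offsets are illustrative, not from the posting).
LEGACY_LAYOUT = [("id", 0, 8), ("region", 8, 10), ("amount", 10, 22)]

def parse_legacy_record(line: str) -> dict:
    """Map one fixed-width legacy record onto the canonical schema."""
    rec = {name: line[start:end].strip() for name, start, end in LEGACY_LAYOUT}
    rec["amount"] = float(rec["amount"])  # typed field in the target model
    return rec

def handler(event, context=None):
    """Event-driven entry point (Lambda-style signature, assumed shape).

    Here the event carries the file body directly; a real handler would
    fetch the S3 object named in the event and persist parsed rows to a
    database rather than returning them.
    """
    lines = event["body"].splitlines()
    return [parse_legacy_record(ln) for ln in lines if ln.strip()]
```

The point of the split is that `parse_legacy_record` carries the canonical-schema logic and can be unit-tested without any AWS dependency, while the handler stays a thin event adapter.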
Cloud‑Native Data Architecture Design
  • Architect scalable AWS data pipelines using services such as:
    • S3
    • Glue
    • Lambda
    • EventBridge
    • SNS/SQS
    • Aurora/Postgres
    • Batch
    • Athena
  • Design data ingestion, staging, transformation, and validation workflows.
  • Establish schema management, versioning, and data lineage practices.
  • Optimize data storage for performance, scalability, and cost efficiency.
  • Support serverless and containerized data processing architectures.
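The schema-management and versioning practices listed above could be sketched in Python as a versioned canonical schema plus a row validator. Everything here is an assumption for illustration — the field names, the version number, and the dict-based schema representation are not from the posting.

```python
# Assumed canonical schema: a versioned mapping of field name -> expected type.
CANONICAL_SCHEMA = {
    "version": 2,
    "fields": {"id": str, "region": str, "amount": float},
}

def validate_row(row: dict, schema: dict = CANONICAL_SCHEMA) -> list:
    """Return a list of violations; an empty list means the row conforms."""
    errors = []
    for name, typ in schema["fields"].items():
        if name not in row:
            errors.append(f"missing field: {name}")
        elif not isinstance(row[name], typ):
            errors.append(f"{name}: expected {typ.__name__}")
    return errors
```

Keeping the schema as versioned data rather than hard-coded checks is one way to support the lineage and versioning requirement: a pipeline can record which schema version each batch was validated against.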
Expert Python‑Based Data Engineering
  • Develop advanced Python‑based data transformation and validation pipelines.
  • Implement modular, reusable data processing components.
  • Optimize large‑scale data manipulation for distributed execution.
  • Develop high‑performance ETL/ELT frameworks.
  • Embed automated validation checks directly into data pipelines.
Expert‑level Python proficiency is required for:
  • High‑volume data processing
  • Data validation logic
  • Modular data engineering frameworks
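One minimal sketch of the "modular, reusable data processing components" idea: row-level transforms composed into a pipeline via function composition. The step functions and field names are invented for illustration.

```python
from functools import reduce
from typing import Callable, Iterable, List

Step = Callable[[dict], dict]

def pipeline(*steps: Step) -> Callable[[Iterable[dict]], List[dict]]:
    """Compose row-level transforms into a single reusable pipeline."""
    def run(rows: Iterable[dict]) -> List[dict]:
        return [reduce(lambda r, step: step(r), steps, row) for row in rows]
    return run

# Illustrative steps (hypothetical fields):
def normalize_region(row: dict) -> dict:
    return {**row, "region": row["region"].upper()}

def to_cents(row: dict) -> dict:
    return {**row, "amount_cents": round(row["amount"] * 100)}
```

Because each step is a plain `dict -> dict` function, steps can be unit-tested in isolation and recombined per subsystem — one design choice for keeping a large migration's transforms modular.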
Data Accuracy, Validation & Visibility
  • Design and implement automated data validation frameworks to ensure:
    • Functional equivalence during migration
    • Record‑level and aggregate‑level consistency
    • Downstream compatibility across subsystems
  • Develop dashboards and reporting mechanisms providing:
    • Data accuracy metrics
    • Pipeline health indicators
    • Variance detection summaries
  • Enable transparency into data transformation impacts across modernization phases.
  • Support regression validation through golden datasets and automated comparisons.
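A golden-dataset comparison of the kind described above might, at its simplest, look like the sketch below: record-level diffs keyed on an identifier plus an aggregate row-count check. The key name and report shape are assumptions.

```python
from typing import List

def compare_to_golden(actual: List[dict], golden: List[dict], key: str = "id") -> dict:
    """Compare a migrated dataset to its golden reference.

    Returns record-level diffs (missing, unexpected, mismatched keys)
    and an aggregate-level row-count delta.
    """
    a = {r[key]: r for r in actual}
    g = {r[key]: r for r in golden}
    return {
        "missing": sorted(set(g) - set(a)),        # in golden, absent from actual
        "unexpected": sorted(set(a) - set(g)),     # in actual, absent from golden
        "mismatched": sorted(k for k in set(a) & set(g) if a[k] != g[k]),
        "row_count_delta": len(actual) - len(golden),
    }
```

An empty report (all lists empty, delta zero) is the functional-equivalence signal; non-empty fields feed directly into the variance-detection summaries mentioned above.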
System‑of‑Systems Data Coordination
  • Coordinate with Senior Developers and Requirements Engineers to align data models with application modernization.
  • Ensure upstream/downstream data contract stability.
  • Prevent data thrashing during phased migration.
  • Support orchestration of gated workflows through automated triggers rather than manual file exchanges.
  • Collaborate across work streams to establish shared data standards.
DevSecOps & Governance Alignment
  • Integrate data pipelines into CI/CD frameworks.
  • Support infrastructure‑as‑code alignment (Terraform/CloudFormation collaboration).
  • Ensure compliance with security controls (IAM, encryption, key management).
  • Produce documentation supporting:
    • Architecture review boards
    • Interface control documents
    • Data flow diagrams
  • Support ATO‑related data validation evidence.
Required Qualifications
  • 8+ years of experience in data engineering or data architecture.
  • Expert-level proficiency in Python for data engineering.
  • Demonstrated experience transforming legacy file-based systems into cloud-native data architectures.
  • Experience developing data models for high-volume, data-intensive applications.
  • Deep experience with AWS data services (Glue, Lambda, S3, Aurora/Postgres, EventBridge, etc.).
  • Experience designing scalable ETL/ELT pipelines.
  • Experience building analytical dashboards (e.g., QuickSight or equivalent).
  • Experience implementing…