
Data Integration Engineer

Remote / Online - Candidates ideally in Virginia, St. Louis County, Minnesota 55792, USA
Listing for: Leidos
Full Time, Remote/Work from Home position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Data Analyst, Data Security
Salary/Wage Range or Industry Benchmark: $107,900 to $195,050 USD per year
Job Description & How to Apply Below

Description

The Leidos Digital Modernization Sector is seeking a Data Integration Engineer. This position allows for full-time telework from any U.S.-based location.

Position Summary:

We are seeking a highly motivated Data Integration Engineer to support the design, integration, and operationalization of enterprise data products and repositories across modern cloud data platforms.
This role will expand traditional physical data modeling, application database design, and ETL capabilities into logical, semantic, and AI-ready data architectures.
The ideal candidate will work closely with data owners and CDAO technical staff to design scalable data products, data repositories, and integration patterns that support analytics, governance, and AI initiatives. This role will help modernize data structures within our cloud-native platforms and enable reusable, well-modeled, and secure access to enterprise data.

Primary Responsibilities:
  • Design and implement data integration solutions across enterprise systems and cloud data platforms (e.g., Oracle, Snowflake, AWS, Azure).
  • Extend existing physical data models into logical and semantic data models that support analytics and AI use cases.
  • Partner with data owners and CDAO technical staff to define, design, and refine enterprise data products, including domains, schemas, interfaces, SLAs, and consumption patterns.
  • Collaborate with the team to translate enterprise architecture standards and data governance guidelines into implementable models (logical, physical, domain) and integration patterns.
  • Work closely with Data Engineers to ensure pipelines are aligned to target logical, physical, domain, and semantic models.
  • Develop, implement, and maintain dimensional, relational, and domain-driven data product models and databases using the IDERA/Embarcadero suite of products. Ensure assets are optimized for performance, scalability, and AI-readiness within scalable cloud-native data platforms.
  • Collaborate with data owners and CDAO technical staff to develop and maintain Leidos data protection and data privacy policies governing data use.
  • Collaborate with CDAO technical staff to develop and maintain the IDERA/Embarcadero repository and portal data objects.
  • Support metadata registration and governance alignment within Collibra.
  • Implement data integration patterns including batch, streaming, API-based, and event-driven architectures.
  • Participate in data quality and validation processes to ensure trusted, production-ready data products.
  • Contribute to documentation, standards, and modeling best practices.
Basic Qualifications:
  • Bachelor’s degree in Computer Science, Information Systems, or related field and 8+ years of relevant experience.
  • Strong experience with data modeling (conceptual, logical, semantic, and physical) using IDERA/Embarcadero or similar products.
  • Hands‑on experience assembling and implementing DDL for tables, views, SQL frameworks, and security policies within relational database systems.
  • Understanding of data replication products, preferably Oracle GoldenGate.
  • Hands‑on experience with cloud data platforms such as Snowflake, AWS, Azure, or GCP.
  • Hands‑on experience working with ETL/ELT/API developers on the design and implementation of data integration pipelines; preferably experience with Informatica.
  • Proficiency in SQL and understanding of performance optimization techniques.
  • Experience working with Data Architects and cross‑functional technical teams.
  • Strong analytical and problem‑solving skills.
  • US Citizenship is required.
Preferred Qualifications:
  • Experience designing semantic data models to support AI, ML, and advanced analytics use cases.
  • Experience contributing to data product design within a modern data platform architecture.
  • Familiarity with medallion architecture, data mesh, or domain‑oriented data product strategies.
  • Experience working with Snowflake‑native capabilities (e.g., streams, tasks, Snowpark, dbt).
  • Familiarity with metadata management and governance tools such as Collibra.
  • Exposure to RAG pipelines or AI‑driven data consumption patterns.
  • Experience working in regulated or government environments.
  • Knowledge of Python or Spark for…