Principal Data Architect

Job in Baltimore, Maryland, 21276, USA
Listing for: Metric5
Full Time position
Listed on 2026-01-09
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description

Metric5 provides a range of technical services to our clients, including data management, data architecture, data quality, and other data services, enabling our customers to maximize the value of their data and make better data‑informed decisions.

We need a client‑facing technical leader who can also assist in the development of Metric5 products and services that span our customer base. Our customers are users of Databricks and other COTS products, as well as a number of open‑source data tools and environments.

The Principal Data Architect is a high‑impact, senior‑level role responsible for defining and driving the technical vision, strategy, and roadmap for our modern data platform that supports Federal Government customers, including DHS/USCIS, USPTO, and Treasury.

This role requires a blend of deep architectural expertise, strategic oversight, and hands‑on technical leadership, along with exceptional coordination, communication, and collaboration with internal teams and customer stakeholders.

The consultant will serve as the technical authority and a key liaison, ensuring that platform development aligns precisely with customer priorities and mission objectives.

Key Responsibilities

Strategic & Visionary Leadership
  • Architectural Roadmap:
    Define and articulate a multi‑year technical vision and roadmap for the data lake, encompassing data ingestion, processing, storage, governance, and consumption layers.
  • Customer Service & Priority Alignment:
    Act as the primary technical interface for customer stakeholders, ensuring all architectural decisions and development activities directly address and prioritize the customer's mission‑critical needs and objectives.
  • Innovation & CoE Leadership:
    Serve as a technical leader in driving innovation across the data platform and actively implement initiatives defined by the existing Center of Excellence (CoE) and Innovation teams.
Technical Oversight, Coordination, & Collaboration
  • Team Collaboration & Mentorship:
    Collaborate closely with cross‑functional development, data science, and engineering teams (DevSecOps, AWS, Databricks) to translate architecture into actionable implementation plans, providing guidance and mentorship to ensure technical excellence and adherence to standards.
  • Technical Coordination:
    Coordinate and synchronize technical activities across multiple work streams (data ingestion, graph services, and data consumption) to manage interdependencies, mitigate risks, and ensure the timely delivery of integrated solutions.
  • Solution Design (Databricks Focus):
    Lead end‑to‑end technical design and architecture reviews, focusing on optimal use of Databricks E2/Delta Lake features (a brief illustrative sketch follows this list).
  • Interface Management:
    Architect robust and secure interfaces for both internal and external data exchange, managing technical discussions with various data source owners (internal/external to USCIS).
  • Code Review & Quality:
    Lead code and design reviews to uphold high‑quality standards, ensuring scalability, security, and maintainability across the Python, Scala, Apache Groovy, and PL/SQL code base.
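
To give a concrete flavor of the Delta Lake solution design work referenced above, here is a minimal PySpark sketch, assuming a Databricks or Spark environment with the delta-spark package available. The S3 paths, table, and column names (e.g. case_id) are illustrative placeholders, not details from this posting:

```python
# Minimal PySpark/Delta Lake sketch: land raw records in a Delta table,
# then apply an incremental MERGE (upsert). Paths and columns are illustrative.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Hypothetical raw landing zone (schema inferred for brevity)
raw = spark.read.json("s3://example-bucket/landing/cases/")

target_path = "s3://example-bucket/delta/cases"
if not DeltaTable.isDeltaTable(spark, target_path):
    # First load: write the initial Delta table
    raw.write.format("delta").mode("overwrite").save(target_path)
else:
    # Subsequent loads: upsert new or changed records by a business key
    target = DeltaTable.forPath(spark, target_path)
    (target.alias("t")
           .merge(raw.alias("s"), "t.case_id = s.case_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())
```

The MERGE pattern shown here is one common way incremental ingestion is handled on Delta Lake; the actual ingestion design would depend on the customer's data sources and governance requirements.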
Required Technical Experience

Data Platform & COTS Expertise (Growth Areas)
  • Unified Data Analytics Mastery:
    Expert‑level proficiency and architectural experience with Databricks E2, Delta Lake, and MLflow (see the MLflow sketch after this list).
  • Graph Data Services & Identity Resolution:
    Mandatory, deep, and proven experience in architecting solutions centered on Graph Data and Entity Resolution, specifically leveraging Senzing (Senzing Runtime API v3.10.2).
  • Social/Unstructured Data:
    Familiarity with integrating and utilizing platforms like Sprinklr and CopyStorm into the Delta Lake architecture.
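
As a small illustration of the MLflow experience called out in the Unified Data Analytics Mastery item, here is a hedged experiment-tracking sketch; the experiment path, parameter, and metric values are placeholders, and on Databricks the tracking URI is typically preconfigured:

```python
# Hedged MLflow tracking sketch: log a parameter and a metric for one run.
# Experiment name and values are illustrative placeholders.
import mlflow

mlflow.set_experiment("/Shared/example-experiment")  # placeholder path

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_metric("f1_score", 0.87)
```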
Cloud Architecture (AWS Mastery)
  • Core Services:
    Expert‑level knowledge of AWS foundational services, including S3, EC2, RDS (PostgreSQL), ELB, DMS, and EBS.
  • Data Warehousing/Querying:
    Extensive experience designing and optimizing solutions using AWS Redshift, AWS Athena, and AWS Glue (an illustrative Athena sketch follows this list).
  • Infrastructure as Code (IaC):
    Proficiency in defining and managing infrastructure using AWS CloudFormation templates (CFT).
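
As a rough sketch of the Athena querying referenced in the Data Warehousing/Querying item, here is a hedged boto3 example; the database, query, region, and output bucket are illustrative assumptions, not details from this posting:

```python
# Hedged boto3 sketch: submit an Athena query and poll for completion.
# Database, table, query, and output bucket are illustrative placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM cases GROUP BY status",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query finishes (simplified; production code would add timeouts)
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows)} result rows")
```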
Development & DevOps Environment
  • Languages:
    Expert‑level proficiency in at least two of the primary data lake languages:
    Python, Scala, or…