
Data Engineer - U.S. Citizenship Required

Job in Washington, District of Columbia, 20022, USA
Listing for: ZenPoint Solutions LLC
Full Time position
Listed on 2026-01-14
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Analyst
Job Description & How to Apply Below
Position: Data Engineer - U.S. Citizenship Required

Description

ZenPoint Solutions LLC (“ZenPoint Solutions”) is a rapidly expanding Information Technology (IT) services company in the federal sector. We foster a thriving, ambitious work environment that prioritizes employee well-being and a positive company culture. We invite you to join our team and help us shape a dynamic future as we deliver innovative solutions to address the nation's most critical IT missions.

Clearance Requirement:

Candidates must hold and maintain an active Top Secret clearance

Work Location:

Washington, D.C.

Work Schedule (Onsite):

Five (5) days a week, onsite at the customer’s facility

Position Overview:

ZenPoint Solutions is seeking a skilled Data Engineer to support the maintenance, optimization, and effective use of the Private Sector Portal (PSP) across the enterprise. Responsibilities include managing and auditing PSP data, supporting HQ and field users, and coordinating technical requirements with relationship management teams and contractors. The position designs, implements, and automates enterprise-level ETL and data analysis pipelines, aggregating data from multiple sources to ensure timely, accurate delivery.

The role also focuses on improving system efficiency through automation and emerging technologies, while developing clear technical documentation and providing reliable customer support. Strong independence, collaboration, and a results-driven mindset are essential for success in this position.

Requirements
  • Support the maintenance, optimization, and effective use of the Private Sector Portal (PSP) across the enterprise
  • Manage, audit, and ensure the quality and consistency of PSP data, including identifying missing or inappropriate metadata
  • Provide customer service and subject-matter support to HQ and field users, addressing system questions and troubleshooting issues
  • Coordinate with relationship management teams and contractors to identify, document, and resolve technical requirements related to the PSP system
  • Design, implement, automate, and maintain enterprise-level ETL and data analysis pipelines that aggregate data from multiple sources
  • Ensure the scheduling, processing, documentation, and implementation of data deliveries in a timely and accurate manner
  • Identify and recommend automation, emerging technologies, and alternative solutions to improve system efficiency and support developers and application stakeholders
  • Develop and maintain technical documentation, including system guidance, process documentation, and software design references
  • Work independently while collaborating effectively with team members to deliver results in a fast-paced environment
Required Qualifications:
  • Candidate must be a United States citizen and present proof of citizenship if selected
  • Bachelor’s degree in a quantitative field (e.g., Statistics, Operations Research) or a technical field (e.g., Computer Science, Engineering)
  • Minimum of 2–4 years of professional experience in a data analytics, data engineering, or related role
  • Exceptional proficiency in written and verbal communication, including at least four (4) years of experience writing and editing using Microsoft Office
  • High proficiency in Python and advanced SQL for data processing and analysis
  • Experience designing and building modern medallion-style ETL processes for data lakes, data warehouses, or lakehouse architectures
  • Experience designing, implementing, and maintaining data processing and analysis pipelines using Python, Spark, or similar technologies
  • Experience building and automating data pipelines and workflows using tools such as Apache NiFi or Databricks
  • Experience implementing automated data CI/CD pipelines and dataframe-based processing
  • Proficiency with data warehousing concepts and architectures
  • Experience with cloud platforms such as AWS or Google Cloud, including services like BigQuery or Redshift
  • Experience with business intelligence and data visualization tools (e.g., Looker, Tableau, Microsoft Power BI)
  • Experience with web technologies including HTML, CSS, JavaScript, and modern frameworks such as React or Bootstrap/Reactstrap
  • Strong critical thinking,…