Senior Data Engineer; Informatica/Databricks
Listed on 2026-02-28
IT/Tech
Data Engineer, Data Analyst, Data Science Manager
Job Title:
Senior Data Engineer (Informatica/Databricks)
Job Category:
Information Technology
Time Type:
Full time
Minimum Clearance Required to Start:
None
Employee Type:
Regular
Percentage of Travel Required:
Up to 10%
Type of Travel:
Local
CACI is currently looking for a highly skilled and experienced Senior Data Engineer (Informatica/Databricks) with agile methodology experience to join our BEAGLE (Border Enforcement Applications for Government Leading-Edge Information Technology) Agile Solution Factory (ASF) Team supporting our Customs and Border Protection (CBP) client located in Northern Virginia! Join this passionate team of industry-leading individuals supporting the best practices in Agile Software Development for the Department of Homeland Security (DHS).
As a member of the BEAGLE ASF Team, you will support the men and women charged with safeguarding the American people and enhancing the Nation’s safety, security, and prosperity. CBP agents and officers are on the front lines, every day, protecting our national security by combining customs, immigration, border security, and agricultural protection into one coordinated and supportive activity.
ASF programs thrive in a culture of innovation and are constantly seeking individuals who can bring creative ideas to solve complex problems, both technical and procedural, at the team and portfolio levels. The ability to be adaptable and to work constructively with a technically diverse and geographically separated team is crucial. You should have worked with, or have a strong interest in, agile software development practices and delivering deployable software in short sprints.
Responsibilities
- Design, develop, and maintain robust and scalable data warehouse architectures and ETL/ELT data pipelines using Databricks or a similar cloud-based platform (see the brief sketch following this list).
- Optimize and troubleshoot data pipelines and warehouse performance to ensure efficient and reliable data processing.
- Work with database developers and administrators across multiple product teams.
- Serve as a data and technology expert across a broad and diverse set of mission critical applications.
- Modernize the data warehouse environment by migrating the platform to Databricks.
- Evaluate existing data sets and reporting architectures to identify strategic gaps and apply modern technologies to creatively achieve superior mission outcomes.
- Analyze project-related problems and create innovative solutions involving technology, analytic methodologies, and advanced solution components.
- Create or augment business and operational intelligence tools using languages such as SQL, Spark, and Python to detect trends, patterns, and non-obvious relationships in large, complex, and disparate data sets.
- Actively participate in Agile Scrum sprint planning, artifact creation, sprint testing, regression testing, demonstrations, retrospectives and solution releases.
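For illustration only, and not part of the position's requirements: a minimal sketch of the kind of batch ETL pipeline work described above, written in PySpark as it might run in a Databricks-style environment. Every path, column, and table name below is a hypothetical assumption, not a detail taken from this posting.

    # Minimal, hypothetical PySpark batch ETL sketch: read raw records,
    # apply light cleanup, and write a partitioned Delta table.
    # Paths and column names are illustrative assumptions only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    raw = (spark.read
           .option("header", "true")
           .csv("/mnt/raw/events/"))            # hypothetical landing path

    cleaned = (raw
               .withColumn("event_date", F.to_date("event_ts"))
               .dropDuplicates(["record_id"])   # basic de-duplication
               .filter(F.col("record_id").isNotNull()))

    (cleaned.write
            .format("delta")                    # Delta Lake is typical on Databricks
            .mode("overwrite")
            .partitionBy("event_date")
            .save("/mnt/curated/events/"))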
Qualifications
- Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include but are not limited to:
- 3-year check for felony convictions
- 1-year check for illegal drug use
- 1-year check for misconduct such as theft or fraud
- 7+ years of professional experience working on complex data challenges in the areas of data architecture and engineering
- Proven 7+ years of experience automating ELT data pipelines using Informatica.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and services related to data storage and processing (e.g., S3, ADLS).
- Experience building and optimizing data pipelines for batch and/or streaming data.
- 3-5 years of Databricks experience. Alternative/equivalent technologies: significant experience with Snowflake, Google BigQuery, or Microsoft Azure Synapse Analytics will also be considered.
- Experience with Databricks, including extensive hands-on experience with PySpark, Python, SQL, Kafka, and Databricks notebooks.
- Strong experience with data modeling techniques (e.g., dimensional modeling, data vault) and database design.
- Database skills covering AWS RDS concepts and an understanding of database principles…