Developer III (as Lead Developer); design end-to-end architecture; SQL/NoSQL databases
Job in Albany, Albany County, New York, 12201, USA
Listed on 2026-03-01
Listing for:
MVP Consulting Plus
Full Time position
Job specializations:
- IT/Tech
- Data Engineer
Job Description & How to Apply Below
Developer III – PRD#
Duration: 14 months.
Location: Albany, NY – 50% Onsite.
- More than seven (7) years of experience working on complex projects with 2 or more years in a leadership role as a Developer.
- More than seven (7) years of experience in designing end‑to‑end architecture for an enterprise Data Integration hub for centralized data ingestion and transformation for various ingestion patterns (batch and event‑driven).
- More than seven (7) years of proven experience with SQL/NoSQL databases, API development (REST/SOAP), version control (Git), and building data integrations between on‑premises and cloud systems.
- More than seven (7) years of proven experience with standardizing API consumption, error handling, retries, and throttling.
- More than seven (7) years of experience in managing schema evolution and backward compatibility supporting legacy transformations.
- More than seven (7) years of hands‑on experience designing and implementing a data integration hub.
- More than seven (7) years of proven experience in complex API‑based workflows, data governance, metadata, and lineage tooling.
- More than seven (7) years of experience integrating Informatica with Databricks as the hub’s transformation and curation engine.
- More than seven (7) years of experience designing hub‑based data publishing mechanisms for data warehouses and analytics.
- More than seven (7) years of proven experience with security standards (OAuth, SAML), data governance, performance tuning, and relevant cloud/integration certifications.
- More than seven (7) years of experience in event streaming platforms and cloud integration architectures.
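One requirement above calls for standardized API error handling, retries, and throttling. A minimal sketch of that pattern, as a reusable retry helper with exponential backoff and jitter (the function name, parameters, and defaults are illustrative assumptions, not part of the posting):

```python
import random
import time

def with_retries(fn, max_attempts=4, base_delay=0.5,
                 retry_on=(TimeoutError, ConnectionError), sleep=time.sleep):
    """Call fn(), retrying on transient errors with exponential backoff.

    A standardized wrapper like this keeps retry and throttling behavior
    consistent across API integrations instead of re-implementing it per client.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retry_on:
            if attempt == max_attempts:
                raise  # exhausted attempts: surface the error to the caller
            # Full jitter: randomize the delay to avoid synchronized retry storms.
            delay = random.uniform(0, base_delay * (2 ** (attempt - 1)))
            sleep(delay)
```

In practice the `fn` argument would wrap an actual REST/SOAP call, and `retry_on` would include the HTTP client's transient-error exceptions.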
- Design and develop service‑based data integration architectures.
- Build ELT pipelines with Databricks using PySpark and Databricks SQL, implement batch and incremental data loads—including CDC patterns—and create reusable, standardized integration patterns.
- Establish logical and physical integration layers, curation, and distribution.
- Work with cross‑functional teams (developers, product managers, etc.) to define integration requirements and ensure successful integration.
- Identify and resolve integration‑related issues, optimize performance, and ensure data security.
- Create and maintain clear and comprehensive documentation for internal users and stakeholders.
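The batch/incremental load and CDC responsibilities above can be sketched, independent of Databricks, as a watermark filter plus a keyed upsert/delete apply (the function names, row schema, and change-record format are illustrative assumptions):

```python
def incremental_batch(rows, last_watermark):
    """Select rows modified after the last watermark; return batch and new watermark."""
    batch = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in batch), default=last_watermark)
    return batch, new_watermark

def apply_cdc(target, changes):
    """Apply CDC change records (upserts and deletes) to a keyed target.

    target:  dict mapping primary key -> row
    changes: iterable of {"op": "upsert"|"delete", "key": ..., "row": ...}
    """
    for change in changes:
        if change["op"] == "upsert":
            target[change["key"]] = change["row"]
        elif change["op"] == "delete":
            target.pop(change["key"], None)
    return target
```

In a Databricks implementation the same two steps would typically be a filtered read against a stored watermark followed by a Delta Lake `MERGE INTO` keyed on the primary key.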