Data Domain Architect Lead - Reporting, Insights, and Analytics Team
Location: Plano, Collin County, Texas, 75086, USA
Company: JPMorgan Chase & Co.
Position: Full Time
Listed on: 2026-03-05
Job Specializations:
- IT/Tech: Data Engineer, Data Science Manager
Job Description
Join a team where you will lead the end‑to‑end architecture for FORCE Reporting, Insights & Analytics, own the FORCE data schema, enforce governance, and deliver a scalable, high‑performance lakehouse that powers advanced dashboards and analytics.
As Data Domain Architect Lead, within the Functions, Operations, Real Estate, Culture and Expenditure (FORCE) Reporting, Insights and Analytics Team, you will partner with Product, Technology, P&A, CFO, D&A, Controllers, and global teams to convert business needs into governed, resilient designs.
Job Responsibilities:
- Publish the architecture roadmap and decisions; define medallion layering; standardize patterns and provide guidance on tuning Spark/Delta (partitioning, OPTIMIZE, ZORDER, caching) and on orchestration for reliability.
- Standardize Unity Catalog usage, Delta Lake schema evolution, vacuum/auto-compaction, and Databricks Workflows; enforce lineage/metadata capture, RBAC/ABAC, encryption, PII handling, and DUC alignment.
- Convert requirements into designs, run cross-site design reviews, and maintain shared documentation across geographies.
- Support manual-file automation and post-automation retirements; assist in NLQ pilots and in developing LLM-assisted workflows.
- Run workshops; publish runbooks, coding standards, and architecture playbooks.
- Mentor talent across the partner organizations and tie delivery to measurable OKRs.
- Manage one direct report and mentor the broader architecture/modeling community.
Required Qualifications:
- 10+ years in data architecture/engineering; 5+ years leading enterprise analytics architecture.
- Expert Databricks/Spark/SQL and Delta Lake performance patterns (including partitioning, OPTIMIZE, ZORDER, caching, vacuum).
- Unity Catalog lineage/metadata, RBAC/ABAC, encryption, regulated‑industry governance.
- Proven delivery of latency/cycle‑time targets at scale.
- Experience integrating BI platforms (ThoughtSpot/Tableau/Sigma) into governed lakehouse ecosystems.
- Stakeholder leadership, executive communication, and Agile delivery literacy.
- Python, PySpark, and SQL fluency; Databricks Workflows/Airflow.
- FinOps/cost optimization; familiarity with NLQ and LLM-assisted documentation.
- Travel: 1%–10%.
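The Delta Lake maintenance operations the listing names (OPTIMIZE, ZORDER, VACUUM) are typically run as routine SQL statements on a schedule. As an illustrative sketch only — the table name and column choices below are hypothetical, not taken from the listing — a small helper that builds those statements might look like:

```python
def delta_maintenance_statements(table, zorder_cols, retain_hours=168):
    """Build routine Delta Lake maintenance SQL for one table.

    OPTIMIZE compacts small files, and ZORDER BY co-locates rows on the
    given columns to speed up selective reads; VACUUM removes stale data
    files older than the retention window left behind by updates/deletes.
    """
    return [
        f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})",
        f"VACUUM {table} RETAIN {retain_hours} HOURS",
    ]

# Hypothetical gold-layer table; in practice each statement would be run
# via spark.sql(...) inside a scheduled Databricks Workflow.
stmts = delta_maintenance_statements(
    "force_gold.expense_facts", ["region", "cost_center"]
)
```

The 168-hour retention shown is merely Delta Lake's conventional one-week default; an actual deployment would set it per the team's recovery and governance requirements.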