Director, Sr. Information Architect
Listed on 2026-03-01
Software Development
Data Engineer
The Senior Information Architect will design, build, and maintain scalable, well‑structured data models that power analytics, reporting, and operational workflows across CIM's $35B+ portfolio spanning real estate equity, infrastructure, and private credit. This role establishes data modeling standards from the ground up, creating enterprise‑grade models within the Databricks Lakehouse platform and integrating with Snowflake and MongoDB where appropriate. As a key contributor to CIM's developing data architecture function, the Senior Information Architect will work closely with business teams—including Fund Accounting, FP&A, Investor Relations, Sales, and Investments—to translate complex requirements into high‑performing, future‑ready data structures that support analytics, governance, and operational needs across Azure and modern data ecosystems.
- Partner with business stakeholders across Fund Accounting, FP&A, Global Client Group, and Investments to gather data needs, understand pain points, and define use cases.
- Translate business requirements into clear, scalable data models, validating assumptions and ensuring alignment before solutioning.
- Collaborate with data analysts, data scientists, MLOps engineers, and application developers to confirm downstream requirements.
- Build trust by clearly explaining technical concepts and demonstrating measurable business value.
- Develop conceptual, logical, and physical data models for data warehouses, data lakes, operational data stores, and transactional systems.
- Create optimized dimensional models (star and snowflake schemas) for analytics and BI.
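To make the dimensional-modeling responsibility above concrete: in a star schema, a central fact table of measures references surrounding dimension tables by surrogate key, and BI queries join fact to dimensions. A toy sketch in Python (the fund names, columns, and NAV figures here are invented purely for illustration, not actual CIM data):

```python
# Dimension tables: descriptive attributes keyed by surrogate key
dim_fund = {1: {"fund_name": "Fund A", "strategy": "Real Estate"},
            2: {"fund_name": "Fund B", "strategy": "Private Credit"}}
dim_date = {20260131: {"year": 2026, "quarter": "Q1"}}

# Fact table: one row per measurement, with foreign keys into the dimensions
fact_nav = [
    {"fund_key": 1, "date_key": 20260131, "nav_usd": 1_200_000_000},
    {"fund_key": 2, "date_key": 20260131, "nav_usd": 800_000_000},
]

# A BI question like "NAV by strategy" becomes a fact-to-dimension join
nav_by_strategy = {}
for row in fact_nav:
    strategy = dim_fund[row["fund_key"]]["strategy"]
    nav_by_strategy[strategy] = nav_by_strategy.get(strategy, 0) + row["nav_usd"]

print(nav_by_strategy)
```

A snowflake schema differs only in that dimensions are further normalized into sub-dimensions; the fact table shape stays the same.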
- Apply a variety of modeling techniques (Kimball, Inmon/3NF, Data Vault, NoSQL patterns) depending on the initiative.
- Architect medallion‑layer structures (bronze/silver/gold) and define modeling standards within the Lakehouse environment.
- Optimize Delta Lake tables leveraging ACID transactions, time travel, schema evolution, Z‑ordering, and liquid clustering.
- Design balanced partitioning strategies to improve performance without over‑segmenting data.
- Implement Unity Catalog organization (catalogs, schemas, tables) for governance and multi‑domain data management.
- Support MLOps teams by designing models for ML feature stores and GenAI/RAG workloads.
- Build and maintain relational schemas for Snowflake and other RDBMS systems, integrating them with Lakehouse patterns.
- Design NoSQL structures for MongoDB, including document design, indexing, and query optimization.
- Establish a framework for deciding whether Databricks, Snowflake, or MongoDB is the correct platform based on workload characteristics.
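A platform-selection framework of the kind described above is essentially a set of routing rules over workload characteristics. A minimal sketch in Python — the criteria and thresholds here are hypothetical illustrations, not CIM's actual decision rules:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Hypothetical workload traits used to route a dataset to a platform."""
    flexible_schema: bool      # document-shaped data with varying structure?
    point_lookups: bool        # single-record reads/writes by key?
    heavy_transform: bool      # large-scale Spark batch/ML transformation?
    bi_concurrency: bool       # many concurrent SQL/BI consumers?

def choose_platform(w: Workload) -> str:
    """Toy routing logic: flexible-schema OLTP -> MongoDB,
    heavy transformation/ML -> Databricks, concurrent BI SQL -> Snowflake."""
    if w.flexible_schema and w.point_lookups:
        return "MongoDB"
    if w.heavy_transform:
        return "Databricks"
    if w.bi_concurrency:
        return "Snowflake"
    return "Databricks"  # default to the Lakehouse
```

In practice the framework would weigh more dimensions (latency, governance, cost, existing integrations), but encoding it as explicit rules keeps platform decisions consistent and reviewable.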
- Recommend and refine query optimization, indexing, and partitioning strategies.
- Identify and resolve issues such as data skew, excessive small files, and inefficient Spark joins.
- Partner with engineering teams on Delta Lake maintenance processes (OPTIMIZE, VACUUM, ANALYZE).
- Work closely with Data Engineers to ensure models integrate smoothly with ETL/ELT pipelines built with Auto Loader, Delta Live Tables, and Spark workflows.
- Provide guidance on data mapping, transformation logic, schema evolution, and load patterns.
- Design SCD implementations using Delta Lake MERGE operations and Change Data Feed.
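The SCD responsibility above typically means a Type 2 pattern: when a tracked attribute changes, expire the current dimension row and insert a new current version. In Delta Lake this is a single MERGE statement; the same logic is sketched below in plain Python so it can be read without a Spark cluster (the `investor_id`/`region` columns and dates are illustrative, not the actual schema):

```python
from datetime import date

def scd2_merge(dim_rows, updates, today=date(2026, 3, 1)):
    """SCD Type 2: expire the current row for each changed key and
    append a new current row. dim_rows and updates are lists of dicts."""
    out = []
    changed_keys = set()
    upd_by_key = {u["investor_id"]: u for u in updates}
    for row in dim_rows:
        u = upd_by_key.get(row["investor_id"])
        if row["is_current"] and u and u["region"] != row["region"]:
            # Close out the existing version (MERGE ... WHEN MATCHED UPDATE)
            out.append({**row, "is_current": False, "end_date": today})
            changed_keys.add(row["investor_id"])
        else:
            out.append(row)
    # Add the new current version (MERGE ... WHEN NOT MATCHED INSERT)
    for key in changed_keys:
        u = upd_by_key[key]
        out.append({"investor_id": key, "region": u["region"],
                    "start_date": today, "end_date": None, "is_current": True})
    return out
```

Delta Lake's Change Data Feed complements this by exposing row-level inserts/updates from source tables, so the merge can run incrementally rather than against full snapshots.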
- Establish modeling standards, naming conventions, metadata practices, and governance frameworks.
- Implement row- and column-level security with Unity Catalog for sensitive investor and fund data.
- Define and maintain end‑to‑end data lineage from ingestion through reporting.
- Help build a comprehensive metadata repository and data dictionary.
- Ensure compliance with regulatory and ethical data handling standards.
- Create and maintain ERDs, data dictionaries, lineage diagrams, and model documentation.
- Document Lakehouse design patterns and platform best practices for internal knowledge sharing.
- Identify data quality issues early in modeling processes and collaborate on…