
Director Sr. Information Architect

Job in California, Moniteau County, Missouri, 65018, USA
Listing for: Ledgent Technology
Full Time position
Listed on 2026-03-01
Job specializations:
  • Software Development
    Data Engineer
Salary/Wage Range: 100,000 – 125,000 USD per year
Job Description & How to Apply Below
Location: California

The Senior Information Architect will design, build, and maintain scalable, well‑structured data models that power analytics, reporting, and operational workflows across CIM's $35B+ portfolio spanning real estate equity, infrastructure, and private credit. This role establishes data modeling standards from the ground up, creating enterprise‑grade models within the Databricks Lakehouse platform and integrating with Snowflake and MongoDB where appropriate. As a key contributor to CIM's developing data architecture function, the Senior Information Architect will work closely with business teams—including Fund Accounting, FP&A, Investor Relations, Sales, and Investments—to translate complex requirements into high‑performing, future‑ready data structures that support analytics, governance, and operational needs across Azure and modern data ecosystems.

Responsibilities
Business Partnership & Requirements Discovery
  • Partner with business stakeholders across Fund Accounting, FP&A, Global Client Group, and Investments to gather data needs, understand pain points, and define use cases.
  • Translate business requirements into clear, scalable data models, validating assumptions and ensuring alignment before solutioning.
  • Collaborate with data analysts, data scientists, MLOps engineers, and application developers to confirm downstream requirements.
  • Build trust by clearly explaining technical concepts and demonstrating measurable business value.
Data Model Design & Development
  • Develop conceptual, logical, and physical data models for data warehouses, data lakes, operational data stores, and transactional systems.
  • Create optimized dimensional models (star and snowflake schemas) for analytics and BI.
  • Apply a variety of modeling techniques (Kimball, Inmon/3NF, Data Vault, NoSQL patterns) depending on the initiative.
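As a minimal sketch of the star-schema pattern named above, the following uses SQLite in place of a warehouse; all table and column names (dim_fund, dim_date, fact_nav) are illustrative assumptions, not CIM's schema.

```python
import sqlite3

# Minimal star schema: one fact table with foreign keys into two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_fund (fund_key INTEGER PRIMARY KEY, fund_name TEXT, strategy TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT, year INTEGER);
CREATE TABLE fact_nav (
    fund_key INTEGER REFERENCES dim_fund(fund_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    nav_usd  REAL
);
""")
conn.executemany("INSERT INTO dim_fund VALUES (?,?,?)",
                 [(1, "Fund A", "Real Estate"), (2, "Fund B", "Credit")])
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(20260101, "2026-01-01", 2026)])
conn.executemany("INSERT INTO fact_nav VALUES (?,?,?)",
                 [(1, 20260101, 1_000_000.0), (2, 20260101, 250_000.0)])

# A typical BI query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
SELECT f.strategy, SUM(n.nav_usd)
FROM fact_nav n JOIN dim_fund f USING (fund_key)
GROUP BY f.strategy ORDER BY f.strategy
""").fetchall()
print(rows)  # [('Credit', 250000.0), ('Real Estate', 1000000.0)]
```

The same shape generalizes to a snowflake schema by further normalizing the dimensions (e.g. splitting strategy into its own table).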
Databricks Lakehouse Architecture
  • Architect medallion‑layer structures (bronze/silver/gold) and define modeling standards within the Lakehouse environment.
  • Optimize Delta Lake tables leveraging ACID transactions, time travel, schema evolution, Z‑ordering, and liquid clustering.
  • Design balanced partitioning strategies to improve performance without over‑segmenting data.
  • Implement Unity Catalog organization (catalogs, schemas, tables) for governance and multi‑domain data management.
  • Support MLOps teams by designing models for ML feature stores and GenAI/RAG workloads.
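The bronze/silver/gold flow above can be sketched with plain Python lists standing in for Delta tables; record shapes and field names here are illustrative assumptions only.

```python
# Bronze: raw ingested records, kept as-is (including malformed rows).
bronze = [
    {"fund": "Fund A", "nav": "1000000", "date": "2026-01-01"},
    {"fund": "Fund B", "nav": "bad",     "date": "2026-01-01"},  # malformed
    {"fund": "Fund A", "nav": "1050000", "date": "2026-02-01"},
]

def to_silver(rows):
    """Silver: validated, typed records; malformed rows are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"fund": r["fund"], "nav": float(r["nav"]), "date": r["date"]})
        except ValueError:
            continue  # a real pipeline would route these to a quarantine table
    return out

def to_gold(rows):
    """Gold: business-level aggregate (latest NAV per fund)."""
    latest = {}
    for r in sorted(rows, key=lambda r: r["date"]):
        latest[r["fund"]] = r["nav"]
    return latest

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'Fund A': 1050000.0}
```

In the actual Lakehouse each stage would be a Delta table, with the silver and gold steps expressed as Spark transformations rather than Python loops.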
Multi‑Platform Data Architecture
  • Build and maintain relational schemas for Snowflake and other RDBMS systems, integrating them with Lakehouse patterns.
  • Design NoSQL structures for MongoDB, including document design, indexing, and query optimization.
  • Establish a framework for deciding whether Databricks, Snowflake, or MongoDB is the correct platform based on workload characteristics.
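A platform-selection framework like the one described could start as a simple rule table; the attributes and routing rules below are illustrative assumptions, not an actual decision matrix.

```python
def choose_platform(workload: dict) -> str:
    """Toy decision rules for routing a workload to a platform.
    Attributes and thresholds are illustrative only."""
    if workload.get("document_model") or workload.get("flexible_schema"):
        return "MongoDB"      # document/NoSQL access patterns
    if workload.get("ml_or_streaming"):
        return "Databricks"   # Spark-native ML and streaming workloads
    return "Snowflake"        # default for relational BI/analytics

print(choose_platform({"document_model": True}))   # MongoDB
print(choose_platform({"ml_or_streaming": True}))  # Databricks
print(choose_platform({}))                         # Snowflake
```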
Performance Optimization
  • Recommend and refine query optimization, indexing, and partitioning strategies.
  • Identify and resolve issues such as data skew, excessive small files, and inefficient Spark joins.
  • Partner with engineering teams on Delta Lake maintenance processes (OPTIMIZE, VACUUM, ANALYZE).
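Data skew of the kind mentioned above is often diagnosed by comparing partition sizes; a minimal sketch, assuming a "largest partition vs. mean" heuristic (the 3x threshold is a rule of thumb, not a Spark default):

```python
from statistics import mean

def skew_ratio(partition_sizes):
    """Ratio of the largest partition to the mean partition size.
    Values much above ~3x often signal skew worth repartitioning."""
    return max(partition_sizes) / mean(partition_sizes)

sizes = [100, 110, 95, 105, 900]  # one hot partition
print(round(skew_ratio(sizes), 2))  # 3.44
```

In Spark the input would come from partition-level metrics (e.g. row counts per partition) rather than a hand-written list.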
ETL/ELT Collaboration & Pipeline Alignment
  • Work closely with Data Engineers to ensure models integrate smoothly with ETL/ELT pipelines built with Auto Loader, Delta Live Tables, and Spark workflows.
  • Provide guidance on data mapping, transformation logic, schema evolution, and load patterns.
  • Design SCD implementations using Delta Lake MERGE operations and Change Data Feed.
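The SCD Type 2 pattern referenced above can be sketched in pure Python over a list-of-dicts "dimension table"; this mirrors the effect of a Delta Lake MERGE (close the current row, insert a new current row), with field names that are illustrative assumptions.

```python
from datetime import date

def scd2_merge(dim, updates, today):
    """Apply SCD Type 2 updates: expire the changed current row and
    append a new current version. Field names are illustrative."""
    for upd in updates:
        for row in dim:
            if row["key"] == upd["key"] and row["current"] and row["attr"] != upd["attr"]:
                row["current"] = False
                row["end_date"] = today
                dim.append({"key": upd["key"], "attr": upd["attr"],
                            "start_date": today, "end_date": None, "current": True})
                break
    return dim

dim = [{"key": 1, "attr": "Growth", "start_date": date(2025, 1, 1),
        "end_date": None, "current": True}]
dim = scd2_merge(dim, [{"key": 1, "attr": "Value"}], date(2026, 3, 1))
current = [r for r in dim if r["current"]]
print(len(dim), current[0]["attr"])  # 2 Value
```

In Delta Lake the same logic would be a single MERGE statement matching on the business key and `current = true`, with Change Data Feed supplying the incoming updates.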
Data Governance & Standards
  • Establish modeling standards, naming conventions, metadata practices, and governance frameworks.
  • Implement row- and column-level security with Unity Catalog for sensitive investor and fund data.
  • Define and maintain end‑to‑end data lineage from ingestion through reporting.
  • Help build a comprehensive metadata repository and data dictionary.
  • Ensure compliance with regulatory and ethical data handling standards.
Documentation & Knowledge Sharing
  • Create and maintain ERDs, data dictionaries, lineage diagrams, and model documentation.
  • Document Lakehouse design patterns and platform best practices for internal knowledge sharing.
  • Identify data quality issues early in modeling processes and collaborate on…