The Data & Analytics Architect defines and governs the end-to-end data architecture for enterprise platforms, enabling trusted data products, analytics, and reporting across multiple domains. The role designs cloud-agnostic data platform patterns (lakehouse, streaming, semantic layers), ensures interoperability across vendor solutions, and aligns data delivery with governance, security, and privacy requirements.
Working with stakeholders, technical teams, and multiple delivery vendors, the architect defines reference architectures, data modeling standards, and performance/availability targets, and provides assurance through design reviews and implementation checkpoints. The role ensures analytics outputs are reliable, scalable, and operationally supportable.
Duties & Responsibilities
- Establish patterns for data ingestion (APIs, events, IoT), transformation, and orchestration with portability in mind.
- Define data modeling standards: canonical models, domain-driven data products, dimensional modeling, and master data practices.
- Specify analytics architecture: semantic layer, KPI definitions, dashboards, and self-service analytics enablement.
- Define performance, scalability, and availability requirements for data workloads; establish partitioning, indexing, and cost-optimization strategies.
- Align data architecture with governance, privacy, and security requirements (classification, lineage, retention, encryption, DLP, access controls).
- Define reference designs for operational analytics, near-real-time reporting, and event-driven analytics.
- Govern integration of vendor-delivered data stores and BI/reporting tooling; ensure consistent data contracts and quality checks.
- Define and monitor a data quality framework: validations, monitoring, SLAs, issue workflows, and reporting on data health.
- Provide architecture assurance: review vendor designs, validate implementations, and support go-live readiness and disaster recovery planning.
Skills & Qualifications
- Strong knowledge of modern data architectures (lakehouse, streaming, ELT/ETL, semantic layers, data products).
- Ability to design scalable, secure, and cost-effective analytics solutions in multi-tenant, regulated environments.
- Proficiency in data modeling (dimensional + domain-oriented), data quality, and metadata/lineage practices.
- Experience defining actionable KPIs and ensuring consistent metric definitions across stakeholders.
- Strong documentation and communication skills to align vendors and business owners on data contracts and standards.
- Bachelor’s degree in Computer Science, Information Technology, Cybersecurity, or a related field; Master’s degree highly preferred.
- 8+ years in data architecture, analytics engineering, or BI platform roles in enterprise settings.
- Experience delivering data platforms in cloud/hybrid environments and integrating multiple vendor solutions.
- Hands‑on exposure to streaming/event‑driven data (Kafka/Event Hubs) and orchestration (Airflow/ADF-like tools).
- Experience with BI/semantic layers and large-scale dashboarding programs (Power BI/Tableau/Looker).
- Familiarity with governance in regulated environments (classification, retention, auditability, privacy-by-design).
Tools & Technologies
- Data platforms: Databricks/Spark, lakehouse table formats (Delta/Iceberg), object storage (Blob/S3/GCS)
- Streaming & orchestration: Kafka/Event Hubs, Airflow/ADF-style orchestration, dbt (optional)
- Analytics: Power BI (preferred), semantic models, SQL engines (Synapse/Trino/BigQuery equivalents)
Soft Skills
- Ability to align diverse stakeholders on a single "source of truth" for metrics and data definitions
- Strong analytical thinking and structured problem solving
- Pragmatic prioritization balancing performance, cost, and governance constraints
- Collaborative leadership across vendors and internal teams
- Clear communication of complex data concepts to non-technical audiences