
Snowflake Architect/Modeler (Azure Focus)

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: xCroTek
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: 100,000 - 125,000 USD per year
Job Description & How to Apply Below
Position: Snowflake Architect/Modeler (Azure Focus)
Location: Snowflake

About xCroTek

xCroTek is a product- and service-based AI software company that builds innovative, AI-powered solutions to drive intelligent automation and digital transformation. We specialize in creating scalable, impactful technologies for modern businesses.

Job Overview

We are looking for a skilled Snowflake Architect/Modeler with 5-7 years of experience to lead the design and implementation of scalable data solutions on the Snowflake platform hosted on Azure. In this role, you will architect data pipelines adhering to the Medallion Architecture (Bronze, Silver, Gold layers), research legacy source systems, and create optimized data models to enable robust analytics and business intelligence.

The ideal candidate excels at translating complex business requirements into technical designs, with a strong emphasis on data governance, performance, and seamless integration with Azure services. This position offers the chance to drive data modernization initiatives in a collaborative, Azure-centric environment.

Key Responsibilities
  • Architect comprehensive Snowflake solutions following the Medallion Architecture framework, including ingestion into Bronze (raw), transformation to Silver (refined), and curation into Gold (aggregated) layers.
  • Research and analyze source systems (e.g., on-premises databases, SaaS applications) to document tables, columns, relationships, and the underlying business logic in existing data processing workflows.
  • Design Snowflake-compatible data models using Erwin for conceptual, logical, and physical modeling; generate DDL scripts; and create detailed source-to-target mappings for ETL/ELT processes.
  • Develop and optimize data ingestion pipelines using Snowflake features such as Snowpipe for continuous loading, Streams for change data capture, Tasks for scheduling, and Dynamic Tables for materialization.
  • Collaborate with stakeholders, data engineers, and Azure teams to define requirements, ensure data lineage, and implement secure, compliant data flows integrated with Azure services like Azure Data Factory, Synapse, and Blob Storage.
  • Perform data quality assessments, tuning, and optimization of queries and warehouses to support high-volume analytics on Azure.
  • Lead data migration and modernization projects from legacy systems to Snowflake on Azure, ensuring minimal downtime and adherence to best practices.
  • Document architectures, mappings, and designs; conduct peer reviews; and mentor junior team members on Snowflake and Azure integrations.
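The Bronze/Silver/Gold flow named in the responsibilities above can be sketched in miniature. The following is a conceptual illustration only, using plain Python structures in place of Snowflake tables; the record fields, layer functions, and sample data are assumptions for illustration, not part of this posting.

```python
# Bronze: raw records landed as-is (in Snowflake, e.g. via Snowpipe).
bronze = [
    {"order_id": "1", "amount": " 10.50 ", "region": "west"},
    {"order_id": "2", "amount": "7.25", "region": "West"},
    {"order_id": "2", "amount": "7.25", "region": "West"},  # duplicate load
]

def to_silver(rows):
    """Silver: cleanse, type, and deduplicate the raw records."""
    seen, silver = set(), []
    for r in rows:
        key = r["order_id"]
        if key in seen:
            continue  # drop duplicate ingestions
        seen.add(key)
        silver.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"].strip()),
            "region": r["region"].strip().lower(),
        })
    return silver

def to_gold(rows):
    """Gold: aggregate refined records for analytics consumption."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'west': 17.75}
```

In a real Snowflake deployment each layer would be a schema of tables, with Streams and Tasks (or Dynamic Tables) driving the Bronze-to-Silver and Silver-to-Gold transformations incrementally rather than as batch functions.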
Required Qualifications
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 5-7 years of experience in data architecture, modeling, and engineering, with at least 3 years hands‑on with Snowflake on Azure.
  • Expertise in Medallion Architecture implementation, including layering strategies for data ingestion, transformation, and consumption.
  • Proven ability to reverse‑engineer source systems, capturing metadata (tables, columns) and business rules for accurate data modeling.
  • Proficiency in data modeling tools like Erwin for designing relational and dimensional models; experience generating DDLs and creating source‑to‑target mapping documents.
  • Strong hands-on experience designing and developing Snowflake native features: Snowpipe, Streams, Tasks, Time Travel, and clustering for Azure-hosted environments.
  • In-depth knowledge of SQL for advanced querying, optimization, and Azure Snowflake integrations (e.g., Azure AD for authentication, external stages).
  • Familiarity with Azure ecosystem, including Data Lake, Synapse Analytics, and security/compliance standards (e.g., Azure Sentinel, GDPR).
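The reverse-engineering skill listed above (capturing tables, columns, and types from a source system) can be illustrated with a short sketch. This uses SQLite from the Python standard library purely as a stand-in for a legacy source database; the table and column names are hypothetical.

```python
import sqlite3

# In-memory stand-in for a legacy source system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, signup_date TEXT)"
)

def capture_metadata(conn):
    """Return {table: [(column, declared_type), ...]} for every user table."""
    meta = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        meta[table] = [(c[1], c[2]) for c in cols]
    return meta

print(capture_metadata(conn))
# {'customers': [('id', 'INTEGER'), ('name', 'TEXT'), ('signup_date', 'TEXT')]}
```

Against a real source system the same idea applies with that platform's catalog (e.g., `INFORMATION_SCHEMA` views); the captured metadata then feeds the Erwin models and source-to-target mappings described earlier.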
Preferred Qualifications
  • Experience with ETL/ELT tools like dbt, Matillion, or Azure Data Factory for orchestrating pipelines in a Medallion setup.
  • Background in big data technologies (e.g., Spark on Azure Databricks) integrated with Snowflake.
  • Snowflake certifications (e.g., SnowPro Core, Advanced Architect) and Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
  • Knowledge of data governance tools like Collibra or Alation for lineage and cataloging in Azure environments.
Technical Skills
  • Core: Snowflake…