
Technical Data Architect, Platform & Data Products

Job in Calgary, Alberta, D3J, Canada
Listing for: Kinaxis
Full Time position
Listed on 2026-01-13
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Analyst
Job Description & How to Apply Below
About Kinaxis
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it’s really our people who give us passion to always seek ways to do things better. As such, we’re serious about your career growth and professional development, because People matter at Kinaxis.

In 1984, we started out as a team of three engineers. Today, we have grown to become a global organization with over 2000 employees around the world, with a brand-new HQ based in Kanata North in Ottawa. As one of Canada’s Top Employers, we are proud to work with our customers and employees towards solving some of the biggest challenges facing supply chains today.

At Kinaxis, we power the world’s supply chains to help preserve the planet’s resources and enrich the human experience. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries, with more than 40,000 users in over 100 countries. We are expanding our team as we continue to innovate and revolutionize how we support our customers.

Location
Ottawa and Toronto, Canada (Hybrid)

Other locations in Canada - Remote

About The Team
The Data Architect is a seasoned professional responsible for designing and managing data architecture at Kinaxis, including defining how data is stored, accessed, processed, and managed. The incumbent will lead the design and implementation of data solutions that drive and support our global business by enabling intelligent decision making. As a technical expert, they will collaborate with stakeholders to understand requirements, design data solutions that support organizational goals, and develop a world-class data environment.

What you will do

Define the modern cloud-based data architecture (on the Databricks Lakehouse and GCP/Azure) with cross-functional teams, enabling consistency and scalability.

Architect and operationalize Medallion (Bronze–Silver–Gold) data models, ensuring governance, data quality, and reusability across analytical and ML workloads.

Combine data engineering and architecture expertise to design, develop, and deploy modern data warehouse and Lakehouse solutions, driving initiatives in data preparation, integration, exploration, and modeling.

Participate in Architecture Review Boards, interface with other Cloud Architects, and act as the technical liaison for the department.

Design, develop, and deliver complete data solutions that combine strong engineering execution with architectural design, managing pipelines, models, and workflows from ingestion through presentation while ensuring performance and cost efficiency.

Facilitate discovery workshops, articulate solution value, build roadmaps, and communicate architectural strategy and business impact through data storytelling.

Establish best practices and consistency across various Data on Cloud solutions.

Design and build highly reusable and scalable cloud-native data ecosystems.

Assist in the development of comprehensive and strategic business cases used at management and executive levels for funding and scoping decisions on data solutions.

Understand data technology trends and the practical application of existing, new, and emerging technologies to enable new and evolving business needs.

Partner closely with leadership and business stakeholders as a trusted and influential evangelist to identify important questions, define key metrics, and cultivate a data-driven, AI-ready culture.

Design and implement data security measures to protect sensitive information.

Work collaboratively with Data and Cloud Governance specialists to align on rules, processes, and standards. Implement strong governance through Unity Catalog, metadata management, and lineage tracking, while maintaining clear documentation of data architecture, flows, and dependencies.

Design the lifecycle management process to handle Python package and API dependency version changes seamlessly.
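The Medallion responsibility above can be sketched in miniature. The following is a minimal, dependency-free Python illustration of the Bronze → Silver → Gold flow; all record fields, source names, and quality rules here are hypothetical assumptions for illustration, not Kinaxis specifics, and a production implementation would run on Databricks Delta tables rather than plain Python lists.

```python
# Minimal sketch of a Medallion (Bronze-Silver-Gold) flow.
# Field names and quality rules are illustrative assumptions only.

def to_bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{"source": "orders_api", **row} for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: enforce data quality - drop rows missing a key, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # quality gate: reject incomplete records
        silver.append({
            "order_id": int(row["order_id"]),
            "amount": float(row.get("amount", 0.0)),
            "region": str(row.get("region", "unknown")).lower(),
        })
    return silver

def to_gold(silver_rows):
    """Gold: aggregate into an analytics-ready model (revenue by region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [
    {"order_id": "1", "amount": "19.5", "region": "EMEA"},
    {"order_id": None, "amount": "5.0", "region": "APAC"},  # fails quality gate
    {"order_id": "2", "amount": "30.5", "region": "emea"},
]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # → {'emea': 50.0}
```

Each layer only consumes the one before it, which is what makes the model reusable: analytical and ML workloads both read from Silver or Gold without re-cleaning the raw landing data.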

Technologies we use

Programming languages: SQL, Python, PowerShell

Orchestration/Data Integration:
Airflow, Informatica, Dagster, dltHub and others

Databases:
Databricks, Snowflake, Google BigQuery, SQL Server, PostgreSQL

CI/CD:
G…