
Data Platform Engineer

Job in Winnipeg, Manitoba, Canada
Listing for: King River Capital Group
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer
Job Description & How to Apply Below
Position: Staff Data Platform Engineer

Location

Remote, Canada, New York City Office, Montreal Office

Employment Type

Full time

Location Type

Hybrid

Department

Engineering

OXIO is the world’s first telecom-as-a-service (TaaS) platform. We are democratizing telecom and making it easy for brands and enterprises to fully own and operate proprietary mobile networks designed to support their own customers’ needs. Our TaaS solution combines multiple existing networks into one single platform that can be seamlessly managed in the cloud as a modern SaaS offering. And it gets better: with full network access comes unparalleled business intelligence and insights to help enterprises better understand customer and machine (M2M) behavior.

With a continuous focus on innovation, any company can build a powerful telecom presence with OXIO while gleaning unique customer insights like never before.

Job Description:

We’re seeking a seasoned Staff Data Engineer to lead the design, development, and scaling of our modern data platform. This role is ideal for someone who thrives in building and designing robust data systems. You’ll be instrumental in shaping our data infrastructure, driving governance, and building scalable APIs that power real-time and batch analytics.

You will also play a pivotal role in evaluating, designing, and migrating our organization toward a North Star data architecture: a future-proof foundation that supports scalable, secure, and intelligent data operations across the enterprise.

This role supports a wide range of analytics use cases across telecom networking, product intelligence, financial reporting, and internal/external data insights. You’ll help build a comprehensive Customer 360 platform powered by ML models and behavioral data, enabling advanced use cases such as fraud detection, brand intelligence, and personalized customer engagement.

Key Responsibilities
  • Architect, build, and scale a unified data platform integrating internal and external sources into data lakes and warehouses
  • Design and implement streaming and batch data pipelines using tools like Spark, Airflow, and dbt
  • Lead infrastructure provisioning using Terraform and Kubernetes, ensuring scalable and secure deployments
  • Evaluate and drive the migration to our North Star data architecture, aligning platform capabilities with long-term business goals
  • Collaborate with cross‑functional teams to define logging standards, data contracts, and consumption patterns
  • Drive best practices in data governance, privacy compliance (GDPR, CCPA), and metadata management
  • Build APIs and data services to support real‑time activation and client‑facing data sharing
  • Participate in strategic architectural decisions and long‑term data roadmap planning
  • Evangelize modern data platform and engineering practices across the organization
  • Mentor junior engineers and contribute to hiring and team growth
Required Qualifications
  • 15+ years of experience in software engineering with a strong focus on data systems and platform architecture
  • 10+ years of experience building data solutions using data lakehouse architectures and self‑serve data platforms that empower scalable analytics across teams
  • 10+ years of hands‑on programming experience with Scala, Go, Python, and SQL, building robust, high‑performance data systems
  • 5+ years of experience building scalable data solutions using Python, Spark, and Terraform
  • Proven experience building serverless and scalable ML infrastructure to support large‑scale data processing, ideally in telecom or similarly high‑throughput environments
  • Deep expertise in cloud-native data stacks (AWS, Databricks, Snowflake, BigQuery, Synapse)
  • Strong understanding of microservices architecture, event‑driven systems (Kafka), and container orchestration (Kubernetes)
  • Experience building operational tools and defining best practices to improve operational efficiency and developer experience—including evaluating and adopting AI‑based tools for engineering productivity
  • Experience with orchestration tools like Airflow or Prefect and transformation tools like dbt
  • Proficient in building and scaling APIs (REST/gRPC) for data access and activation
  • Familiarity…