
Analytics Engineer

Remote / Online - Candidates ideally in
Winnipeg, Manitoba, Canada
Listing for: LotLinx
Remote/Work from Home position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below

Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers with cutting-edge data and technology, delivering a distinct market advantage for every vehicle transaction. Today, we stand as the foremost automotive AI- and machine-learning-powered technology platform, excelling in digital marketing, risk management, and strategic inventory management.

Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.

Job Summary

We are seeking an experienced Analytics Engineer to join our growing Data team. In this role, you will architect, build, and optimize the data foundations that directly power our customer-facing products. Unlike traditional internal-only BI roles, your work here is the product. You will be responsible for pre-processing, integrating, and rigorously modeling complex data from a multitude of external and internal sources, transforming it into the highly reliable, performant datasets our customers use to drive their business decisions.

Reporting to the Director of Data Analytics, you will collaborate closely with Product Managers, Data Engineers, and Software Engineers to seamlessly integrate these data models into our product suite. This is a key position where you'll have significant ownership over pipelines that directly impact our customers' success and our bottom line.

Why Join Us?

Direct Impact: You won't be stuck maintaining internal dashboards. Your work drives our customer-facing products, and your architectural decisions will have a highly visible, direct impact on our bottom line.

Autonomy & Fast Shipping: We dislike red tape just as much as you do. We trust our engineers. You will have the autonomy to make high-level technical decisions and ship code quickly without waiting on month-long committee approvals.

Outcome-Driven Culture: We measure success by the impact of your code, not the hours you sit at your desk. We value an async-friendly approach that treats you like an adult.

The Best of Both Worlds (Hybrid Flexibility): Enjoy the perk of working from home two days a week, while joining your team for three days of in-person collaboration at our Winnipeg, Hamilton, or Vancouver offices.

Top-Tier Developer Experience

To ensure you have the best tools for the job, we provide top-of-the-line laptops and let you choose your preferred hardware environment (Mac, Windows, or Linux). Furthermore, we actively encourage leveraging cutting-edge technologies, including LLMs and AI coding assistants, to supercharge your daily workflows.

Key Responsibilities

  • Architect Product-Facing Data Models: Design, develop, and maintain scalable data models in our data warehouse (Google BigQuery) using modern transformation frameworks to serve as the backend engine for our customer-facing applications.
  • Pre-process & Integrate Disparate Sources: Build robust logic to ingest, clean, and unify messy data from a wide variety of third-party APIs, external systems, and internal databases.
  • Data Validation & QA: Conduct rigorous data validation and testing across massive datasets (billions of rows, terabytes) to ensure pipeline integrity. Because this data is customer-facing, zero downtime and high data fidelity are critical.
  • Develop Large Data Pipelines: Develop, monitor, and troubleshoot ELT/ETL pipelines processing high-volume data streams, ensuring reliability and SLA adherence at the terabyte scale.
  • Optimize Pipeline Performance: Write, tune, and debug complex SQL queries. Analyze execution plans and usage patterns to optimize performance and cost-efficiency across multi-terabyte datasets in BigQuery and Apache Pinot, ensuring low-latency product experiences.
  • Champion Data Governance & CI/CD: Implement strict data quality checks, automated testing frameworks, and maintain CI/CD best practices to ensure the trustworthiness of our production data assets.
  • Collaborate & Mentor: Work effectively within a collaborative, cross-functional product environment, mentor junior team members, and advocate for the adoption of new analytics engineering best practices.
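The data validation and quality-check responsibilities above can be illustrated with a minimal sketch. Everything here is hypothetical: the check names, the sample vehicle rows, and the rule set are illustrative examples, not Lotlinx's actual schema or pipeline.

```python
# Minimal sketch of automated data-quality checks of the kind a pipeline
# might run before publishing a dataset. All names and sample data are
# hypothetical, for illustration only.

def check_not_null(rows, column):
    """Fail if any row has a None/NULL value in `column`."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"not_null:{column}", "passed": not bad, "failing_rows": bad}

def check_unique(rows, column):
    """Fail if `column` contains duplicate values."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            dupes.append(i)
        seen.add(value)
    return {"check": f"unique:{column}", "passed": not dupes, "failing_rows": dupes}

def run_checks(rows, checks):
    """Run every check; a real pipeline would block deployment on failure."""
    results = [check(rows) for check in checks]
    return results, all(r["passed"] for r in results)

# Example: two listings sharing a VIN should fail the uniqueness check.
vehicles = [
    {"vin": "1HGCM82633A004352", "price": 18995},
    {"vin": "1HGCM82633A004352", "price": 17500},
]
results, ok = run_checks(
    vehicles,
    [lambda r: check_not_null(r, "price"), lambda r: check_unique(r, "vin")],
)
```

In a production setting, checks like these typically run inside the transformation framework itself (for example, as tests executed in CI before a model is deployed), so failing data never reaches customer-facing tables.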

Qualifications

  • Experience: 3+ years of professional experience in Analytics Engineering, Data Engineering, or a highly related role, with a proven track record of managing complex data systems, preferably powering user-facing applications.
  • Advanced SQL & Optimization: Deep expertise in writing, tuning, and debugging complex SQL in multi-terabyte cloud environments (BigQuery, Apache Pinot). Strong understanding of query execution plans, partitioning, and clustering.
  • Data Modeling Mastery: Practical experience designing and implementing warehouse schemas tailored for both analytical processing and application…