
Data Engineer Denver

Job in Denver, Denver County, Colorado, 80285, USA
Listing for: Servicecore
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Salary/Wage Range: 80,000 – 100,000 USD per year
Job Description & How to Apply Below
Position: Data Engineer at Servicecore, Denver, CO

Data Engineer Company Overview

Service Core and Docket are rapidly growing field-service Software-as-a-Service platforms for the portable sanitation and dumpster industries, named #80 among the fastest-growing software companies in America on the Inc. 5000 list. The customers we serve have historically been underserved by software, making us the leading player in a huge industry with very little competition. Our software helps our incredibly hard-working business owners get more done and stress less.

How? By supercharging their businesses with software that cuts wasted time, manages jobs, optimizes routes, tracks inventory, and automates billing. We are proud to offer a one-stop solution that allows our hard-working customers to be more productive and successful!

We live by our core values: Love Our Customers, Be Real, Give a Shit, Deliver Results, and of course Keep It Fun. Service Core gives hard-working individuals the opportunity to work and grow within an agile, fast-paced start-up environment. We are proud of our accomplishments and take our jobs seriously while not taking ourselves too seriously.

Role Overview

Service Core is seeking a mid-level Data Engineer / Analytics Engineer to help build and scale our next-generation data models, semantic layer, and analytics foundation in Snowflake.

This role is ideal for someone who has 2–4 years of experience working with data warehouses, SQL, and data modeling, and who is ready to take the next step into modern data architecture. You’ll help design and build data marts for Service Core and Docket, support our embedded reporting layer (Sigma), and lay the groundwork for future AI and agentic data workflows.

You won’t be alone: this is a hands-on builder role with mentorship and technical guidance from senior members of the Data & Analytics team, including a Senior Data Engineer. You’ll grow into owning portions of our semantic layer and bringing best practices in data modeling, documentation, and transformation.

What You’ll Do
  • Build, optimize, and maintain data models and data marts in Snowflake for Service Core and Docket
  • Develop the foundational semantic layer that powers embedded reporting, dashboards, and ultimately AI/agentic use cases
  • Implement scalable ELT/ETL pipelines, transformations, and data cleaning logic
  • Partner with Data & Analytics leadership to define the canonical metrics and modeling standards used across products
  • Work closely with Product, Engineering, and Customer Success to understand reporting use cases and map them to data models
  • Support the upstream data needs for Sigma dashboards, customer-facing reporting, and internal analytics
  • Develop and maintain clear documentation for tables, schemas, and transformations
  • Monitor Snowflake performance and costs; tune queries, warehouses, and models as needed
  • Contribute to testing, data quality checks, validation frameworks, and model observability
  • Participate in code reviews, pairing sessions, and ongoing improvements to our data stack
  • Help prepare the warehouse and semantic layer for future AI enablement, including structured outputs, embeddings-ready modeling, and feature tables
  • Collaborate in planning, roadmap discussions, and architectural decisions as your expertise grows
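As a rough illustration of the transformation and data-cleaning work described above (not Servicecore's actual pipeline; the table and column names are hypothetical), a minimal ELT-style cleaning step in Python might look like this:

```python
from datetime import date

# Hypothetical raw service-job records as they might land from an
# extract step: inconsistent casing, missing values, string dates.
raw_jobs = [
    {"job_id": 1, "status": " Completed ", "revenue": "125.50", "job_date": "2025-03-01"},
    {"job_id": 2, "status": "completed", "revenue": None, "job_date": "2025-03-02"},
    {"job_id": 3, "status": "CANCELLED", "revenue": "0", "job_date": "2025-03-02"},
]

def clean_job(row):
    """Normalize one raw record into a shape ready for a data mart."""
    return {
        "job_id": row["job_id"],
        "status": row["status"].strip().lower(),
        "revenue": float(row["revenue"]) if row["revenue"] is not None else 0.0,
        "job_date": date.fromisoformat(row["job_date"]),
    }

cleaned = [clean_job(r) for r in raw_jobs]

# A toy data-mart aggregate: revenue per day for completed jobs only.
daily_revenue = {}
for row in cleaned:
    if row["status"] == "completed":
        daily_revenue[row["job_date"]] = (
            daily_revenue.get(row["job_date"], 0.0) + row["revenue"]
        )
```

In practice this logic would live in a warehouse transformation layer (e.g. dbt models in Snowflake) rather than application Python, but the shape of the work is the same: normalize, type-cast, and aggregate toward a reporting-ready model.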
What We’re Looking For
  • 2–4+ years of experience in data engineering, analytics engineering, BI engineering, or a similar role
  • Strong SQL skills, including writing performant queries and understanding joins, aggregations, and window functions
  • Proficiency in Python for data manipulation, automation scripts, and supporting ELT workflows
  • Experience modeling data in a data warehouse environment
  • Familiarity with common modeling approaches (star schema, snowflake schema, data marts, semantic modeling concepts)
  • Experience with at least one modern cloud data warehouse (Snowflake preferred; Redshift, BigQuery, or Synapse acceptable)
  • Experience building or maintaining ELT/ETL pipelines using any toolset (Terraform, dbt, Fivetran, Matillion, Airflow, SSIS, custom Python, etc.)
  • Familiarity with BI tools such as Sigma, Looker, Tableau, or Power BI
  • Ability to break down unclear requirements into structured technical plans
  • Curiosity,…
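For a sense of the window-function SQL the requirements mention, here is a small self-contained sketch using Python's built-in sqlite3 (SQLite 3.25+ supports window functions); the table and column names are illustrative only, not part of the posting:

```python
import sqlite3

# In-memory database with a toy invoices table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("bravo", 75.0)],
)

# SUM(...) OVER (PARTITION BY ...) computes a per-customer running total
# without collapsing rows, which a plain GROUP BY would do.
rows = conn.execute(
    """
    SELECT customer,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY rowid
           ) AS running_total
    FROM invoices
    ORDER BY customer, rowid
    """
).fetchall()
```

The same PARTITION BY / ORDER BY pattern carries over directly to Snowflake window functions.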