
Senior Analytics Engineer

Job in Louisville, Jefferson County, Kentucky, 40201, USA
Listing for: Jobs via Dice
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Analyst, Data Science Manager, Data Engineer, Data Security
Salary/Wage Range: USD 60,000 - 80,000 per year
Job Description & How to Apply Below

Join to apply for the Senior Analytics Engineer role at Jobs via Dice

Job#: 3017175

Job Title: Senior Analytics Engineer
Location: Louisville, KY (Hybrid, 3-5 days/week)
Reports to: Chief Data Officer
Employment Type: Full-Time

About the Role

We're looking for someone who can go deep on modern data stack technology (Snowflake, dbt, semantic layers) while also understanding the business problems that data is meant to solve. You'll work directly with the CDO and collaborate closely with a small, high-impact analytics team where everyone contributes to shared infrastructure while owning their domain end-to-end. This role has a path to leadership for the right person.

Domain Ownership

You'll be the Directly Responsible Individual (DRI) for one or more business domains. Available domains include:

  • Contracts & Controls (in-force contracts, compliance monitoring)
  • Finance Models (revenue/expense, variable compensation, FP&A)
  • Non-Finance Models (Sales, Marketing, Operations, Know Your Customer, Direct to Consumer)

Domain ownership means you're responsible for data quality, governance, pipeline reliability, modeling decisions, and delivering insights that actually move the business. Across all domains, you'll leverage AI and intelligence capabilities—semantic layers, self-service analytics, and business enablers—to deliver solutions that scale. No one is on an island—we collaborate on infrastructure and rally around problems together—but you'll have real accountability and autonomy over your domain.

Day-to-Day Work
  • Build and maintain dbt models that transform raw data from Salesforce, NetSuite, DTCC, and internal systems into business-ready datasets (a minimal sketch of such a model follows this list)
  • Design and implement semantic layer definitions (metrics, dimensions, entities) that power both BI tools and AI/LLM interfaces
  • Own data quality for your domain—implement tests, monitoring, and alerting that catch issues before stakeholders do
  • Partner directly with business stakeholders to understand their problems and translate them into analytics solutions
  • Use AI coding assistants (GitHub Copilot, Claude, dbt Copilot) throughout the development lifecycle—we expect you to leverage these tools to move faster and maintain quality
  • Contribute to shared infrastructure: pipeline orchestration, CI/CD, observability, governance
  • Develop data products that solve real business problems—this could include AI-powered document processing, self-service analytics tools, or embedded partner BI
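
As a minimal sketch of the first bullet above: a dbt staging model typically wraps a raw source, renames and types columns, and filters obvious junk. Every name here (the raw_salesforce source, the opportunity table, the column list) is hypothetical, not taken from this team's actual codebase:

```sql
-- models/staging/stg_salesforce__opportunities.sql
-- Hypothetical dbt staging model (illustrative names only): turns a raw
-- Salesforce extract into a typed, business-ready dataset.

with source as (

    select * from {{ source('raw_salesforce', 'opportunity') }}

),

renamed as (

    select
        id                             as opportunity_id,
        accountid                      as account_id,
        stagename                      as stage_name,
        amount::number(38, 2)          as amount_usd,
        closedate::date                as close_date
    from source
    where not coalesce(isdeleted, false)  -- drop Salesforce soft deletes

)

select * from renamed
```

Downstream marts and the semantic layer then build on models like this rather than on raw extracts.
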
What You Bring
Required
  • 5+ years of experience in analytics engineering, data engineering, or a related field
  • Expert SQL skills—you can write complex queries, optimize performance, and debug data issues without hand-holding
  • Strong experience with dbt—you understand the modeling patterns, testing framework, and deployment workflows (a sample singular test follows this list)
  • Experience with Snowflake or a comparable cloud data warehouse (BigQuery, Redshift, Databricks)
  • Proficiency with Python for data processing, automation, or analysis (vibe coding is acceptable)
  • Working knowledge of Git and CI/CD practices for analytics code
  • Demonstrated experience using AI coding assistants to accelerate development—or genuine curiosity and willingness to adopt them
  • Ability to work onsite in Louisville, KY 3-5 days per week (relocation assistance available)
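
For calibration on the dbt testing expectation: beyond schema tests, dbt supports singular tests—plain SQL files that fail the run if they return any rows. A sketch, reusing the hypothetical model from above:

```sql
-- tests/assert_closed_won_amounts_present.sql
-- Hypothetical dbt singular test: the run fails if any closed-won
-- opportunity has a missing or negative amount.

select
    opportunity_id,
    amount_usd
from {{ ref('stg_salesforce__opportunities') }}
where stage_name = 'Closed Won'
  and (amount_usd is null or amount_usd < 0)
```
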
Nice to Have
  • Experience in insurance, financial services, or annuities distribution
  • Experience with semantic layer tools (dbt Semantic Layer/MetricFlow, Cube, AtScale, Snowflake Semantic Views; a sketch follows this list)
  • Familiarity with Salesforce data models and integration patterns
  • Experience with modern BI tools like Sigma Computing, Hex, or Lightdash
  • Background building AI/ML-powered data products or preparing data for LLM applications, such as Snowflake Intelligence or Streamlit
  • Pipeline orchestration experience (Airflow, Dagster, Prefect, Matillion, Fivetran, Openflow)
  • Experience at an enterprise software company or SaaS platform—you understand how data teams support product and go-to-market at scale
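
Semantic-layer tools differ in syntax (MetricFlow and Cube use YAML-style configs; Snowflake Semantic Views use SQL DDL), but all centralize one governed definition of metrics and dimensions. As a tool-agnostic sketch in plain SQL, with hypothetical names throughout, the kind of definition a semantic layer would own looks like:

```sql
-- Tool-agnostic illustration of what a semantic layer governs: a single
-- shared definition of a metric, its grain, and its dimensions, instead of
-- every dashboard re-deriving them. All names are hypothetical.

create or replace view analytics.pipeline_by_month as
select
    date_trunc('month', close_date)  as metric_month,       -- time dimension
    stage_name,                                             -- categorical dimension
    count(distinct opportunity_id)   as opportunity_count,  -- metric
    sum(amount_usd)                  as total_amount_usd    -- metric
from analytics.stg_salesforce__opportunities
where close_date is not null
group by 1, 2
```

A real semantic layer generalizes this by letting BI tools and LLM interfaces compose these metrics and dimensions at query time rather than reading a fixed view.
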
What Success Looks Like

First 30 Days:
You've onboarded to our tech stack, understand our data sources and current models, and have shipped your first contribution to the codebase.

First 60 Days:
You've taken ownership of a domain area, identified gaps or…

To View & Apply for jobs on this site that accept applications from your location or country, tap the button below to make a Search.
(If this job is in fact in your jurisdiction, then you may be using a Proxy or VPN to access this site, and to progress further, you should change your connectivity to another mobile device or PC).
 
 
 
Search for further Jobs Here:
(Try combinations for better Results! Or enter less keywords for broader Results)
Location
Increase/decrease your Search Radius (miles)

Job Posting Language
Employment Category
Education (minimum level)
Filters
Education Level
Experience Level (years)
Posted in last:
Salary