
Senior Data Engineer, Business Intelligence

Job in Boston, Suffolk County, Massachusetts, 02298, USA
Listing for: Klaviyo Inc.
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Cloud Computing
Salary/Wage Range or Industry Benchmark: USD 60,000 to 80,000 per year
Job Description & How to Apply Below

Finance & Accounting

At Klaviyo, we value the unique backgrounds, experiences and perspectives each Klaviyo (we call ourselves Klaviyos) brings to our workplace each and every day. We believe everyone deserves a fair shot at success and appreciate the experiences each person brings beyond the traditional job requirements. If you’re a close but not exact match with the description, we hope you’ll still consider applying.

Want to learn more about life at Klaviyo? Visit our site to see how we empower creators to own their own destiny.

Data is at the heart of every decision made at Klaviyo, and we’re looking for a Senior Data Engineer to join our Business Intelligence (BI) team. BI at Klaviyo collaborates across all departments to provide a platform that powers all internal data, analytic, and reporting needs. Our mission is to champion data-driven value creation, and you will own creating and maintaining the internal data infrastructure that powers Klaviyo’s business.

This role in particular will significantly contribute to the infrastructure, pipelines, and security/compliance aspects of our internal analytics platform while driving architectural innovation and mentoring the team.

How You’ll Make a Difference

As a Senior Data Engineer, you will shape the scalability, reliability, and cost-efficiency of our data platform. You’ll lead architectural decisions, establish engineering best practices, and mentor other engineers while partnering closely with analytics, engineering, and business stakeholders.

Your work will directly influence data-driven decision-making across the organization by ensuring our data systems are performant, observable, and built to scale.

What You’ll Do (Responsibilities)

Accelerating Engineering with AI
  • Transform workflows by putting AI at the center, building smarter systems and ways of working from the ground up: for example, using AI to generate tests, detect anomalies, summarize data issues, or accelerate analysis.
  • Design, develop, and maintain scalable dbt models and pipelines, including advanced incremental and merge strategies.
  • Architect solutions for attribution models, event data pipelines, and analytics at scale.
  • Lead performance optimization efforts across Snowflake and related data systems.
  • Define and enforce best practices for query performance, warehouse management, and cost control.
  • Own end-to-end data pipelines, ensuring reliability, scalability, and observability.
  • Lead complex DAG orchestration with Airflow/MWAA.
  • Oversee Spark/EMR cluster management, job optimization, and large-scale backfills.
  • Implement monitoring, alerting, and automated recovery strategies for production systems.
  • Architect infrastructure-as-code solutions using Terraform for Snowflake and AWS resources.
  • Oversee integration of AWS services (S3, EMR, Secrets Manager, CloudWatch) into the data platform.
  • Guide CI/CD pipeline design and improvements using GitHub Actions and CodeBuild.
  • Promote containerization best practices with Docker for scalable deployments.
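As one hedged illustration of the monitoring-and-alerting work described above (this is not Klaviyo's actual tooling; the metric, function name, and threshold are illustrative assumptions), an automated anomaly check on pipeline metrics can start as simply as a z-score test over recent run durations:

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` as anomalous if it sits more than `z_threshold`
    standard deviations from the mean of `history` (e.g. recent run
    durations or row counts). Returns False when history is too short."""
    if len(history) < 2:
        return False
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

# Example: daily pipeline run times in minutes, one slow outlier.
runs = [12.0, 11.5, 12.3, 11.8, 12.1, 12.0]
print(is_anomalous(runs, 12.2))  # False -- a typical run
print(is_anomalous(runs, 30.0))  # True -- likely worth an alert
```

In practice a check like this would feed an alerting channel rather than a print statement, and the threshold would be tuned per metric.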
Cost & Performance Management
  • Monitor Snowflake and EMR usage to proactively optimize costs.
  • Analyze query performance and warehouse efficiency.
  • Troubleshoot and resolve pipeline and infrastructure performance issues.
  • Mentor and coach junior and mid-level data engineers through code reviews and technical guidance.
  • Establish and enforce coding standards, testing practices, and CI/CD processes.
  • Serve as technical lead for cross-functional data initiatives.
  • Advocate for reliability, performance, and cost optimization across the data engineering function.
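To make the cost-monitoring bullets concrete, here is a minimal back-of-the-envelope sketch (not Klaviyo's pricing or tooling; the per-credit rate, warehouse names, and usage figures are illustrative assumptions) of estimating Snowflake spend from metered warehouse credits:

```python
def estimated_cost(credits_used, price_per_credit=3.0):
    """Rough Snowflake spend estimate: metered credits times the
    contracted per-credit rate (illustrative default of $3.00)."""
    return credits_used * price_per_credit

# Illustrative daily credit usage per warehouse.
daily_credits = {"transform_wh": 40.0, "reporting_wh": 12.5, "adhoc_wh": 5.5}
total = sum(estimated_cost(c) for c in daily_credits.values())
print(f"Estimated daily spend: ${total:.2f}")  # 58 credits x $3 = $174.00
```

A real pipeline would pull credit usage from Snowflake's metering views and alert on day-over-day jumps, but the arithmetic underneath is this simple.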
Who You Are (Qualifications)
  • 5+ years of data engineering experience, including demonstrated technical leadership.
  • Expert-level proficiency in dbt, including advanced modeling, testing frameworks, incremental strategies, and performance tuning.
  • Deep expertise in SQL and Snowflake, including query optimization, warehouse sizing, and cost governance.
  • Strong Python skills for data processing, API integrations, and internal tooling.
  • Experience architecting data lakehouse solutions.
  • Hands-on experience designing and operating Apache Iceberg-based data lake architectures on Amazon EMR.
  • Proven experience operating production…