
Data Engineer, Data and Science

Job in Seattle, King County, Washington, 98127, USA
Listing for: Hard Yaka
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, AI Engineer
Salary/Wage Range or Industry Benchmark: USD 80,000 - 100,000 per year
Job Description & How to Apply Below

Aircall is a unicorn: an AI-powered customer communications platform used by 22,000+ companies worldwide to drive revenue, resolve issues faster, and scale customer-facing teams. We’re redefining customer communications by bringing voice, SMS, WhatsApp, and AI together into one seamless workspace.

Our momentum comes from a simple idea: help teams work smarter, not harder. Aircall’s AI Voice Agent automates routine calls, AI Assist streamlines post-call work, and AI Assist Pro delivers real-time guidance so people can do their best work. The result is higher revenue, faster resolutions, and teams that scale with confidence.

Aircall is headquartered in Paris, our European HQ, with a strong North American presence anchored in Seattle, our North American HQ, and teams across Madrid, London, Berlin, San Francisco, New York City, Sydney, and Mexico City. We’ve built a product customers love and a business that’s scaling quickly, backed by world-class investors and driven by rapid AI innovation across multiple product lines.

At Aircall, you’ll join a company in motion. We’re ambitious, product-driven, and execution-focused, with visible impact, fast decisions, and real growth.

How we work at Aircall:

We’re customer-obsessed, data-driven, and focused on delivering meaningful outcomes. We value ownership, continuous learning, and thoughtful speed. If you thrive in a collaborative, fast-moving environment where trust and impact matter, you’ll feel at home here.

About the role

The Data Engineering team at Aircall works on providing high-quality, reliable, and actionable data. As an AI-first data team, we are currently in a pivotal transition to build a robust semantic layer that will power our AI-first data platform, enabling analytics at speed and democratizing intelligent insights across the company. Some of the key problems we are currently solving include taking charge of data reliability, integrating new sources for raw data ingestion, and building sophisticated data models to power real-time dashboards and predictive analytics.

In this role, you will be instrumental in building new datasets for high-impact use cases such as churn prediction and feature adoption, while owning the end-to-end reliability and scalability of our data pipelines. You will work closely with Product and GTM business teams, sitting at the heart of a larger data organization alongside Data Science, Analytics, and Applied Scientists to bridge the gap between raw data and AI-driven decision-making.

  • Design, build, and maintain the core data infrastructure that allows Aircall to support our many data use cases.
  • Enhance the data stack, lineage monitoring, and alerting to prevent incidents and improve data quality.
  • Implement best practices for data management, storage, and security to ensure data integrity and compliance with regulations.
  • Own the core company data pipeline, converting business needs into efficient and reliable data pipelines.
  • Participate in code reviews to ensure code quality and share knowledge.
  • Lead efforts to evaluate and integrate new technologies and tools to enhance our data infrastructure.
  • Define and manage evolving data models and data schemas. Manage SLA for data sets that power our company metrics.
  • Collaborate with applied scientists, data scientists, analysts, and other business stakeholders to drive efficiencies for their work, supporting complex data processing, storage, and orchestration.
A little more about you:
  • Bachelor's degree or higher in Computer Science, Engineering, or a related field.
  • 3+ years of experience in data engineering, with a strong focus on designing and building data pipelines and infrastructure.
  • Proficient in SQL and Python, with the ability to translate complexity into efficient code.
  • Experience with data workflow development and management tools (dbt, Airflow).
  • Solid understanding of distributed computing principles and experience with cloud-based data platforms such as AWS, GCP, or Azure.
  • Strong analytical and problem-solving skills, with the ability to effectively troubleshoot complex data issues.
  • Excellent communication and collaboration skills, with the ability to work effectively in a…