Senior Data Engineer
Listed on 2026-01-12
IT/Tech
Data Engineer, Cloud Computing
Airspace is a tech-enabled freight forwarder that’s redefining how the world’s most critical packages are delivered. Headquartered in Carlsbad, California, Airspace has employees who are based around the world. Our European headquarters is in Amsterdam, The Netherlands.
As a recognized leader in AI and machine learning, our team leverages data and patented technology to coordinate logistics across a global network of drivers and airlines. Our goal is to deliver those packages that are truly mission-critical in a way that is faster, more transparent, more secure, and more accountable than ever before. The items we deliver range from organs for transplant, to parts for critical machinery including grounded aircraft and highly sensitive components such as semiconductors.
The company is growing rapidly and serving more places around the world than ever before. We are looking for passionate, motivated individuals who want to make an IMPACT every day to help us execute on our mission of reshaping the world of time-critical logistics.
About the Role
As Senior Data Engineer, you will own all data infrastructure and ETL processes and will be responsible for ensuring our data pipelines are performant, scalable, and robust, and that they support our business needs both today and as we grow.
This role is not just about building great infrastructure — it’s about being a great teammate. We’re looking for someone who brings technical excellence and positive energy to the team, who can lead without ego, teach without condescension, and collaborate with humility and clarity.
You’ll be the technical thought leader for data engineering: advising on architectural decisions, executing high-impact refactors, and setting the roadmap for how data flows through our systems. This is a high-impact, high-autonomy role reporting directly to the Director of Data Science and AI, with the opportunity to shape the future of our data platform.
What You’ll Do
- Own and evolve our data infrastructure and pipelines
- Design, build, and maintain reliable ETL pipelines that ingest data from internal application Postgres databases and external SaaS platforms (e.g., Salesforce, Twilio, Zendesk)
- Manage and scale our orchestration layer built in Airflow
- Ensure reliability, consistency, and performance across our data systems through strong engineering practices and operational discipline
- Oversee and optimize how data flows into and through our warehouse layer (Snowflake), ensuring transformations (via dbt) integrate cleanly into upstream and downstream systems
- Be a technical leader and strategic architect
- Set a high bar for technical excellence, promoting clean architecture, code quality, and maintainability through thoughtful design, hands-on coding, and code reviews
- Drive architectural improvements and lead system-level refactors to improve performance, scalability, and efficiency
- Continuously evaluate and recommend tools, frameworks, and methodologies that enhance platform capabilities and developer velocity
- Level up our data function across the full lifecycle
- Act as a trusted partner to stakeholders in Product, Data Science, and Infrastructure, helping shape technical roadmaps that align with business priorities
- Identify and resolve systemic bottlenecks in our data workflows, and proactively implement process improvements
- Mentor teammates and foster a collaborative, inclusive, and high-performing engineering culture
What You’ll Bring
- 6+ years of experience in data engineering or backend infrastructure roles
- Deep knowledge of Airflow (familiarity with Astronomer or Google Cloud Composer is a plus)
- Experience managing and optimizing data infrastructure built on Snowflake (preferred) or BigQuery, including warehouse design, performance tuning, and cost efficiency
- Familiarity with dbt Core, including how it fits into modern data pipelines and transformation layers
- Solid understanding of SQL (especially in the context of large-scale warehouse environments)
- Proficiency in Git, version control, and collaborative development workflows
- Comfort working with CI/CD pipelines and deployment automation
- Experience…