
Data Engineer

Job in New York City, Richmond County, New York, 10261, USA
Listing for: TaskRabbit
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description
Position: Staff Data Engineer

Taskrabbit is a marketplace platform that conveniently connects people with Taskers to handle everyday home to-do’s, such as furniture assembly, handyman work, moving help, and much more.

At Taskrabbit, we want to transform lives one task at a time. As a company, we celebrate innovation, inclusion, and hard work. Our culture is collaborative, pragmatic, and fast-paced. We’re looking for talented, entrepreneurially minded, and data-driven people who also have a passion for helping people do what they love.

Together with IKEA, we’re creating more opportunities for people to earn a consistent, meaningful income on their own terms by building lasting relationships with clients in communities around the world.

  • Taskrabbit is a remote-first company with employees distributed across the US and EU
  • A 5-time Best Place to Work in 2022 by Built In, including Best Companies in SF, Best Mid-Sized Companies, and Best Benefits
  • Data Bird journal’s “Best Places”: #1 Best Company for Diversity in 2019 and 2020
  • Data Bird journal’s “Best Places”: #4 Best Company for Women in 2019 and #1 in 2020

Join us at Taskrabbit, where your work will be meaningful, your ideas valued, and your potential unleashed!

About the Role

We are seeking a Staff Data Engineer to lead the design, development, and optimization of our data infrastructure and analytics layers, enabling the creation of reliable, scalable, and high-quality data products across the company. This role will report to the Director of Data Engineering and Applications and work closely with both technical teams and non-technical stakeholders across the business. While this is an individual contributor role, it emphasizes mentorship and strategic guidance across both data engineering and analytics engineering functions.

The ideal candidate has deep experience building and maintaining modern data platforms using tools such as dbt, Airflow, and Snowflake (or equivalent), and brings strong expertise in data modeling, orchestration, and production-grade data pipelines. They excel at engaging with non-technical stakeholders to understand business needs and are skilled at translating those needs into well-defined metrics, semantic models, and self-serve analytical tools.

They are comfortable shaping architectural direction, promoting best practices across the team, and thrive in environments that require cross-functional collaboration, clear communication, and a strong sense of ownership.

What You'll Work On:
  • Design, build, and maintain scalable, reliable data pipelines and infrastructure to support analytics, operations, and product use cases
  • Develop and evolve dbt models, semantic layers, and data marts that enable trustworthy, self-serve analytics across the business
  • Collaborate with non-technical stakeholders to deeply understand their business needs and translate them into well-defined metrics and analytical tools
  • Lead architectural decisions for our data platform, ensuring it is performant, maintainable, and aligned with future growth
  • Build and maintain data orchestration and transformation workflows using tools like Airflow, dbt, and Snowflake (or equivalent); see the sketch after this list
  • Champion data quality, documentation, and observability to ensure high trust in data across the organization
  • Mentor and guide other engineers and analysts, promoting best practices in both data engineering and analytics engineering disciplines
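
For illustration only, here is a minimal sketch of the kind of orchestration workflow referenced above: an Airflow DAG that runs dbt transformations and then tests them. It assumes Airflow 2.4+ and a dbt project already configured against the warehouse; the DAG name, schedule, and paths are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily workflow: build dbt models, then run dbt tests so that
    # downstream consumers only see data that passed quality checks.
    with DAG(
        dag_id="daily_dbt_transformations",   # illustrative name
        schedule="0 6 * * *",                 # once a day at 06:00 UTC (Airflow 2.4+ syntax)
        start_date=datetime(2025, 1, 1),
        catchup=False,
        tags=["analytics-engineering", "dbt"],
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )
        dbt_run >> dbt_test  # tests run only after the models build successfully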
Your Areas Of Expertise:
  • Expertise in building and maintaining ELT data pipelines using modern tools such as dbt, Airflow, and Fivetran
  • Deep experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift
  • Strong data modeling skills (e.g., dimensional modeling, star/snowflake schemas) to support both operational and analytical workloads; see the sketch after this list
  • Proficient in SQL and at least one general-purpose programming language (e.g., Python, Java, or Scala)
  • Experience with streaming data platforms (e.g., Kafka, Kinesis, or equivalent) and real-time data processing patterns
  • Familiarity with infrastructure-as-code tools like Terraform and DevOps practices for managing data platform components
  • Hands-on experience with BI and semantic layer tools such as Looker, Mode, Tableau, or equivalent
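
As a purely illustrative sketch of the dimensional modeling mentioned above: a small star schema with one fact table and two dimensions, created and queried through the Snowflake Python connector. All table names, columns, and connection parameters are hypothetical placeholders; the same pattern carries over to BigQuery or Redshift.

    import snowflake.connector

    # Hypothetical star schema: a bookings fact table surrounded by date and
    # tasker dimensions. Foreign-key constraints in Snowflake are informational only.
    DDL_STATEMENTS = [
        "CREATE TABLE IF NOT EXISTS dim_date (date_key INT PRIMARY KEY, calendar_date DATE)",
        "CREATE TABLE IF NOT EXISTS dim_tasker (tasker_key INT PRIMARY KEY, tasker_name STRING, city STRING)",
        """CREATE TABLE IF NOT EXISTS fct_bookings (
               booking_id INT,
               date_key   INT REFERENCES dim_date (date_key),
               tasker_key INT REFERENCES dim_tasker (tasker_key),
               amount_usd NUMBER(10, 2)
           )""",
    ]

    # A typical analytical query against the star: join the fact to its
    # dimensions and aggregate for reporting or a BI/semantic layer.
    REPORT_SQL = """
        SELECT d.calendar_date, t.city, SUM(f.amount_usd) AS revenue_usd
        FROM fct_bookings f
        JOIN dim_date   d ON f.date_key   = d.date_key
        JOIN dim_tasker t ON f.tasker_key = t.tasker_key
        GROUP BY d.calendar_date, t.city
        ORDER BY d.calendar_date
    """

    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        cur = conn.cursor()
        for statement in DDL_STATEMENTS:
            cur.execute(statement)
        for calendar_date, city, revenue_usd in cur.execute(REPORT_SQL):
            print(calendar_date, city, revenue_usd)
    finally:
        conn.close()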

At Taskrabbit, our…
