
Data Engineer III

Job in New York, New York County, New York, 10261, USA
Listing for: Blue Nile
Full Time position
Listed on 2026-02-23
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD yearly
Job Description
Location: New York

Overview

The Data Engineer III is responsible for building and maintaining robust data pipelines and models for the R2Net organization, overseeing a broad scope of ETL infrastructure and databases. By deploying expertise across many diverse systems, APIs, and platforms, this role plays a key part in enabling accurate and timely access to data across the organization, with an emphasis on simplifying and centralizing a complex ecosystem, thereby providing trustworthy, analytics-ready resources for the various business units.

By interfacing with key stakeholders in Engineering, Analytics, Operations, Finance, Marketing, and Customer Service, this role will build a data environment that is accurate, complete, timely, and dependable, and will serve as a trusted partner to key associates across the organization. Achieving these goals extends beyond pipeline management: it involves deep collaboration with business stakeholders, proactive engagement in data strategy, and a focus on driving measurable impact through data.

This engineer is expected to bridge technical execution with business outcomes, owning long-term initiatives that drive top-line and bottom-line growth for R2Net as a whole.

Key Responsibilities
  • Design, implement, and maintain complex data pipelines, ensuring scalability and reliability using Airflow, dbt, Rivery, Python, and SQL, enabling robust ingestion and transformation of structured and semi-structured data.
  • Serve as a strategic partner to business teams, working closely with stakeholders to translate high-level goals into data solutions that support forecasting, performance tracking, and optimization.
  • Develop and maintain clean, well-documented data models in Snowflake and BigQuery that support analytics, reporting, and operational workflows, and contribute to architecture decisions.
  • Integrate data from a variety of internal and external sources, including Google Analytics and third-party APIs, to support full-funnel visibility across departments.
  • Enable self-service analytics by ensuring data assets are discoverable and usable via tools such as Tableau, including thoughtful semantic layer design and performance tuning.
  • Contribute to the development of robust monitoring and observability practices for data quality and pipeline health.
  • Collaborate on architecture and design decisions, including cloud infrastructure and containerization using AWS, Pulumi, and Docker.
  • Maintain strong documentation and promote engineering standards that ensure transparency, maintainability, and reusability of data systems.
Required Qualifications
  • 7+ years of professional experience in data engineering, analytics engineering, or related roles.
  • Advanced proficiency in SQL and Python, with expertise in efficient query writing, data structures, and software engineering principles.
  • Hands-on experience with Snowflake and/or BigQuery, including data modeling and performance optimization.
  • Proficiency with orchestration tools (e.g., Airflow) and data integration tools like dbt.
  • Experience working with cloud platforms, especially AWS, for data storage, compute, and infrastructure management, including services such as AWS Batch, ECR, Lambda, and related tools.
  • Familiarity with data analytics and visualization tools, particularly Tableau, and the ability to support data consumers in building actionable dashboards.
  • Experience with marketing and product data sources, including Google Analytics and similar platforms.
  • Strong knowledge of ETL/ELT design and data warehousing solutions.
  • Familiarity with CI/CD pipelines and DevOps practices for data engineering.
  • Strong skills in Microsoft Suite for documentation and collaboration.
  • Robust experience with API design and integration.
  • Familiarity with Scrum development methodologies and tools like Jira.
Benefits
  • Paid Time Off
  • Medical, Dental, Vision and Prescription Insurance
  • 401(k) Retirement Plan with Company Match
  • Flexible Spending Account | Health Savings Account
  • Tuition Reimbursement
  • Employee Discount
  • Parental Leave
  • Life Insurance

At this time, R2Net will not sponsor a new applicant for employment authorization for this position.

Equal Opportunity and…