Data Engineer

Job in Los Angeles, Los Angeles County, California, 90079, USA
Listing for: UpRecruit
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 60,000 – 80,000 USD per year
Job Description & How to Apply Below

The Role

Our client is seeking a Data Pipeline Engineer to own and enhance the core data infrastructure that powers their product ecosystem. This role is highly impactful — you will maintain ingestion pipelines, troubleshoot complex integration issues, optimize SQL workflows, and build reliable connections between internal systems and key third‑party SaaS platforms.

This position is ideal for a data engineer who loves solving messy data problems, improving reliability, and building clean, scalable data models. You’ll work closely with product leadership, engineering, and external tools to ensure the data foundation is robust and ready for high‑volume growth.

What You’ll Do
  • Maintain and improve data ingestion pipelines, including integrations built with Hotglue and Heroku
  • Troubleshoot and resolve schema mismatches, API limits, authentication errors, and connection issues
  • Build and optimize SQL‑based ETL/ELT workflows, transformations, and views in PostgreSQL (a brief sketch of this kind of workflow follows this list)
  • Manage staging datasets, including anonymization and synthetic data generation
  • Define and implement core customer‑facing metrics in partnership with product leadership
  • Develop and maintain third‑party SaaS integrations (HubSpot, QuickBooks, Asana, etc.)
  • Support lightweight DevOps tasks including CI/CD workflows, performance tuning, and monitoring
  • Ensure reliability and scalability across a multi‑tenant SaaS data architecture
  • Drive best practices for data quality, versioning, and pipeline observability
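
To make the ETL/ELT and metrics work above concrete, here is a minimal sketch of a SQL-driven transformation and reporting view run from Python with psycopg2. The table names, column names, and connection string are hypothetical assumptions for illustration only; they are not details taken from this posting or the client's schema.

    # Minimal ELT sketch: aggregate raw rows already ingested into PostgreSQL,
    # then expose a reporting view on top of the derived table.
    # Table/column names and the DSN are illustrative assumptions.
    import os
    import psycopg2

    TRANSFORM_SQL = """
    CREATE TABLE IF NOT EXISTS analytics_daily_revenue AS
    SELECT
        tenant_id,                                 -- multi-tenant key
        created_at::date           AS order_day,
        SUM(amount_cents) / 100.0  AS revenue_usd,
        COUNT(*)                   AS order_count
    FROM raw_orders
    GROUP BY tenant_id, created_at::date;
    """

    VIEW_SQL = """
    CREATE OR REPLACE VIEW v_revenue_last_30d AS
    SELECT tenant_id, order_day, revenue_usd
    FROM analytics_daily_revenue
    WHERE order_day >= CURRENT_DATE - INTERVAL '30 days';
    """

    def run_transform() -> None:
        """Run the transformation and view definition in one transaction."""
        conn = psycopg2.connect(os.environ["DATABASE_URL"])
        try:
            with conn, conn.cursor() as cur:   # commits on clean exit
                cur.execute(TRANSFORM_SQL)
                cur.execute(VIEW_SQL)
        finally:
            conn.close()

    if __name__ == "__main__":
        run_transform()
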
What We’re Looking For
  • 3–6+ years of experience in data engineering, integrations, or ETL‑focused roles
  • Deep SQL and PostgreSQL expertise (schema design, optimization, performance tuning)
  • Experience with ETL tools such as Hotglue, dbt, Airflow, Fivetran, or similar
  • Strong understanding of REST APIs, OAuth authentication, rate limiting, and webhook‑driven integrations (see the rate‑limit handling sketch after this list)
  • Familiarity with Git, GitHub Actions, and modern CI/CD workflows
  • Experience working with SaaS data models and multi‑tenant architectures
  • Strong problem‑solving skills and a collaborative, product‑oriented mindset
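
Because the role leans heavily on SaaS API integrations, here is a minimal sketch of a rate-limit-aware GET with an OAuth bearer token, using the requests library. The endpoint URL, environment variable, and retry policy are illustrative assumptions; each vendor (HubSpot, QuickBooks, Asana) documents its own limits and retry headers.

    # Minimal sketch of a rate-limit-aware GET against a REST API.
    # The endpoint and token variable are hypothetical placeholders.
    import os
    import time
    import requests

    API_URL = "https://api.example.com/v1/contacts"   # hypothetical endpoint
    TOKEN = os.environ["EXAMPLE_OAUTH_TOKEN"]          # OAuth bearer token

    def get_with_backoff(url: str, max_retries: int = 5) -> dict:
        """GET a JSON resource, backing off when the API signals 429."""
        headers = {"Authorization": f"Bearer {TOKEN}"}
        for attempt in range(max_retries):
            resp = requests.get(url, headers=headers, timeout=30)
            if resp.status_code == 429:
                # Honor Retry-After when present, otherwise back off exponentially.
                time.sleep(float(resp.headers.get("Retry-After", 2 ** attempt)))
                continue
            resp.raise_for_status()
            return resp.json()
        raise RuntimeError(f"Still rate limited after {max_retries} attempts: {url}")

    if __name__ == "__main__":
        print(get_with_backoff(API_URL))
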
Bonus Experience
  • Knowledge of SOC2/GDPR compliance or secure data‑handling practices
  • Experience generating synthetic datasets or anonymizing production data (see the anonymization sketch after this list)
  • Exposure to LLM/AI‑powered workflows or data enrichment processes
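
For the staging-data items above, here is a minimal sketch of pseudonymizing direct identifiers in a production extract before it reaches a staging environment. The column names, salting scheme, and file paths are assumptions for illustration; an actual SOC2/GDPR-aligned process would follow the client's own data-handling policy.

    # Minimal sketch: replace direct identifiers in a CSV extract with
    # salted hashes so staging data is not traceable to real users.
    # Column names and paths are illustrative assumptions.
    import csv
    import hashlib
    import os
    import secrets

    SALT = os.environ.get("ANON_SALT") or secrets.token_hex(16)

    def pseudonymize(value: str) -> str:
        """Map an identifier to a stable, salted token."""
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

    def anonymize_csv(src_path: str, dst_path: str) -> None:
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                row["email"] = pseudonymize(row["email"]) + "@example.test"
                row["full_name"] = pseudonymize(row["full_name"])
                writer.writerow(row)

    if __name__ == "__main__":
        anonymize_csv("prod_export.csv", "staging_export.csv")
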
Compensation

Full‑time | Remote | No C2C

Seniority Level

Mid‑Senior level

Employment Type

Full‑time

Job Function / Industries

IT System / Custom Software Development
