Data Engineer
Listed on 2026-01-12
IT/Tech
Data Engineer, Cloud Computing
The Role
Our client is seeking a Data Pipeline Engineer to own and enhance the core data infrastructure that powers their product ecosystem. This role is highly impactful: you will maintain ingestion pipelines, troubleshoot complex integration issues, optimize SQL workflows, and build reliable connections between internal systems and key third‑party SaaS platforms.
This position is ideal for a data engineer who loves solving messy data problems, improving reliability, and building clean, scalable data models. You’ll work closely with product leadership, engineering, and external tools to ensure the data foundation is robust and ready for high‑volume growth.
What You’ll Do
- Maintain and improve data ingestion pipelines, including integrations built with Hotglue and Heroku
- Troubleshoot and resolve schema mismatches, API limits, authentication errors, and connection issues
- Build and optimize SQL‑based ETL/ELT workflows, transformations, and views in PostgreSQL
- Manage staging datasets, including anonymization and synthetic data generation
- Define and implement core customer‑facing metrics in partnership with product leadership
- Develop and maintain third‑party SaaS integrations (HubSpot, QuickBooks, Asana, etc.)
- Support lightweight DevOps tasks including CI/CD workflows, performance tuning, and monitoring
- Ensure reliability and scalability across a multi‑tenant SaaS data architecture
- Drive best practices for data quality, versioning, and pipeline observability
What You’ll Bring
- 3–6+ years of experience in data engineering, integrations, or ETL‑focused roles
- Deep SQL and PostgreSQL expertise (schema design, optimization, performance tuning)
- Experience with ETL tools such as Hotglue, dbt, Airflow, Fivetran, or similar
- Strong understanding of REST APIs, OAuth authentication, rate limiting, and webhook‑driven integrations
- Familiarity with Git, GitHub Actions, and modern CI/CD workflows
- Experience working with SaaS data models and multi‑tenant architectures
- Strong problem‑solving skills and a collaborative, product‑oriented mindset
- Knowledge of SOC2/GDPR compliance or secure data‑handling practices
- Experience generating synthetic datasets or anonymizing production data
- Exposure to LLM/AI‑powered workflows or data enrichment processes
Full‑time | Remote | No C2C
Seniority Level
Mid‑Senior level
Employment Type
Full‑time
Job Function / Industries
IT System Custom Software Development