
Data Engineer

Job in San Francisco, San Francisco County, California, 94199, USA
Listing for: Cargomatic
Full Time position
Listed on 2026-02-28
Job specializations:
  • Software Development
    Data Engineer, AI Engineer
Salary/Wage Range: $140,000 – $160,000 USD per year

Senior Data Architect – Data Engineering

Location: San Francisco, CA

Reports To: VP of Engineering

FLSA Status: Exempt

Employment Type: Full-Time

Compensation: $140,000 – $160,000 annually (based on experience)

About Cargomatic

Cargomatic is transforming the local trucking industry with cutting‑edge technology that connects shippers and carriers in real time. Every product that humans build, grow, or sell has spent time on a truck. Local trucking is the lifeblood of every regional economy, yet this $82 billion industry still relies heavily on outdated systems. Cargomatic is bringing transparency, efficiency, and intelligence to local freight through modern technology and data‑driven solutions.

We are solving complex, real‑world logistics problems every day. If you thrive in a fast‑paced environment, enjoy building scalable systems, and want to help shape the future of AI‑powered logistics, we’d love to meet you.

Position Summary

Cargomatic is seeking a Senior Data Architect – Data Engineering to design and build scalable, cloud‑native data infrastructure that powers analytics, machine learning, and AI‑driven applications. This role combines deep data architecture expertise with hands‑on experience in modern data platforms and LLM‑enabled application development.

You will lead the design of enterprise‑grade data models, architect RAG systems, implement agentic workflows, and integrate secure, production‑ready LLM capabilities into our ecosystem. This is a high‑impact role with significant ownership, visibility, and opportunity to shape the future of intelligent logistics technology.

Key Responsibilities
  • Design and build scalable, cloud‑native data pipelines (batch and streaming) supporting analytics, ML, and AI‑powered applications
  • Architect enterprise‑grade data models across data lakes, warehouses, and real‑time systems (Snowflake, Databricks, Kafka, dbt)
  • Define standards for data governance, reliability, performance, and cost optimization
  • Optimize storage formats and distributed data systems (Parquet, Delta Lake, Iceberg)
AI & LLM‑Enabled Systems
  • Develop Retrieval‑Augmented Generation (RAG) systems integrating structured and unstructured enterprise data
  • Design and implement agentic workflows using frameworks such as LangChain, LangGraph, LlamaIndex, n8n, or similar
  • Integrate LLM APIs (OpenAI, Anthropic, or similar) into secure, production‑ready applications
  • Implement guardrails, validation layers, monitoring, and evaluation frameworks to mitigate hallucination, prompt injection, and data security risks
Backend & API Development
  • Build secure backend APIs (Python/FastAPI) to expose AI‑powered capabilities
  • Ensure observability, monitoring, and cost controls across AI and data services
  • Contribute to microservices architecture and distributed system design
Collaboration & Leadership
  • Partner cross‑functionally with Product, Engineering, and Operations to translate business requirements into scalable technical solutions
  • Mentor junior engineers and contribute to architectural standards and best practices
  • Drive innovation in data engineering and AI‑powered logistics systems
Qualifications
  • Bachelor’s degree in Computer Science or equivalent practical experience
  • 8+ years of software or data engineering experience in production environments
  • Strong expertise in data modeling, distributed systems, and scalable cloud architectures
  • Hands‑on experience with ETL/ELT frameworks and streaming technologies (Kafka, Spark, Hevo, Snowflake, dbt, etc.)
  • Advanced SQL skills and deep understanding of modern storage formats
  • Proficiency in Python and RESTful API development
  • Experience integrating LLM APIs into production applications
  • Strong understanding of system reliability, observability, and cost management in cloud environments
Desired Experience
  • Experience building RAG pipelines including embeddings, vector search, chunking strategies, and hybrid retrieval
  • Experience designing multi‑agent or agentic AI workflows with orchestration frameworks
  • Knowledge of LLM evaluation, monitoring, and tracing tools (LangSmith or similar)
  • Experience with microservices architecture and distributed system design
  • Exposure to transportation, logistics, or supply chain domains
  • Active GitHub contributions or demonstrated passion for emerging AI and data technologies
Why Join Cargomatic?
  • Medical, Dental, and Vision insurance
  • 401(k) with company match
  • Flexible Spending Accounts (FSA)
  • Company‑paid Life and Disability insurance
  • Flexible Paid Time Off (PTO) and company holidays
  • Paid Parental Leave
  • Employee Assistance Program (EAP)
  • Opportunity to build cutting‑edge AI solutions in a high‑growth logistics technology company
  • Collaborative, high‑impact team environment

Cargomatic is proud to be an Equal Opportunity Employer. We are committed to creating a diverse and inclusive workplace where all employees feel valued and empowered to succeed.
