
Senior Data Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Ho Chi Minh City Securities Corp
Full Time position
Listed on 2025-12-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing

JOB DESCRIPTION

Primary Objectives

Data engineers design and maintain the scalable data pipelines that power our CDP and CRM platforms, ensuring data quality, consistency, and availability across enterprise systems. They collaborate with cross-functional teams to deliver reliable, business-driven solutions, and develop and enhance the data marts (Customer Profile, Journey, Event) that form the CDP/CRM backbone.

Main responsibilities

  • Data Integration & Pipelines:
    Build and optimize batch and real-time ETL/ELT pipelines across transactional, marketing, CRM/CDP, and external API sources, ensuring scalability and cost efficiency (a minimal sketch follows this list).
  • Data Modeling & Architecture:
    Design CDP/CRM data models using medallion architecture, dimensional modeling, data vault, or data mesh; align with governance standards.
  • Operations & Optimization:
    Monitor pipelines, troubleshoot, automate alerts, and drive continuous performance improvements.
  • Others:
    Maintain documentation, conduct code reviews, and promote knowledge sharing. Support ad-hoc data needs and troubleshoot data issues. Research and adopt emerging data technologies to enhance platform capabilities.
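
For illustration, here is a minimal sketch of the kind of batch ELT step described above, assuming PySpark and a medallion (bronze/silver) layout; the bucket, paths, and column names are hypothetical, not details from this role.

```python
# Minimal medallion-style batch ELT sketch (PySpark). All paths and
# column names are hypothetical, for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("crm-events-elt").getOrCreate()

# Bronze: raw CRM events landed as JSON (hypothetical location).
raw = spark.read.json("s3://example-bucket/bronze/crm_events/")

# Silver: deduplicate and standardize before loading the Customer Event mart.
clean = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("customer_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/silver/crm_events/"
)
```

Overwriting partitioned Parquet keeps the sketch self-contained; on a Databricks-style platform, a Delta table with merge semantics would be the more typical target.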
JOB REQUIREMENTS

1. Education level

Bachelor’s degree in information technology, business, or a related field, or an equivalent combination of education and experience, is required.

2. Knowledge & Experience

  • 3+ years of experience as a Data Engineer with strong ETL/ELT pipeline development.
  • Advanced proficiency in SQL and data modeling for large-scale systems.
  • Hands-on experience with cloud data platforms (Databricks, BigQuery, Redshift, Snowflake, or Spark-based solutions).
  • Familiarity with CDP/CRM ecosystems and integration with marketing platforms (Facebook Ads, Google Analytics, Criteo, RTB House).
  • Practical experience with workflow orchestration (dbt, Airflow, or equivalent); a minimal Airflow example follows this list.
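
As one possible shape for the orchestration experience above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and task bodies are illustrative placeholders.

```python
# Minimal daily-pipeline DAG sketch (assumes Airflow 2.4+). The dag_id,
# schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull from source systems (placeholder)

def load():
    ...  # load into the warehouse (placeholder)

with DAG(
    dag_id="daily_crm_sync",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # extract must finish before load runs
```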

3. Technical skills

  • Advanced SQL (queries, tuning, optimization); strong Python (data processing, automation, APIs); Scala/Java for Spark is a plus.
  • Experience with Databricks, Spark, or similar; cloud data warehouses (Databricks, BigQuery, Synapse, Snowflake); orchestration tools (dbt, Airflow, etc.).
  • Building ETL/ELT pipelines for structured/unstructured data; real-time streaming (Kafka, Pub/Sub, Event Hub); CRM/CDP and API integrations.
  • Knowledge of lineage, catalog, and metadata management; hands-on experience with data-quality frameworks like Great Expectations or Soda (a hand-rolled equivalent is sketched after this list).
  • CI/CD for data engineering (Git, GitHub Actions, Azure DevOps); exposure to Docker/Kubernetes.
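
For flavor, here is a hand-rolled version of the assertions that frameworks like Great Expectations or Soda formalize, using pandas; the file path and column names are assumptions for illustration.

```python
# Hand-rolled data-quality checks of the kind Great Expectations/Soda
# formalize. The path and column names are hypothetical.
import pandas as pd

df = pd.read_parquet("silver/crm_events/")  # hypothetical dataset

checks = {
    "event_id is unique": df["event_id"].is_unique,
    "customer_id has no nulls": bool(df["customer_id"].notna().all()),
    "event_ts is a datetime column": pd.api.types.is_datetime64_any_dtype(df["event_ts"]),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```

In practice these checks would run as a step in the orchestrator (e.g., an Airflow task) so that bad data fails the pipeline before reaching the marts.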

4. Soft skills

  • Effective communicator, able to bridge technical and business stakeholders.
  • Analytical mindset with strong troubleshooting skills; proactive in identifying risks, gaps, and proposing solutions.
  • Curious and open to new technologies, eager to adopt best practices and experiment with new tools for efficiency.