Data Engineer; Snowflake
Listed on 2026-02-01
IT/Tech
Data Engineer, Data Analyst
Why Valtech? We’re advisors, visionaries, creatives and techies. We embrace all things digital. We talk to each other. We have fun. We love our clients. We’re looking ahead. We are global.
We’re the experience innovation company - a trusted partner to the world’s most recognized brands. To our people we offer growth opportunities, a values-driven culture, international careers and the chance to shape the future of experience.
The Opportunity
At Valtech, you’ll find an environment designed for continuous learning, meaningful impact, and professional growth. Whether you’re pioneering new digital solutions, challenging conventional thinking or building the next generation of customer experiences, your work will help transform industries.
As a Data Engineer you are passionate about experience innovation and eager to push the boundaries of what’s possible. You bring 4+ years of experience, a growth mindset and a drive to make a lasting impact.
You will thrive in this role if you are:
- A curious problem solver who challenges the status quo
- A collaborator who values teamwork and knowledge-sharing
- Excited by the intersection of technology, creativity and data
- Experienced in Agile methodologies and consulting (a plus)
What you will do:
- Design and implement scalable, secure Snowflake data warehouses (multi-environment setups, RBAC/ABAC, masking and row access policies).
- Architect ELT pipelines using Fivetran/Stitch/Singer; extend with custom connectors when required.
- Build and maintain dbt-based transformation layers (staging → marts → semantic models) with modular, test-driven designs.
- Implement orchestration using Airflow/Prefect/Dagster for scheduling, dependencies, and retries.
- Apply dimensional modeling, Data Vault, and star schema best practices for analytics at scale.
- Optimize Snowflake performance: warehouse sizing, clustering/partitioning strategies, query tuning, caching, and cost controls (Resource Monitors, Query Acceleration).
- Leverage advanced Snowflake features (Tasks, Streams, Time Travel, Data Sharing, External Tables/Iceberg, Snowpipe, object tagging).
- Implement robust testing (dbt tests, Great Expectations) and proactive data validation frameworks.
- Establish and enforce governance: data lineage, cataloging, PII classification, policies, RBAC, auditability.
- Maintain documentation and lineage in tools like Alation/Collibra/Data Hub; automate as part of CI/CD.
- Set up CI/CD for dbt and data artifacts (Git-based workflows, environments, automated tests, PR checks).
- Build Python utilities for ELT automation, data processing, backfills, and metadata operations.
- Implement observability using Monte Carlo/Datafold (freshness, volume, schema change, anomaly monitoring).
- Partner with analytics/data science teams to enable governed self-service (semantic/metric layers, certified datasets).
- Translate stakeholder requirements into scalable data models and SLAs.
- Champion DataOps best practices across teams and lead knowledge-sharing sessions.
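To give a flavor of the Python utility work described above (ELT automation and backfills), here is a minimal, stdlib-only sketch; the function name and batching scheme are illustrative, not part of Valtech's actual codebase:

```python
from datetime import date, timedelta


def backfill_windows(start: date, end: date, days_per_batch: int = 7):
    """Split the half-open range [start, end) into contiguous date windows.

    Each window can drive one idempotent load (e.g. a MERGE or an
    INSERT OVERWRITE of a partition), so a failed backfill can be
    retried per window without duplicating rows.
    """
    if days_per_batch < 1:
        raise ValueError("days_per_batch must be >= 1")
    cursor = start
    while cursor < end:
        batch_end = min(cursor + timedelta(days=days_per_batch), end)
        yield cursor, batch_end
        cursor = batch_end


windows = list(backfill_windows(date(2024, 1, 1), date(2024, 1, 20)))
# Yields three windows: Jan 1-8, Jan 8-15, Jan 15-20.
```

In practice each window would be passed to the orchestrator (Airflow, Prefect or Dagster) as one task run, keeping retries cheap and isolated.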
To be considered for this role, you must meet the following essential qualifications:
- Education: a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field is typically required.
- Experience: 4–7 years in data engineering/analytics engineering with a cloud data warehousing focus.
- Deep expertise in Snowflake: architecture, security (RBAC, masking, row access), performance tuning, cost optimization, and workload management.
- Advanced dbt: macros, packages, refactoring layered models, exposures, tests (unique, relationships, custom), and best practices.
- Strong ELT experience with Fivetran/Stitch/Singer; ability to build custom connectors if needed.
- Excellent SQL and data modeling skills (dimensional modeling, star schema, Data Vault).
- Proficient in Python for data processing, scripting, and automation.
- Experience with Airflow/Prefect/Dagster for orchestration.
- Solid CI/CD knowledge: Git workflows, environment promotion, automated testing.
- Experience on at least one cloud (AWS/GCP/Azure) and its native data services (e.g., S3/GCS/ADLS, IAM, secrets).
- D…
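The testing and data-validation qualifications above (dbt tests, Great Expectations) boil down to checks like the following dependency-free sketch; the sample rows and column names are made up for illustration:

```python
def check_unique(rows, key):
    """dbt-style 'unique' test: return key values that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row[key]
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)


def check_not_null(rows, key):
    """dbt-style 'not_null' test: count rows where the column is missing."""
    return sum(1 for row in rows if row.get(key) is None)


orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},  # duplicate order_id
]
duplicate_ids = check_unique(orders, "order_id")   # [2]
null_customers = check_not_null(orders, "customer_id")  # 1
```

In a real pipeline these assertions live as declarative dbt schema tests or Great Expectations suites and gate promotion between environments.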