Technical Lead (USA)
Job in Dallas, Dallas County, Texas, 75201, USA
Listed on 2026-03-06
Listing for: Varite, Inc
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Warehousing
Job Description & How to Apply Below
GBaMS ReqID:
Snowflake Tech Lead
Job Description:
Architecture & Design
• Design and implement end-to-end Snowflake-based data architectures for analytics, reporting, and advanced data use cases
• Define data modeling strategies (dimensional, data vault, and analytical models) optimized for Snowflake
• Establish standards for data ingestion, transformation, storage, and consumption
Snowflake Platform Management
• Architect and manage Snowflake features including Warehouses, Databases, Schemas, Cloning, Time Travel, Secure Data Sharing, Data Clean Rooms, and Resource Monitors
• Optimize performance and cost using warehouse sizing, clustering, caching, and query optimization
• Implement security best practices including RBAC, masking policies, row access policies, and data governance
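As an illustration only (not part of the listing), the masking-policy idea referenced above can be sketched in pure Python: the same lookup returns masked or clear values depending on the caller's role, mirroring a `CASE WHEN CURRENT_ROLE() ...` masking policy in Snowflake SQL. All names here (`mask_email`, `apply_masking`, the `PII_ADMIN` role) are hypothetical.

```python
# Pure-Python sketch of role-based column masking, hypothetical names.
def mask_email(value: str) -> str:
    # Keep the first character of the local part, mask the rest.
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain

def apply_masking(rows: list[dict], role: str) -> list[dict]:
    # A privileged role sees clear text; everyone else sees masked emails,
    # analogous to a Snowflake masking policy attached to the column.
    if role == "PII_ADMIN":
        return rows
    return [{**r, "email": mask_email(r["email"])} for r in rows]

rows = [{"email": "jane.doe@example.com"}]
masked = apply_masking(rows, role="ANALYST")
# masked[0]["email"] is "j***@example.com"
```

In Snowflake itself this logic would live in a masking policy object applied to the column, so every query path enforces it centrally rather than in application code.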
Data Transformation & ETL/ELT
• Lead ELT pipeline development using DBT (models, macros, tests, documentation, and deployments)
• Design and implement ETL/ELT pipelines using cloud-native Snowpark and third-party tools, covering both real-time streaming and batch data processing
• Ensure data quality, lineage, and observability across pipelines
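For illustration (not from the listing), the data-quality responsibility above can be sketched as a lightweight gate a pipeline might run before loading a batch into Snowflake. The names (`check_batch`, `QualityReport`) and the example columns are hypothetical.

```python
# Minimal data-quality gate for a batch of records, hypothetical names.
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    row_count: int
    null_columns: list = field(default_factory=list)

    @property
    def passed(self) -> bool:
        # A batch passes if it is non-empty and no required column is all-null.
        return self.row_count > 0 and not self.null_columns

def check_batch(rows: list[dict], required: list[str]) -> QualityReport:
    """Flag any required column that is null in every row of the batch."""
    null_cols = [
        col for col in required
        if all(row.get(col) is None for row in rows)
    ]
    return QualityReport(row_count=len(rows), null_columns=null_cols)

batch = [
    {"order_id": 1, "amount": 19.99, "region": None},
    {"order_id": 2, "amount": 5.00, "region": None},
]
report = check_batch(batch, required=["order_id", "amount", "region"])
# "region" is null in every row, so report.passed is False.
```

In practice checks like this are usually expressed as DBT tests or framework-level expectations so that failures surface in lineage and observability tooling rather than in ad hoc scripts.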
Cloud & Big Data Integration
• Architect solutions leveraging cloud data services (AWS, Azure, or GCP) such as object storage, messaging, and orchestration services
• Integrate Apache Spark (Databricks or equivalent) for large-scale data processing and advanced transformations
• Support hybrid and multi-cloud data architectures
Development & Automation
• Develop data processing and automation solutions using Python
• Build reusable frameworks for ingestion, transformation, validation, and monitoring
• Implement CI/CD pipelines for data workloads and DBT, Snowpark deployments
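As a sketch of the "reusable frameworks" bullet above (illustrative only, hypothetical names throughout), small transformation steps can be registered once and composed per dataset:

```python
# Minimal reusable pipeline framework: steps register via a decorator
# and run in order. All names here are hypothetical.
from typing import Callable

Step = Callable[[list[dict]], list[dict]]

class Pipeline:
    def __init__(self) -> None:
        self.steps: list[Step] = []

    def step(self, fn: Step) -> Step:
        # Decorator: record the transformation in registration order.
        self.steps.append(fn)
        return fn

    def run(self, rows: list[dict]) -> list[dict]:
        for fn in self.steps:
            rows = fn(rows)
        return rows

pipeline = Pipeline()

@pipeline.step
def drop_incomplete(rows):
    # Validation step: discard rows missing a required value.
    return [r for r in rows if r.get("amount") is not None]

@pipeline.step
def tag_source(rows):
    # Enrichment step: stamp each row with its source system.
    return [{**r, "source": "orders_feed"} for r in rows]

result = pipeline.run([{"amount": 10}, {"amount": None}])
# result == [{"amount": 10, "source": "orders_feed"}]
```

The same registration pattern scales to Snowpark DataFrames or DBT model layers; the value is that ingestion, validation, and monitoring steps are defined once and reused across pipelines.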
Leadership & Collaboration
• Partner with business stakeholders, analytics, and data science teams to translate requirements into scalable solutions
• Mentor data engineers and analysts on Snowflake, DBT, Snowpark and data engineering best practices
• Provide architectural guidance, documentation, and design reviews
Required Skills
• Strong hands-on experience with Snowflake architecture and performance tuning
• Expertise in DBT (models, testing, macros, documentation, environments)
• Solid experience with ETL/ELT frameworks and data integration patterns
• Proficiency in Python for data engineering and automation
• Experience with Snowpark Implementation
• Strong knowledge of cloud data services (AWS, Azure, or GCP)
• Advanced SQL and data modeling skills
Skills:
- Digital: Amazon Web Services (AWS) Cloud Computing
- Digital: Snowflake
- Digital: PySpark
Experience Required: 8-10 years
Skills:
Skill Category: Test1 _MN
Name: Digital: Amazon Web Services (AWS) Cloud Computing
Required: Yes
Importance: 1
Experience: 7+ years