Snowflake Data Architect
Job in North Charleston, Charleston County, South Carolina, 29405, USA
Listed on 2026-03-13
Listing for: The Value Maximizer
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Warehousing
Job Description
Job Summary
We are seeking an experienced Snowflake Data Architect with strong expertise in the Snowflake data platform and Business Intelligence (BI) reporting. The ideal candidate will be responsible for designing scalable data architectures, optimizing Snowflake data warehouses, and enabling high-quality analytics and reporting solutions. This role requires strong experience in data modeling, data pipelines, performance optimization, and collaboration with BI teams to deliver actionable insights.
Key Responsibilities
- Design and implement scalable data architecture using Snowflake to support enterprise analytics and reporting needs
- Develop and optimize Snowflake data warehouse solutions, including schema design, performance tuning, and cost optimization
- Build and manage data pipelines and ELT/ETL processes to ingest and transform data from multiple sources into Snowflake
- Design data models (Star/Snowflake schema) to support reporting and analytics use cases
- Collaborate with BI teams to develop dashboards and reports using tools such as Tableau, Power BI, or similar platforms
- Ensure data quality, governance, and security across the data platform
- Optimize query performance and data storage strategies within Snowflake
- Work with cross-functional teams including Data Engineers, BI Developers, and Business Analysts to understand reporting requirements
- Implement best practices for data integration, data transformation, and reporting frameworks
- Support data migration and modernization initiatives from legacy data warehouses to Snowflake
Required Skills & Qualifications
- Strong experience in Snowflake Data Warehouse architecture and administration
- Hands‑on experience with SQL and performance tuning in Snowflake
- Experience building ETL/ELT pipelines using tools such as Informatica, dbt, Talend, or similar
- Strong experience with BI reporting tools such as Tableau, Power BI, Looker, or similar platforms
- Expertise in data modeling and dimensional modeling techniques
- Experience with cloud platforms such as AWS, Azure, or GCP
- Strong understanding of data governance, security, and best practices for data architecture
- Experience working in large‑scale data warehouse environments
- Knowledge of data lake architecture and modern data stack tools
- Experience with Python or scripting languages for data processing
- Strong communication and stakeholder management skills