
Snowflake Data Architect-Snowflake Data Modeling

Job in Baltimore, Anne Arundel County, Maryland, 21276, USA
Listing for: Genesis NGN Inc.
Full Time position
Listed on 2026-01-11
Job specializations:
  • IT/Tech
    AI Engineer, Data Engineer
Job Description & How to Apply Below

Client Visa Sponsorship

The client is not sponsoring any visas.

Must Have Technical/Functional Skills
  • Snowflake expertise:
    Warehouses, databases, roles, RBAC, SCIM, MFA.
  • Data Engineering: ELT/ETL tools (dbt, Talend), orchestration (Airflow).
  • Cloud Platforms: AWS, Azure, or Google Cloud Platform with Snowflake integration.
  • Programming: SQL, Python; familiarity with ML frameworks.
  • Security & Compliance:
    Data masking, encryption, audit processes.
  • Strong experience with LLMs (OpenAI, Anthropic, Hugging Face, LangChain).
  • Proficiency in Python and modern AI frameworks.
  • Familiarity with vector databases, prompt engineering, and AI best practices.
  • 5 years of product-focused engineering experience.
  • Knowledge of cloud deployment and scaling AI systems.
  • Strong SQL and Python skills.
  • Hands‑on experience with dbt and Snowflake.
  • Familiarity with cloud platforms (AWS).
  • Knowledge of CI/CD, DevOps practices, and data orchestration tools (Airflow, Prefect).
  • Ability to create lineage graphs, documentation, and validation frameworks.
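The lineage and validation skills above can be illustrated with a short sketch. Everything here (the model names and the dependency map) is hypothetical and not from the posting; it shows a dbt-style lineage graph where each model lists the upstream models it references, built with the standard-library `graphlib` module.

```python
# Minimal sketch of a dbt-style lineage graph: each model maps to the set of
# upstream models it ref()s. All model names are hypothetical examples.
from graphlib import TopologicalSorter

lineage = {
    "stg_orders": set(),
    "stg_customers": set(),
    "fct_orders": {"stg_orders", "stg_customers"},
    "rpt_revenue": {"fct_orders"},
}

def build_order(graph):
    """Return models in dependency order (upstream first), as a run would execute them."""
    return list(TopologicalSorter(graph).static_order())

def downstream_of(graph, model):
    """Models affected if `model` changes -- a simple lineage/impact query."""
    hit = set()
    for m in build_order(graph):          # upstream-first, so transitive hits propagate
        deps = graph.get(m, set())
        if model in deps or deps & hit:
            hit.add(m)
    return hit
```

For example, `downstream_of(lineage, "stg_orders")` reports that both `fct_orders` and `rpt_revenue` would need to be rebuilt, which is the kind of impact analysis a lineage graph supports.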
Must Have Skills

Snowflake, Cortex AI, AWS, dbt

Roles & Responsibilities
  • Design and manage data pipelines using dbt, Airflow, and CI/CD frameworks.
  • Implement Snowpipe for continuous ingestion and Streams & Tasks for real-time processing.
  • Enable AI/ML integration:
    Support predictive analytics and generative AI use cases.
  • Leverage Snowflake Cortex and Copilot for LLM-based applications.
  • Ensure data governance, RBAC, and security compliance across Snowflake environments.
  • Optimize performance and implement Time Travel, Zero-Copy Cloning, and Secure Data Sharing.
  • Build production‑ready AI applications and LLM‑powered features.
  • Collaborate with AI Data teams to develop agentic AI workflows.
  • Experiment with open‑source models and translate prototypes into production systems.
  • Implement RAG pipelines, fine‑tuning, and observability for AI models.
  • Design and deploy secure, scalable, and highly available architectures on AWS.
  • Select appropriate AWS services for application design and deployment.
  • Implement cost‑control strategies and disaster recovery plans.
  • Collaborate with teams to integrate systems and ensure compliance.
  • Develop Infrastructure as Code (IaC) using Terraform or CloudFormation.
  • Design, build, and maintain data pipelines using dbt for analytics and operational use cases.
  • Implement standards for data quality, consistency, and reliability. Optimize query performance and manage compute costs.
  • Collaborate with analysts and stakeholders to understand data requirements.
  • Build automation into workflows and ensure compliance with governance policies.
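The RAG-pipeline responsibility above can be sketched in miniature. A real system would use an embedding model and a vector database; this toy version stands in "embeddings" with bag-of-words vectors and cosine similarity, purely to illustrate the retrieve-then-augment flow. All document text is made up.

```python
# Toy sketch of the retrieval step of a RAG pipeline: rank candidate documents
# against a query, then assemble the augmented prompt an LLM call would receive.
import math
from collections import Counter

def embed(text):
    """Hypothetical stand-in for an embedding model: a word-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    """Assemble the augmented prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(query, documents, k=1))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

Swapping `embed` for a real embedding model and `retrieve` for a vector-database query is what turns this sketch into a production pipeline; observability and fine-tuning would layer on top.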
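The data-quality responsibility above can likewise be sketched as a tiny validation framework in the spirit of dbt tests: each check takes rows and returns the failing ones. Column names and data are illustrative, not from the posting.

```python
# Small sketch of a validation framework: checks that return failing rows,
# plus a runner that summarizes failure counts per named check.

def not_null(rows, column):
    """Rows where the column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Rows whose column value duplicates an earlier row's value."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

def run_checks(rows, checks):
    """Run (name, check_fn, column) tuples; return {name: failure_count}."""
    return {name: len(check(rows, col)) for name, check, col in checks}
```

In practice these checks would compile to SQL pushed down to the warehouse (as dbt's `not_null` and `unique` tests do) rather than run over rows in Python, but the contract is the same: named checks, counted failures, automated in CI.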