Description
Founded in 2018, Giggso is a responsible AI platform for enterprise operations with security and automations.
Role Description
We are seeking a visionary and highly technical Lead Data Architect to design and build scalable data solutions and support customer needs in Dallas, Texas.
This role is pivotal in bridging the gap between complex business requirements and scalable, high-performance technical solutions.
The ideal candidate will be a master of modern cloud data stacks, specifically Snowflake and Databricks, with a unique ability to handle complex geospatial datasets while maintaining rigorous regulatory compliance standards.
Key Responsibilities
Architecture Strategy
Define the long-term architectural roadmap for our data ecosystem, ensuring it supports scalable growth and advanced analytics.
Platform Integration
Design and implement seamless integrations between Snowflake (for warehousing) and Databricks (for lakehouse/AI workloads), leveraging the best of both platforms.
Geospatial Implementation
Architect pipelines and storage strategies for large-scale geospatial data, ensuring high-performance spatial querying and visualization capabilities.
Big Data Engineering
Oversee the development of robust ETL/ELT pipelines using Apache Spark, Delta Lake, and Hadoop ecosystems.
Governance & Compliance
Act as the primary architect for data privacy and security, ensuring all designs comply with GDPR, CCPA, or other relevant industry-specific regulations.
Data Modeling
Create and maintain enterprise-level conceptual, logical, and physical data models tailored for both structured and unstructured data.
Technical Requirements
Category - Required Expertise
Cloud Data Platforms - Advanced proficiency in Snowflake (Snowpark, Streams, Tasks) and Databricks (Unity Catalog, Delta Live Tables).
Big Data Stack - Deep experience with Apache Spark, Hadoop, and distributed computing principles.
Geospatial Tools - Experience with PostGIS, Esri, or Snowflake/Databricks native spatial functions and H3 indexing.
Languages - Expert-level SQL and Python (specifically PySpark).
Data Governance - Hands-on experience with data cataloging, lineage tools, and building Compliance by Design frameworks.