
Lead Data Architect - Big Data Tools

Job in 600001, Chennai, Tamil Nadu, India
Listing for: Giggso
Full Time position
Listed on 2026-02-08
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Warehousing
Job Description

Founded in 2018, Giggso is a responsible AI platform for enterprise operations with security and automation.

Role Description

We are seeking a visionary and highly technical Lead Data Architect to design and build our data platform and support customer needs in Dallas, Texas.

This role is pivotal in bridging the gap between complex business requirements and scalable, high-performance technical solutions.

The ideal candidate will be a master of modern cloud data stacks, specifically Snowflake and Databricks, with a unique ability to handle complex geospatial datasets while maintaining rigorous regulatory compliance standards.

Key Responsibilities

Architecture Strategy

Define the long-term architectural roadmap for our data ecosystem, ensuring it supports scalable growth and advanced analytics.

Platform Integration

Design and implement seamless integrations between Snowflake (for warehousing) and Databricks (for lakehouse/AI workloads), leveraging the best of both platforms.

Geospatial Implementation

Architect pipelines and storage strategies for large-scale geospatial data, ensuring high-performance spatial querying and visualization capabilities.

Big Data Engineering

Oversee the development of robust ETL/ELT pipelines using Apache Spark, Delta Lake, and Hadoop ecosystems.

Governance & Compliance

Act as the primary architect for data privacy and security, ensuring all designs comply with GDPR, CCPA, or other relevant industry-specific regulations.

Data Modeling

Create and maintain enterprise-level conceptual, logical, and physical data models tailored for both structured and unstructured data.

Technical Requirements

  • Cloud Data Platforms: Advanced proficiency in Snowflake (Snowpark, Streams, Tasks) and Databricks (Unity Catalog, Delta Live Tables).
  • Big Data Stack: Deep experience with Apache Spark, Hadoop, and distributed computing principles.
  • Geospatial Tools: Experience with PostGIS, Esri, or Snowflake/Databricks native spatial functions and H3 indexing.
  • Languages: Expert-level SQL and Python (specifically PySpark).
  • Data Governance: Hands-on experience with data cataloging, lineage tools, and building Compliance-by-Design frameworks.
