
Data Architect

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Talentica Software India Pvt. Ltd.
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 125,000 - 150,000 USD per year
Job Description & How to Apply Below
Location: Snowflake

Talentica Software, started by industry veterans and ex-IITB grads, is a product engineering company that helps tech-native enterprises and startups turn their ideas into market-leading products. We deliver innovative, high-quality products at an accelerated pace by combining the product mindset of our human experts with the power of AI. Over the last 22 years, the company has worked with more than 200 startups, most of them based in the US, and has seen many of them through to successful exits.

In 2022, Great Place to Work® recognized Talentica Software as one of India's Great Mid-Size Workplaces.

What we're looking for

We are looking for a highly skilled and experienced Data Engineer who will lead the development and implementation of our Data Warehouse/Lakehouse solution, ensuring it serves as the foundation for scalable, high-performance analytics.

With deep technical expertise in data architecture, cloud-based data solutions, and modern analytics platforms, the Data Engineer will play a pivotal role in shaping our data infrastructure strategy and execution.

What you'll do
Lakehouse Design & Implementation:
  • Lead the end-to-end development and deployment of a scalable and secure Lakehouse architecture.
  • Define best practices for data ingestion, storage, transformation, and processing using modern cloud technologies.
  • Architect data pipelines using ETL/ELT frameworks to support structured, semi-structured, and unstructured data.
  • Optimize data modeling strategies to meet the analytical and performance needs of stakeholders.
  • Evaluate and select appropriate cloud technologies, frameworks, and architectures.
  • Develop and maintain efficient, automated data pipelines that integrate data from multiple sources (Oracle, BigQuery, MongoDB, Google Analytics, etc.); a simplified pipeline sketch follows this list.
  • Implement distributed data processing frameworks using tools like Apache Spark and Dataflow.
  • Ensure optimal query performance tuning and cost-effective cloud resource utilization.
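
A rough illustration of the kind of batch ETL pipeline described above, assuming a Spark environment writing Parquet to lakehouse storage; the paths, schema, and column names below are hypothetical placeholders rather than the actual stack:

# Illustrative PySpark batch ETL sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw, semi-structured events exported from a source system.
raw = spark.read.json("s3://raw-zone/orders/")

# Transform: basic cleansing and typing before data lands in the curated zone.
orders = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
)

# Load: write date-partitioned Parquet so analytical queries can prune partitions.
(orders.write
       .mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://curated-zone/orders/"))
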
Governance, Security, and Compliance:
  • Establish and enforce data governance policies, ensuring adherence to compliance regulations such as GDPR and CCPA.
  • Implement robust data access control and security measures across multi-tenant environments.
  • Define data lineage, cataloging, and metadata management to enable data democratization.
  • Stay abreast of developments in cloud data platforms, AI integrations, and analytics technologies to drive innovation.
  • Evaluate and recommend emerging data technologies to enhance efficiency and scalability.
  • Lead Proof of Concepts (POCs) to validate new approaches and technologies.
To be successful in this role, you should have
  • Qualification: BE/BTech in Computer Science, Data Engineering, or a related field from a top institute (like IIT, NIT, BITS, etc.).
  • Experience: 8-12 years of experience in data engineering, with a proven track record of implementing large-scale data solutions.
Skills
  • Extensive experience with cloud platforms (AWS, GCP, or Azure), specifically in data warehouse/lakehouse implementations.
  • Expertise in modern data architectures with tools like Databricks, Snowflake, or BigQuery.
  • Strong background in SQL, Python, and distributed computing frameworks (Spark, Dataflow, etc.).
  • Experience building high-volume, reliable data pipelines for real-time and batch processing.
  • In-depth knowledge of data modeling principles (e.g., Star Schema, Snowflake Schema); see the star-schema sketch after this list.
  • Experience in enabling AI tools to consume data from the Lakehouse, including implementing semantic layers.
  • Strong understanding of data governance, security, and compliance frameworks.
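
To make the data-modeling principle concrete, here is a minimal star-schema sketch in Spark SQL: one fact table of measures keyed to denormalized dimension tables. All table and column names are hypothetical.

# Illustrative star schema expressed as Spark SQL DDL (hypothetical names).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

# Dimension tables hold descriptive attributes, denormalized for simple joins.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key BIGINT,
        customer_name STRING,
        country STRING
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_date (
        date_key INT,
        calendar_date DATE,
        month INT,
        year INT
    ) USING parquet
""")

# The fact table stores measures plus foreign keys into each dimension.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id STRING,
        customer_key BIGINT,
        date_key INT,
        amount DECIMAL(12,2)
    ) USING parquet
""")

# A typical analytical query joins the fact table to its dimensions and aggregates.
spark.sql("""
    SELECT d.year, d.month, c.country, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.year, d.month, c.country
""").show()
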
Must-Haves
  • Must have experience with ETL/ELT frameworks
  • Programming & Scripting:
    Python, SQL, Scala (advantage).
  • Database Systems:
    Oracle, BigQuery, MongoDB.
  • CI/CD & Infrastructure as Code:
    Git, Jenkins.
  • Security & Compliance:
    Role-based access control (RBAC), encryption, data masking (a short masking sketch follows the Must-Haves list).
  • Excellent problem-solving abilities and strategic thinking mindset.
  • Strong leadership and collaboration skills to guide and influence technical teams.
  • Effective communication skills to present technical concepts to non-technical stakeholders.
  • Ability to work independently and manage multiple priorities in a fast-paced environment.
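
A small sketch of the data-masking requirement above; RBAC grants themselves are platform-specific, so only column-level masking is shown, and the column names and paths are hypothetical placeholders.

# Illustrative PySpark column-masking sketch (hypothetical columns and paths).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("masking_demo").getOrCreate()

customers = spark.read.parquet("s3://curated-zone/customers/")

# Hash direct identifiers so broader roles can still join and count on them
# without seeing raw values; drop columns that are not needed at all.
masked = (
    customers
        .withColumn("email", F.sha2(F.col("email"), 256))
        .withColumn("phone", F.sha2(F.col("phone"), 256))
        .drop("ssn")
)

masked.write.mode("overwrite").parquet("s3://governed-zone/customers_masked/")
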
Nice to have
  • Good to have…