GCP Data Architect; Concord, CA; Onsite

Job in Concord, Contra Costa County, California, 94527, USA
Listing for: Damcosoft
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer
  • Engineering
    Data Engineer
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description
Position: GCP Data Architect (Concord, CA, Onsite)

Overview

Architect & Design:
Design and implement robust, scalable, and cost-effective data solutions on Google Cloud, serving as the target architecture for migrated workloads.

Develop Reusable Frameworks & Accelerators:
Design, build, and maintain reusable frameworks, templates, and code libraries to standardize and accelerate data engineering work.

This includes creating boilerplate pipeline structures, generic data validation modules, and automated deployment patterns that other engineers will leverage.
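
For illustration, a minimal sketch of what one such generic validation module might look like in Python; the ValidationRule class and the example rules are hypothetical, not an actual internal library:

```python
# Hypothetical sketch of a generic, reusable validation module; the class
# and the example rules are illustrative, not an actual internal framework.
from dataclasses import dataclass
from typing import Callable

import pandas as pd


@dataclass
class ValidationRule:
    """A named check applied to a DataFrame; check returns True on pass."""
    name: str
    check: Callable[[pd.DataFrame], bool]


def run_validations(df: pd.DataFrame, rules: list[ValidationRule]) -> dict[str, bool]:
    """Apply every rule to the frame and report pass/fail per rule name."""
    return {rule.name: bool(rule.check(df)) for rule in rules}


# Generic, table-agnostic rules that any pipeline in the framework could reuse.
rules = [
    ValidationRule("no_null_keys", lambda df: df["id"].notna().all()),
    ValidationRule("unique_keys", lambda df: df["id"].is_unique),
]
print(run_validations(pd.DataFrame({"id": [1, 2, 3]}), rules))
```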

Migrate & Modernize:
Lead the hands-on migration of data and processes from on-premises systems like Teradata and Hadoop to Google Cloud services, with a primary focus on BigQuery, Google Cloud Storage (GCS), Dataflow, and Dataproc.

ETL/ELT Transformation:
Analyze, deconstruct, and translate complex legacy ETL logic from tools like Informatica and Teradata BTEQ/stored procedures into modern, cloud-native pipelines, leveraging the frameworks and tooling you help create.
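
As a hedged example of this kind of translation, a Teradata-style QUALIFY deduplication often carries over to BigQuery Standard SQL (which also supports QUALIFY) with little more than renaming; the project, dataset, and column names below are placeholders:

```python
# Illustrative only: a Teradata-style dedup translated to BigQuery Standard
# SQL. Project, dataset, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# WHERE TRUE satisfies BigQuery's requirement that QUALIFY accompany a
# WHERE, GROUP BY, or HAVING clause.
sql = """
SELECT *
FROM `my-project.my_dataset.customers`
WHERE TRUE
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC) = 1
"""
rows = client.query(sql).result()  # blocks until the query job finishes
```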

Pipeline Development:
Build and automate new data pipelines for batch and streaming data using Python, SQL, and GCP's core services, ensuring all new development contributes to and benefits from our shared engineering frameworks.
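
A minimal sketch of such a batch pipeline using Apache Beam, the SDK behind Dataflow; the bucket, project, and table names are placeholders:

```python
# A minimal Apache Beam batch pipeline of the kind Dataflow runs; the
# bucket, project, and table names are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")  # DataflowRunner on GCP

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```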

Performance & Cost Optimization:
Proactively optimize BigQuery performance through effective partitioning, clustering, and query tuning.
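
For instance, partitioning and clustering can be declared at table creation; a sketch using the google-cloud-bigquery client, with placeholder names throughout:

```python
# Sketch: declaring partitioning and clustering at table creation so that
# date-filtered queries prune partitions. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.my_dataset.orders` (
  order_id    STRING,
  customer_id STRING,
  order_date  DATE,
  amount      NUMERIC
)
PARTITION BY order_date    -- scanned bytes shrink under date filters
CLUSTER BY customer_id     -- co-locates rows for selective customer filters
"""
client.query(ddl).result()
```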

Data Validation & Governance:
Develop and implement rigorous data validation frameworks to ensure data integrity and accuracy post-migration. Collaborate with governance teams to apply data security, lineage, and cataloging using tools like Google Cloud Data Catalog and Dataplex.
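
One common shape for such a post-migration check, sketched below with placeholder tables and an assumed source-side count, compares row counts and a key checksum against figures captured from the legacy system:

```python
# One possible shape for a post-migration check: compare BigQuery row
# counts and a key checksum against figures captured from the legacy
# source. Table, column, and the expected count are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  COUNT(*) AS row_count,
  SUM(FARM_FINGERPRINT(CAST(order_id AS STRING))) AS key_checksum
FROM `my-project.my_dataset.orders`
"""
result = next(iter(client.query(sql).result()))

source_row_count = 1_234_567  # recorded on the Teradata/Hadoop side
assert result.row_count == source_row_count, "row counts diverge after migration"
```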

Collaboration & Mentorship:
Work closely with on-premises data experts, business analysts, and other engineers to understand requirements, ensure a smooth transition, and act as a subject matter expert and mentor for GCP and internal framework best practices.

Qualifications

Required Qualifications (Must-Haves):

Professional Experience:
5+ years of professional experience in a data engineering role, with a proven track record of building and maintaining large-scale data systems.

Framework Design & Development:
Demonstrable experience designing, building, and promoting the adoption of reusable data engineering frameworks, such as ingestion, transformation, and validation frameworks.

On-Premises Data Warehouse/Big Data Expertise:
Deep, hands-on experience with at least one of the following on-premises ecosystems:

Teradata:
Strong understanding of the Teradata architecture, utilities (BTEQ, TPT, FastLoad/MultiLoad), and advanced SQL/stored procedure development.

Hadoop:
Experience with the Hadoop ecosystem (HDFS, YARN, Hive) and hands-on proficiency with PySpark for large-scale data processing.
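
For illustration, a minimal PySpark aggregation of the kind this covers; the HDFS paths and column names are placeholders:

```python
# A minimal PySpark aggregation; the HDFS paths and column names are
# placeholders, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_sales").getOrCreate()

df = spark.read.parquet("hdfs:///warehouse/sales")  # or spark.table("db.sales") for Hive
daily = (
    df.groupBy("sale_date")
      .agg(F.sum("amount").alias("total_amount"),
           F.count("*").alias("order_count"))
)
daily.write.mode("overwrite").parquet("hdfs:///warehouse/sales_daily")
```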

Enterprise ETL:
Demonstrable experience designing and building complex workflows in an enterprise ETL tool like Informatica PowerCenter.

Google Cloud Platform (GCP) Proficiency:
Demonstrable hands-on experience designing, building, and operating solutions with a comprehensive set of GCP data services, including:

Core Data Processing & Warehousing:
Google BigQuery (including data modeling, performance tuning, and cost management), Google Cloud Storage (GCS), Cloud Dataflow, and Cloud Dataproc.

Orchestration & Event-Driven Architecture:
Cloud Composer (managed Airflow) for complex workflow orchestration, and Pub/Sub and Cloud Functions for building streaming and event-driven data pipelines.
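
A sketch of what such orchestration might look like as a Composer (Airflow) DAG, chaining a GCS load into a BigQuery job; the DAG id, bucket, tables, and the called procedure are placeholders:

```python
# Hedged sketch of a Composer (Airflow) DAG: load files from GCS, then run
# a BigQuery transform. All identifiers are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="my-bucket",
        source_objects=["raw/{{ ds }}/*.csv"],
        destination_project_dataset_table="my_dataset.raw_events",
        write_disposition="WRITE_TRUNCATE",
    )
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": "CALL `my-project.my_dataset.build_daily_events`()",
                "useLegacySql": False,
            }
        },
    )
    load >> transform  # run the transform only after the load succeeds
```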

Data Governance & Management:
Practical experience using Dataplex for unified data management, security, and governance across data lakes and warehouses.

Core Engineering & Migration Skills:

Expert-level proficiency in SQL, including complex joins and window functions.

Strong programming skills in Python, applying software engineering best practices.

Hands-on experience with Google's migration assessment tools (e.g., BigQuery Migration Service, Database Migration Service) to analyze on-premises workloads and accelerate migration.

Deep understanding of data warehousing concepts, ETL/ELT patterns, data modeling, and database design.

Preferred Qualifications (Nice-to-Haves):

Proven Migration Experience:

Direct, hands-on experience successfully completing at least one large-scale on-premises (Teradata, Hadoop, etc.) to Google Cloud Platform migration.

Certifications:
A Google Cloud Professional Data Engineer certification is a plus.
