Data Architect, Google Cloud Platform (GCP)

Job in New York, New York County, New York, 10261, USA
Listing for: West Monroe Partners, LLC
Full Time position
Listed on 2025-12-23
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below
Position: Data Architect, Google Cloud Platform (GCP)
Location: New York; Washington, D.C.; United States

Are you ready to make an impact?

As a Data Architect, you will lead the design and delivery of cloud‑native data architectures and solutions for our clients. You will work closely with business stakeholders, data engineers, and developers to build robust data platforms that enable advanced analytics, machine learning, and real‑time data processing. This role requires a mix of technical expertise, consulting skills, and leadership to drive successful outcomes in data‑driven projects.

Responsibilities
  • Design and implement scalable, secure, and high‑performance data architectures on Google Cloud Platform (GCP).
  • Define and implement data lake and data warehouse architectures using GCP services such as BigQuery, Cloud Storage, Dataplex, and Dataform.
  • Develop strategies for data migration to GCP from on‑premises or other cloud platforms, ensuring minimal disruption and optimal performance.
  • Architect and oversee the implementation of batch data pipelines using tools such as Dataflow, BigQuery Dataform, and Data Fusion.
  • Guide the development of data models optimized for performance, scalability, and cost‑efficiency in BigQuery and other GCP services.
  • Define and implement best practices for data governance, data quality, lineage, security, and compliance in GCP environments, leveraging tools like Cloud DLP, IAM, and Dataplex.
  • Partner with stakeholders to establish real‑time analytics pipelines using services like Pub/Sub, Dataflow, and BigQuery streaming (a streaming‑pipeline sketch follows this list).
  • Provide expertise in data partitioning, clustering, and query optimization to reduce costs and improve performance (a partitioning and cost‑estimation sketch also follows this list).
  • Lead the adoption of serverless solutions and modern data engineering practices, including CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or Terraform.
  • Evaluate and recommend GCP‑native AI/ML tools such as Vertex AI and AutoML for advanced analytics and predictive modeling.
  • Serve as a trusted advisor to clients, presenting technical solutions, architectural roadmaps, and cost‑optimization strategies.
  • Conduct workshops, proofs of concept (POCs), and training sessions to help clients adopt GCP technologies.
  • Lead end‑to‑end implementation of data solutions, including ETL/ELT pipelines, data lakes, and data warehouses, ensuring delivery within scope, budget, and timeline.
  • Troubleshoot and resolve complex issues related to GCP infrastructure, data pipelines, and integrations.
  • Monitor and optimize the performance and cost of GCP data systems, leveraging tools like Cloud Monitoring, Cloud Logging, and BigQuery BI Engine.
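
To make the streaming responsibility above concrete, here is a minimal Apache Beam sketch of a Pub/Sub‑to‑BigQuery pipeline. This is illustrative only, not the firm's actual stack: the project, subscription, table, and schema names are placeholders, and a real deployment would run on Dataflow with the appropriate runner options.

```python
# Minimal streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# All resource names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

SUBSCRIPTION = "projects/my-project/subscriptions/events-sub"  # placeholder
TABLE = "my-project:analytics.events"                          # placeholder


def run():
    # streaming=True puts the pipeline in streaming mode for Pub/Sub reads;
    # on Dataflow you would also pass --runner=DataflowRunner and a region.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```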
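Likewise, the partitioning and cost‑optimization point can be sketched with the google-cloud-bigquery client. The dataset, table, and fields below are hypothetical; the dry run shows one common way to estimate bytes scanned (and therefore on‑demand cost) before a query executes.

```python
# Minimal sketch: create a day-partitioned, clustered table, then dry-run a
# query to estimate its scan cost. Resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Partitioning on event_date plus clustering on customer_id lets BigQuery
# prune data at query time, reducing bytes scanned and cost.
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(field="event_date")
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Dry-run the query: nothing executes, but BigQuery reports bytes it would scan.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    """
    SELECT customer_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date = CURRENT_DATE()   -- partition filter prunes the scan
    GROUP BY customer_id
    """,
    job_config=job_config,
)
print(f"Estimated bytes processed: {job.total_bytes_processed}")
```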
Qualifications
  • 7+ years of experience in data architecture, data engineering, or related roles, with at least 3 years of hands‑on experience in Google Cloud Platform (GCP).
  • Proven track record of delivering data lake, data warehouse, and real‑time analytics solutions on GCP.
  • Expertise in GCP services including BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Cloud SQL/Spanner.
  • Proficiency in designing and implementing ETL/ELT pipelines using Cloud Data Fusion, Apache Beam, or Cloud Composer.
  • Experience with streaming data pipelines using Pub/Sub and Dataflow.
  • Familiarity with Vertex AI, AutoML, and AI Platform Pipelines for machine learning workflows.
  • Strong understanding of IAM roles, service accounts, VPC Service Controls, and encryption best practices (see the IAM sketch after this list).
  • Proficiency in SQL for data modeling, querying, and optimization in BigQuery.
  • Strong programming skills in Python or Java, with experience in building reusable data pipelines and frameworks (see the reusable‑transform sketch after this list).
  • Experience with Terraform or Deployment Manager for infrastructure as code (IaC) in GCP environments.
  • Familiarity with CI/CD pipelines for data workflows using Cloud Build or other DevOps tools.
  • Proven ability to lead technical teams and deliver complex projects.
  • Excellent communication and stakeholder management skills, with the ability to explain technical concepts to non‑technical audiences.
  • GCP certifications such as Professional Data Engineer or Professional Cloud Architect are preferred.
  • Experience with data mesh or data fabric architectures is a plus.
  • Knowledge of multi‑cloud and hybrid cloud strategies is a plus.
  • Familia…
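
For the IAM qualification above, here is a minimal sketch of a least‑privilege grant using the google-cloud-storage client. The bucket and service‑account names are hypothetical, and in practice such bindings are usually managed declaratively in Terraform rather than imperative code.

```python
# Minimal sketch: grant a pipeline service account read-only access to a
# bucket. Bucket and service-account names are hypothetical placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-data-lake-raw")

# Fetch the bucket's IAM policy (version 3 supports conditional bindings).
policy = bucket.get_iam_policy(requested_policy_version=3)

# Grant only objectViewer: least privilege for a read-only pipeline account.
policy.bindings.append(
    {
        "role": "roles/storage.objectViewer",
        "members": {"serviceAccount:pipeline-reader@my-project.iam.gserviceaccount.com"},
    }
)
bucket.set_iam_policy(policy)
```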
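And for the reusable‑pipeline qualification, a small sketch of a shared Apache Beam transform of the kind such a framework might contain; the class and field names are illustrative only.

```python
# Minimal sketch of a reusable Beam transform: decode JSON events and tag
# each record with its source system. Names are illustrative placeholders.
import json

import apache_beam as beam


class ParseAndStamp(beam.PTransform):
    """Reusable step shared across batch and streaming pipelines."""

    def __init__(self, source: str):
        super().__init__()
        self.source = source

    def expand(self, pcoll):
        return (
            pcoll
            | "Decode" >> beam.Map(lambda raw: json.loads(raw))
            | "Stamp" >> beam.Map(lambda rec: {**rec, "source_system": self.source})
        )


# The same transform can then be dropped into any pipeline:
#   events | ParseAndStamp("crm") | ...
```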