Google Cloud Platform Data Architect
Listed on 2026-02-28
IT/Tech
Data Engineer, Cloud Computing
Dice is the leading career destination for tech experts at every stage of their careers. Our client, HPTech Inc., is seeking the following. Apply via Dice today!
We are seeking an experienced Google Cloud Platform (GCP) Data Architect to design, build, and manage scalable, secure, and cost-optimized data solutions aligned with reporting needs. This role involves translating business requirements into robust technical architectures, ensuring data integrity, and enabling advanced analytics through Google Cloud Platform services such as BigQuery and Cloud Storage. The ideal candidate will lead strategy, design, and implementation efforts while collaborating with stakeholders to drive data-driven decision-making.
Key Responsibilities:
- Architect scalable data solutions: design and implement data warehouses, marts, lakes, and batch and/or real-time streaming pipelines using Google Cloud Platform-native tools.
- Data modeling & integration: design and develop conformed data models (star/snowflake schemas) and ETL/ELT processes for analytics and BI tools (MicroStrategy, Looker, Power BI).
- Pipeline development: build scalable pipelines and automate data ingestion and transformation workflows using BigQuery, Dataflow, Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, and Cloud Composer for orchestration.
- Security & compliance: implement IAM, encryption, and compliance standards (GDPR, HIPAA) with Google Cloud Platform security tools.
- Performance optimization: apply best practices for partitioning, clustering, and BI Engine to ensure high performance and cost efficiency.
- DevOps & automation: integrate CI/CD pipelines, IaC (Terraform), and containerization (Docker, Kubernetes) for deployment and scalability.
- Collaboration & leadership: engage with stakeholders including leadership, project managers, BAs, engineers, QA, and platform teams; mentor teams and provide technical guidance on best practices.
- Troubleshooting: resolve complex technical issues and support incident response.
- Healthcare domain expertise: ensure compliance with healthcare regulations and stay updated on industry trends.
Experience:
- Google Cloud Platform expertise: BigQuery, Cloud Storage, Dataflow (Apache Beam with Python), Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, Cloud Composer.
- Programming: advanced SQL and Python for analytics and pipeline development.
- Performance optimization: experience with query performance tuning, partitioning, clustering, and BI Engine in BigQuery.
- Automation: experience with CI/CD for data pipelines, IaC for data services, automation of ETL/ELT processes.
- Security: strong knowledge of IAM, encryption, and compliance frameworks.
- Architecture design: ability to create fault-tolerant, highly available, and cost-optimized solutions.
- Communication: excellent ability to convey technical concepts to both technical and non-technical stakeholders.
- Domain knowledge: familiarity with healthcare data management and regulatory compliance.
Cygnus Professionals is an Equal Opportunity Employer. We encourage professionals from all backgrounds to apply.