Databricks Data Platform Architect
Job in
Raleigh, Wake County, North Carolina, 27601, USA
Listed on 2026-03-01
Listing for:
UsefulBI Corporation
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below
Databricks Data Platform Architect (DevOps & Infrastructure Focus)
Location: Alameda, CA – Onsite (5 days/week)
About the Role
We are looking for a hands-on Databricks Data Platform Architect to design, build, and operate our enterprise lakehouse platform on AWS.
This role combines DevOps, Infrastructure Engineering, and Data Engineering architecture.
You will own the reliability, scalability, security, and performance of our Databricks environment and enable data teams to build high-quality, production‑grade pipelines.
If you enjoy building platforms from the ground up, automating everything, and solving performance problems at scale, this role is for you.
What You’ll Own
Platform & Infrastructure Architecture (Core Focus)
- Architect and manage Databricks on AWS end-to-end
- Implement CI/CD for notebooks, jobs, and pipelines
- Define cluster policies, autoscaling, and environment isolation (dev/stage/prod)
- Set up monitoring, logging, and alerting (CloudWatch/Datadog)
- Drive cost optimization and compute efficiency strategies
- Ensure high availability, reliability, and production uptime
- Design lakehouse architecture (bronze/silver/gold)
- Optimize Spark/Delta workloads for performance and scalability
- Establish data engineering standards and best practices
- Support batch and streaming workloads
- Troubleshoot performance bottlenecks and production incidents
- Implement role-based access control using Unity Catalog
- Secure S3, IAM, and external locations
- Enforce data access policies and PII protection
- Enable audit logging and compliance guardrails
- Partner with security and governance teams to ensure best practices
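Cluster policies like those listed above are typically expressed as JSON documents following the Databricks cluster-policy conventions. A minimal sketch (the specific limits, tag names, and version pattern are illustrative assumptions, not taken from this posting):

```python
import json

# Hypothetical cluster policy: caps autoscaling, forces auto-termination,
# and restricts an environment tag to dev/stage/prod.
# Field names follow Databricks cluster-policy JSON conventions; the
# concrete values are illustrative assumptions.
policy = {
    "autoscale.min_workers": {"type": "range", "minValue": 1, "maxValue": 2},
    "autoscale.max_workers": {"type": "range", "minValue": 2, "maxValue": 10},
    "autotermination_minutes": {"type": "fixed", "value": 30, "hidden": True},
    "custom_tags.environment": {
        "type": "allowlist",
        "values": ["dev", "stage", "prod"],
    },
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)
```

A policy like this would be attached to teams so that every cluster they create stays within the cost and isolation guardrails the role is responsible for.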
Requirements:
- 8+ years in Data Engineering / Platform Engineering / Cloud Infrastructure
- 4+ years hands‑on with Databricks in production
- Strong expertise in Apache Spark, Python, and SQL
- Deep knowledge of AWS (S3, IAM, VPC, CloudWatch, networking)
- Experience with Terraform or other IaC tools
- CI/CD implementation experience for data platforms
- Strong troubleshooting and performance tuning skills
- Experience with Delta Lake optimization (ZORDER, OPTIMIZE, partitioning)
- Workflow orchestration tools (Airflow/Databricks Workflows)
- Experience with enterprise data governance tools (Unity Catalog/Atlan/Collibra)
- Knowledge of GDPR/HIPAA or regulated environments
- Prior experience building platforms for multiple teams
What Success Looks Like:
- Clear environment separation and automation
- Self‑service data platform for engineers and analysts
- Strong security and governance with minimal friction
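The Delta Lake maintenance named in the requirements (OPTIMIZE, ZORDER, retention) is usually run as scheduled SQL. A minimal sketch that builds such statements in Python (the table name, columns, and retention window are hypothetical):

```python
# Build routine Delta Lake maintenance statements for a hypothetical table.
# OPTIMIZE compacts small files; ZORDER BY co-locates data on frequently
# filtered columns; VACUUM removes files older than the retention window.
def maintenance_statements(table: str, zorder_cols: list[str],
                           retain_hours: int = 168) -> list[str]:
    return [
        f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})",
        f"VACUUM {table} RETAIN {retain_hours} HOURS",
    ]

stmts = maintenance_statements("silver.orders", ["customer_id", "order_date"])
for s in stmts:
    print(s)
```

In practice, statements like these would be executed via `spark.sql(...)` inside a scheduled Databricks Workflow rather than printed.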