GCP FinOps Engineer
Job in Glasgow, Glasgow City Area, G1, Scotland, UK
Listed on 2026-03-02
Listing for: Stackstudio Digital Ltd.
Contract position
Job specializations:
- Software Development
- Data Engineer
Job Description & How to Apply Below
Job Title: GCP FinOps Engineer
Location: Newport, UK
Key Responsibilities (individual contributor role):
- Optimise large-scale data analytics workloads through partitioning, clustering, query rewrites, storage format improvements, and lifecycle policies.
- Tune containerised microservices by recalibrating CPU/memory requests, improving autoscaling efficiency, and restructuring workload placement on cost-efficient compute.
- Redesign workflow orchestration pipelines for parallel execution, increased concurrency, and offloading of heavy tasks to lower-cost execution environments.
- Analyse distributed data processing pipelines to right-size worker types, adjust scaling thresholds, and adopt low-cost compute for batch workloads.
- Reduce log processing and storage overhead through log-level standardisation, routing rules, exclusion filters, and retention optimisation.
- Implement storage tiering strategies based on access patterns and enforce lifecycle rules to minimise cold-data retention costs.
- Improve relational database performance through index tuning, connection optimisation, and instance right-sizing.
- Enhance horizontally scalable database performance via autoscaling policies, index improvements, and mitigation of read/write hotspots.
- Build dashboards, budgets, alerts, and guardrails to drive ongoing cost governance and financial accountability.
- Collaborate with engineering teams to embed cost-efficient architecture patterns and operational best practices.

Key Skills / Knowledge:
- 5 years of hands-on experience in Google Cloud
- Strong understanding of GCP data services (indexing, slots, pruning, partitioning, clustering)
- Expert-level Kubernetes & GKE resource tuning
- Hands-on experience with Dataflow job pipelines and worker optimisation
- Strong Airflow/Composer knowledge (DAG design, scheduling, Pod Operator)
- Strong Dataflow processing pipeline development and scheduler knowledge
- Deep understanding of Cloud Logging routing, sinks, and exclusion filters
- Experience with Cloud Spanner autoscaling, indexing, and schema optimisation
- Cloud SQL performance tuning and indexing
- Ability to analyse billing data and resource consumption
- Experience using GCP Cost Explorer, Recommender API, and Billing Export
- Ability to quantify cost savings and present ROI to leadership
- Ability to build dashboards, alerts, and budget guardrails
- Strong communication and stakeholder management
- Ability to collaborate across engineering, data, and product teams
- Structured problem-solving mindset
- Ownership-driven, proactive, and independent

Experience Required: minimum 5 years in Google Cloud
Position Requirements: 5+ years of work experience