Technology stack: SQL, BigQuery, and Cloud Spanner, in combination with third-party tools such as Spark, Apache Beam/Composer, DBT, Cloud Pub/Sub, Confluent Kafka, Cloud Storage, Cloud Functions, and GitHub
Designing and implementing data ingestion patterns that support batch, streaming, and API interfaces for both ingress and egress.
Guide a team of data engineers and work hands-on in developing frameworks and custom code, using best practices that meet demanding performance requirements
Take the lead in designing and building production data pipelines, from data ingestion to consumption, using GCP services, Java, Python, Scala, BigQuery, DBT, SQL, etc.
Experience using Cloud Dataflow with Java/Python to deploy streaming jobs in GCP, as well as batch jobs that read text/JSON files and write them to BigQuery
Building and managing data pipelines with a deep understanding of workflow orchestration, task scheduling and dependency management
Ability to run proofs of technology using GCP services and to work with data architects and solution architects to achieve the desired results and performance.
Provide end-to-end technical guidance and expertise on using Google Cloud effectively to build solutions, creatively applying cloud infrastructure and platform services to solve business problems, and communicating these approaches to different business users
Provide guidance on implementing application logging, notifications, job monitoring, and performance monitoring
Candidate Requirements / Must-Have Skills
8-10 years of experience in data engineering and performance optimization for large OLTP applications, with a minimum of 3 years of working experience as a Google Cloud Platform (GCP) developer
5+ years of experience working with relational/NoSQL databases
2-3 years of experience with the primary managed data services within GCP, including Dataproc, Dataflow, BigQuery/DBT, Cloud Spanner, Cloud SQL, Cloud Pub/Sub, etc.
2-3 years of experience with Google Cloud Platform databases (Cloud SQL, Spanner, PostgreSQL)
1-2 years of experience with data streaming and related technologies such as Kafka, Spark Streaming, etc.
Nice-To-Have Skills
Working knowledge of developing and scaling Java REST services using frameworks such as Spring
Understanding of the Wealth business line and the various data domains required for building an end-to-end solution
Experience with Infrastructure as Code (IaC) practices and frameworks like Terraform
Knowledge of Java microservices and Spring Boot
Strong architecture knowledge, with experience providing technical solutions for cloud infrastructure.
Active Google Cloud Data Engineer certification or Google Professional Cloud Architect certification preferred
Education
Degree in Computer Science or a related field