Overview
The ideal candidate will be responsible for developing high-quality applications. They will also be responsible for designing and implementing testable and scalable code.
Responsibilities
- Designing the environment, defining implementation steps, and driving the transition from Hadoop to a new open-source solution.
Qualifications
- Bachelor's degree or equivalent experience in Computer Science or a related field
- Open-source development with modern data platforms
- Programming in Python, Scala, and Java
- Data processing with Spark, Trino, and Flink
- Workflow orchestration & streaming: Apache Airflow, Apache Kafka
- Driving modernization initiatives: transitioning from Hadoop to next-gen open-source solutions
- Seniority level: Mid-Senior level
- Employment type: Contract
- Job function: Design and Information Technology
- Industries: IT Services and IT Consulting, Financial Services, and Banking