Job Description
12+ years of experience in the Big Data space across architecture, design, development, testing, and deployment, with a full understanding of the SDLC.
Experience with Hadoop and its related technology stack.
Experience with the Hadoop ecosystem (HDP and CDP) / Big Data, especially Hive.
Hands-on experience with programming languages such as Java, Scala, or Python.
Hands-on experience with and knowledge of Spark.
Responsible for the uptime and reliable running of all ingestion/ETL jobs (an illustrative sketch of such a job follows this list).
Strong SQL skills and experience working in a Unix/Linux environment are a must.
Create and maintain optimal data pipeline architecture.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Good to have: cloud experience.
Good to have: experience integrating Hadoop with data visualization tools such as Power BI.
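
For illustration only, the sketch below shows the kind of Spark-on-Hive ingestion/ETL job referenced in the requirements above. It assumes a cluster where Spark is configured with Hive support (as on HDP/CDP); the table and column names (raw_events, event_date, clean.daily_event_counts) are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark ETL sketch: read a raw Hive table, aggregate, write back to Hive.
# Names used here are placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-event-etl")
    .enableHiveSupport()  # enables reading/writing Hive tables on the cluster
    .getOrCreate()
)

# Read the (hypothetical) raw Hive table.
raw = spark.table("raw_events")

# Aggregate event counts per day.
daily_counts = (
    raw.groupBy("event_date")
       .agg(F.count("*").alias("event_count"))
)

# Write the result back as a managed Hive table for downstream consumers (e.g. BI tools).
daily_counts.write.mode("overwrite").saveAsTable("clean.daily_event_counts")

spark.stop()
```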