Scala Tech Lead
Experience: 7-12 years
Location: Mumbai
Virtual interview date: 28th Jan

Job Description
1. Minimum 4 years of experience in Spark/Scala development
2. Experience in designing and developing Big Data solutions using Hadoop ecosystem technologies such as HDFS, Spark, Hive, the Parquet file format, YARN, MapReduce, and Sqoop
3. Good experience in writing and optimizing Spark jobs and Spark SQL; should have worked on both batch and streaming data processing
4. Experience in writing and optimizing complex Hive and SQL queries to process large data volumes; good with UDFs, tables, joins, views, etc.
5. Experience in debugging Spark code
6. Working knowledge of basic UNIX commands and shell scripting
7. Experience with Autosys and Gradle
Good to Have
1. Good analytical and debugging skills
2. Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and provide on-time status updates
3. Ability to write clear and precise documentation/specifications
4. Ability to work in an agile environment
5. Create documentation and document all developed mappings
Responsibilities / Expectations from the Role
1. Create Scala/Spark jobs for data transformation and aggregation
2. Produce unit tests for Spark transformations and helper methods
3. Write Scaladoc-style documentation for all code
4. Design data processing pipelines
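To illustrate responsibilities 2 and 3, here is a minimal sketch of the kind of work described: a plain-Scala helper method (of the sort a Spark job might factor out and apply per partition) with Scaladoc-style documentation and a unit-test-style check. All names and signatures are illustrative assumptions, not part of the posting, and no Spark dependency is used so the snippet stays self-contained.

```scala
/** Aggregation helpers that a Spark job could apply to its records.
  *
  * Names here are hypothetical and shown only to illustrate the
  * documentation and testing style the role asks for.
  */
object TransformHelpers {

  /** Sums values grouped by key.
    *
    * @param records key/value pairs, e.g. (accountId, amount)
    * @return a map from each key to the sum of its values
    */
  def sumByKey(records: Seq[(String, Double)]): Map[String, Double] =
    records.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).sum }
}

// A unit-test-style check; a real project would use ScalaTest or similar.
object TransformHelpersSpec extends App {
  val out = TransformHelpers.sumByKey(Seq(("a", 1.0), ("a", 2.0), ("b", 3.0)))
  assert(out == Map("a" -> 3.0, "b" -> 3.0))
  println("all checks passed")
}
```

In a real Spark job, the same helper could be exercised both directly in unit tests and inside transformations such as `mapPartitions`, keeping business logic testable without a cluster.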