• Obtain an overall understanding of data and processes
• Partner with the Product Owner on requirements and brainstorm implementation options
• Participate in planning sessions to determine technical designs and estimates
• Develop and deliver solutions using Big Data/Spark tools and technologies
• Collaborate effectively with other teams (internal and external) in a very dynamic agile environment
• Learn new tools and implement continuous improvements in support of the Product’s strategic vision
• 2 to 5 years of experience working on cloud and Spark platforms
• A deep understanding of Hadoop internals, design principles, cluster connectivity, security, and the factors that affect distributed system performance
• Expert-level experience with at least two of the following: SQL, Python, Scala, Spark, and UNIX
• Strong development experience with Scala/Python and Spark
• Good communication skills
• Ability to work in a fast-paced team environment
Skills: Scality, Spark, Kafka, Unix, SQL, Python/Scala
Experience: 2 to 5 years
Shift: 1 PM to 9:30 PM IST
Location: PAN India