Skills/Experience required
B.E./B.Tech in Computer Science, Information Technology, or equivalent, with at least 70% throughout academics (10th/12th/B.Tech)
Demonstrated 3-6 years of experience keeping large Hadoop clusters up and running
Operational experience with the big data technology stack: Hadoop, HDFS, Spark, etc.
Ability and willingness to work with many stakeholders to ensure that clusters receive all the data they are supposed to receive and that OS, storage, and hardware remain in good health
Experience with O&M tools such as Cloudera Manager, Grafana, Kibana, etc.
Experience with Linux and shell scripting; Python/PySpark is preferred
Experience with upgrades of Cloudera, Cassandra, ORC, etc.
Experience with commissioning/decommissioning of nodes.
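As a rough illustration of the last point, decommissioning an HDFS DataNode on a plain (non-managed) cluster typically follows the steps sketched below. The exclude-file path and hostname here are assumptions for illustration; on a Cloudera Manager deployment the same workflow is driven through the CM UI/API instead.

```shell
#!/bin/sh
# Hedged sketch of HDFS DataNode decommissioning.
# EXCLUDE_FILE path is an assumption; it must match the file referenced
# by dfs.hosts.exclude in hdfs-site.xml on your cluster.
EXCLUDE_FILE="${EXCLUDE_FILE:-/etc/hadoop/conf/dfs.exclude}"

run() {
  # Dry-run by default: print the command instead of executing it,
  # so the sketch is safe to read and try. Set DRY_RUN=0 to execute.
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

decommission_node() {
  node="$1"
  # 1. List the node in the exclude file read by the NameNode.
  run sh -c "echo $node >> $EXCLUDE_FILE"
  # 2. Ask the NameNode to re-read its include/exclude lists.
  run hdfs dfsadmin -refreshNodes
  # 3. Watch the report until the node shows 'Decommissioned';
  #    only then is it safe to stop the DataNode process.
  run hdfs dfsadmin -report
}
```

Commissioning a node is roughly the reverse: remove it from the exclude file (and add it to the include file, if one is configured), then run `hdfs dfsadmin -refreshNodes` again.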