Extensive knowledge of the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
Experience with NoSQL technologies (Cassandra, HBase)
Qualifications & Experience:
Bachelor's degree in Science or Engineering
7+ years of industry experience.
Minimum of 3 years of Big Data experience
Develop the Big Data strategy and roadmap for the enterprise
Experience in capacity planning, cluster design, and deployment
Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels
Continuously evaluate new technologies, innovate, and deliver solutions for business-critical applications
Desired Experience:
Experience in real-time streaming (Kafka)
Experience with Big Data analytics and business intelligence using industry-standard tools integrated with the Hadoop ecosystem (R, Python)
Knowledge of visual analytics tools (Tableau)
Data integration and data security on the Hadoop ecosystem (Kerberos)
Awareness of or experience with Data Lakes built on the Cloudera ecosystem