Role: Data Engineer - Hadoop Admin
Experience Level: 3 to 7 Years
Work location: Mumbai / Bangalore
Top Skills and Competencies:
Job Description:
1. Experience with Big Data technologies: YARN, Pig, Hive, Flume, Sqoop, Sentry, Kafka, Spark, AD Kerberos, KMS, and ZooKeeper.
2. Experience in cluster planning with the team, plus installation, configuration, and deployment; able to own the end-to-end Hadoop cluster setup, from installation and configuration through monitoring.
3. Troubleshooting and providing solutions to users based on tickets raised; creating documentation to help the internal team.
4. Effective communicator with excellent relationship-building and interpersonal skills; a keen learner and team player.
5. Configuring and managing high availability (HA) for services such as HDFS HA and ResourceManager (RM) HA.
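As a rough illustration of the HDFS HA work described above, a minimal NameNode HA sketch in hdfs-site.xml might look like the following; the nameservice name `mycluster` and the hostnames are placeholders, not details from this posting:

```xml
<!-- hdfs-site.xml: minimal NameNode HA sketch (nameservice and hosts are illustrative) -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <!-- logical IDs of the two NameNodes in the nameservice -->
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <!-- client-side failover between the active and standby NameNode -->
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```

Automatic failover additionally requires a ZooKeeper quorum (ha.zookeeper.quorum in core-site.xml) and ZKFC processes on the NameNode hosts.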
6. Hands-on experience with AWS components such as EMR, S3, and EC2.
7. Responsible for cluster maintenance, commissioning and decommissioning DataNodes, and cluster monitoring, following best practices for preparing and maintaining production Hadoop clusters.
8. Experience in commissioning, decommissioning, balancing, and managing nodes, and tuning servers for optimal cluster performance.