Lowe's

Data Engineer Hadoop Admin

  • Job Type: Full Time
  • Industry Type: IT Sector
  • Industry Location: Bangalore/Bengaluru
  • Experience: 2-5yrs
  • No. of Positions: 10
  • Salary Range: 2-6 lac
  • Primary Skills: Airflow, Hadoop Admin, Open Source, Cloudera Hadoop architecture, HDFS, Kafka, Impala, Grafana, YARN, HBase
  • Secondary Skills: Druid, Hive, GCP, ZooKeeper, Hue, Sentry, Spark, Hadoop Administrator
  • Job Location: Bangalore/Bengaluru
  • Posted Date: 385 days ago
Job Description

Roles and Responsibilities

We have an opportunity at Lowe's India; please find the details below.

Hadoop Platform Administrator:

Roles and Responsibilities:

Guide the team on technical issues and help them implement new open-source technologies such as Airflow, Grafana, and Druid

Configure and maintain the Kubernetes cluster

Maintain GCP and Hadoop infrastructure: monitor services, manage configurations, and handle operating system patching and updates

  • Work with various internal engineering teams and Cloudera Support on infrastructure design and operational issues.
  • Plan and implement Cloudera Hadoop software upgrades
  • Ensure availability and validity of system backups and disaster recovery preparedness.
  • Monitor performance, collect and present usage metrics
  • Develop and implement policies and procedures to manage all aspects of Hadoop administration, including security standards, workload management, availability monitoring and alerting.
  • Proactively identify potential issues and bottlenecks across hardware, software, batch, and user processes
  • Support the data engineering solutions team and analytics users on various project needs and issues
  • Contribute to Hadoop cluster capacity planning and roadmap

Experience

  • 2-4 years of software/DB/system admin experience, with 3-4 years of experience administering Hadoop clusters
  • 1-2 years of experience in managing open source products
  • 2-3 years of experience in managing large Linux / DB systems
  • 2-3 years of exposure to data processing applications using HDFS, Java MapReduce, Hive/Tez, HBase, Pig, Sqoop, Falcon, Oozie, and Flume.
  • 2-3 years of scripting (Bash, Perl, Python, etc.) and DevOps experience

Technical Skills

  • Good understanding of Kubernetes and GCP
  • Thorough understanding of Cloudera Hadoop architecture and services such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, and ZooKeeper.
  • Experience in designing large-scale Hadoop clusters, adding/removing nodes, tuning memory/CPU and workload management parameters, and installing software components
  • Experience using administrative tools such as Ambari and troubleshooting various service logs
  • Good understanding of Linux administration in terms of access setup, security, storage, etc.
  • Strong performance monitoring and performance tuning skills.
  • Experience managing cloud IaaS and PaaS Hadoop/data components in Google Cloud or Azure is an added advantage
  • Bachelor's degree in Computer Science, CIS, or a related field and 8+ years of IT experience