Matlen Silver

Hadoop Developer

  • Job Type: Contract W2
  • Industry Type: IT Sector
  • Industry Location: Remote or Plymouth Meeting
  • Experience: NA
  • No. of Positions: 1
  • Primary Skills: Hadoop Developer, Spark, Scala, Kafka, Hive
  • Secondary Skills: HDFS, Python, Big Data
  • Job Location: Remote or Plymouth Meeting, Pennsylvania
  • Posted Date: Today
Job Description

KNOWLEDGE AND EXPERIENCE

Responsibilities: 4-8 years of development experience in object-oriented applications, with 2-4 years of experience in Scala.

1. In-depth understanding of Hadoop and Spark architecture and their components, such as HDFS, Job Tracker, Task Tracker, executor cores, and memory parameters.

2. Experience in Hadoop development; working experience with Spark and Scala is mandatory, and database exposure is a must.

3. Hands-on experience in Spark and Spark Streaming, creating RDDs and applying transformations and actions (a minimal Scala sketch follows this list).

4. Experience in code optimization to fine-tune applications.

5. Expertise in writing Hadoop/Spark jobs for analyzing data using Spark, Scala, Hive, Kafka, and Python.

6. Experience developing large-scale distributed applications and solutions that analyze large data sets efficiently.

7. Integration with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions; experience in data warehousing and ETL processes.

8. Strong database, SQL, ETL, and data analysis skills.

9. Knowledge of maintaining log and audit information in Hive/SQL tables; experience providing logging and error handling.

10. Experience analyzing large data sets to determine the optimal way to aggregate and report on them.

11. Experience configuring Spark Streaming to receive real-time data from Apache Kafka and store the streamed data to HDFS using Scala (a streaming sketch also follows this list).

12. Experience scheduling jobs using Airflow/ESP.

13. Experience developing scripts for transformations using Scala.

14. Experience developing shell scripts to orchestrate execution of other scripts and to move data files within and outside of HDFS.

15. Experience using Kafka for publish-subscribe messaging as a distributed commit log, including its speed, scalability, and durability.

16. Ability to understand existing business logic and implement business changes.

17. Work with the business and peers to define, estimate, and deliver functionality.

18. Should have gone through the entire development life cycle, including system integration testing.

19. Provide production support for issues raised by the business; experience with production/integration release management.

20. Experience with Git, Jenkins, and test frameworks.
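
The following is a minimal, illustrative Scala sketch of the RDD work described in item 3: creating an RDD and applying lazy transformations followed by an action. The sample data, object name, and local master are assumptions for demonstration only, not details of this role's actual codebase.

    import org.apache.spark.sql.SparkSession

    object RddSketch {
      def main(args: Array[String]): Unit = {
        // Local session for illustration; a real job would target the cluster instead.
        val spark = SparkSession.builder()
          .appName("rdd-transformations-sketch")
          .master("local[*]")
          .getOrCreate()
        val sc = spark.sparkContext

        // Hypothetical input: a few raw log lines standing in for data read from HDFS.
        val lines = sc.parallelize(Seq("INFO job started", "ERROR disk full", "INFO job done"))

        // Transformations are lazy; nothing executes until an action is called.
        val errorCounts = lines
          .filter(_.startsWith("ERROR"))        // keep error lines only
          .map(line => (line.split(" ")(1), 1)) // key by the second token
          .reduceByKey(_ + _)                   // aggregate counts per key

        // collect() is an action: it triggers execution and returns results to the driver.
        errorCounts.collect().foreach(println)

        spark.stop()
      }
    }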
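
Similarly, item 11 can be pictured as the kind of job sketched below, assuming Spark Structured Streaming with the Kafka source on the classpath; the broker address, topic name, and HDFS paths are placeholders, not details from this engagement.

    import org.apache.spark.sql.SparkSession

    object KafkaToHdfsSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-to-hdfs-sketch")
          .getOrCreate()

        // Read a stream from Kafka; broker and topic names are placeholders.
        val stream = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "events")
          .load()

        // Kafka delivers values as binary; cast to string before persisting.
        val values = stream.selectExpr("CAST(value AS STRING) AS value")

        // Write the stream to HDFS as Parquet, with a checkpoint directory for fault tolerance.
        val query = values.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/events")
          .option("checkpointLocation", "hdfs:///checkpoints/events")
          .start()

        query.awaitTermination()
      }
    }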



Soft skills:

1. Strong communication skills to work effectively with the business.

2. Good logical reasoning and analytical abilities; recommend changes to client software based on client feedback, language changes, and new technology.

3. Experience working with remote teams.

4. Work to become a trusted partner of the architects through regular engagement and proposals on best development practices that can be implemented and improve the process.

5. Excellent performance and technical skills; attend required training and support peers by sharing knowledge and mentoring.

6. Take up training programs within your team to ensure understanding of BDF.



Agile skills:

1. Experience with Agile development methodology.

2. Create the required technical tasks in the backlog and update their status regularly.

3. Participate in all Agile ceremonies, update task status, and adhere to timelines and quality standards for each task.



TECHNICAL SKILLS:

- Hadoop/Big Data

- HDFS

- Scala

- Spark

- Hive

- Python

- Kafka

- Zookeeper

- Shell Scripting

- Java experience is a plus

- Git version control, Jenkins, and scheduling tools (Airflow & ESP)
