Synechron

Hadoop Developer

  • Job Type: Full Time
  • Industry Type: IT Sector
  • Industry Location: Charlotte
  • Experience: NA
  • No. of Positions: 1
  • Primary Skills: Hadoop, Hive and Hive-based reporting, Scala/Java, Spark, MongoDB, REST API
  • Secondary Skills: Kafka pub/sub, microservices, file formats (Parquet, Avro), performance tuning
  • Job Location: Charlotte, North Carolina
  • Posted Date: Today
Job Description

We (Synechron, Inc.) are looking to hire for the role of Hadoop Developer. This is a long-term role based in Charlotte, NC.

 

About Synechron:

Synechron is one of the fastest-growing digital, business consulting, and technology firms in the world. Headquartered in New York with 22 offices around the world, Synechron is a leading digital transformation consulting firm working to accelerate digital initiatives for banks, asset managers, and insurance companies worldwide. Synechron uniquely delivers end-to-end digital, consulting, and technology capabilities to these firms, with expertise in wholesale banking, wealth management, and insurance, as well as emerging technologies like Blockchain, Artificial Intelligence, and Data Science. This has helped the company grow to $650 million+ in annual revenue and 10,000+ employees, and we’re continuing to invest in research and development in the form of Accelerators (prototype applications) developed in our global Financial Innovation Labs (FinLabs).

 

Learn more at: http://synechron.com/technology

 

Role: Hadoop Developer

Location: Charlotte, NC

Long Term Project

FTE/Contract

 

Top Skills:

Hadoop

Hive and Hive-based reporting

Scala/Java

Spark

MongoDB

REST API

 

Kafka => pub/sub

Micro-services

File formats => Parquet, Avro, etc.

 

Performance tuning of Hadoop and Spark clusters/frameworks.

Python: good to have

 

** Candidates certified in Spark/Scala are preferred.

 

· Requires 7+ years of development experience.

· Experience developing and enhancing conceptual, logical and physical data models.

· Experience with Python and Scala is a must.

· Experience working with Hadoop and its ecosystem (e.g., Hive, PySpark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, S3, etc.)

· Experience working as part of an Agile team and using Agile SDLC tools (Jira, etc.)

· Ability to partner with other functional areas to ensure execution of design, development, testing, debugging, and documentation of applications

· Ability to develop awareness of the business function for which the application is being designed, in order to drive out detailed requirements and form effective partnerships

· Strong analytical ability, independent problem solving, and good communication skills

· Demonstrated ability to learn new technologies and deliver in a fast-paced Agile environment

· Expert knowledge and experience in software development methodologies and industry practices

· Expert understanding and experience with building and deploying enterprise applications

· Ability to work in a matrix environment with minimal supervision

 

 

Thanks & Regards

Vikarant Kumar
