NTT DATA, Inc.

Hadoop Developer

  • Job Type: Contract W2
  • Industry Type: IT Sector
  • Industry Location: Charlotte
  • Experience: NA
  • No. of Positions: 1
  • Primary Skills: HADOOP BIG DATA
  • Secondary Skills: SPARK
  • Job Location: Charlotte, North Carolina
  • Posted Date: Today
Job Description
Position Summary:
Designs, builds, and tests application programs in various software technologies, in accordance with specified business needs and in a way that achieves the development goals of assigned projects.
Application developers typically work as part of a project team and communicate progress, technical issues, and their resolution. This role is primarily focused on Big Data and analytics technologies.
Key Duties & Responsibilities:
" Interpret written business requirements, functional requirements and technical specification documents to design and develop technical solutions that meet business needs
" Collaborate with IT and Business partners to design, develop, and troubleshoot end to end technical solutions
" Perform system design and specification development, program logic and flow-charting that meets the stated project objectives
" Perform coding to written technical specifications
" Perform complex defect verification, debugging, testing and support
" Investigate, analyze and document reported defects
" Create and maintain technical documentation using defined technical documentation templates that meet SDLC standards
Required Skills:
" Extensive knowledge and understanding of Hadoop and Spark eco system
" Ability to design solutions with appropriate Hadoop components.
" Knowledge of different data ingestions into Hadoop, like Sqoop, Kafka, Distcp
" Experience is building data processing workflows using Oozie, shell scripting
" Experience in different processing frameworks - Map Reduce and Spark
" Experience in data transformation technologies in Hadoop like java map reduce, Hive, Pig
" Experience in performance tuning of Hive and Map Reduce.
" Knowledge of different file formats like Avro, Parquet, JSON etc.
" Experience with SQL data stores for fast web access like Impala
" Experience with NoSQL databases like HBase
" Experience in Spark Core, Spark Streaming and Spark SQL API
" Strong knowledge in core Java.
" Expert in writing complex Oracle SQL, Stored Procedure & Packages.
" Ability to troubleshoot performance issues and perform query tuning.
" Comprehensive understanding of complex software and information technology solutions and ability to apply to client needs
" Experience in business process definition, requirements definition and technology solution research
" Strong interpersonal, oral, presentation, and written communication skills
" Ability to understand end-audience needs and requirements
" Requirements definition and design specification
" Experience working with Agile Methodology
" Good experience with code reviews
" Excellent team player with good organizational, communicational, analytical and logical skills.
Desired Skills:
" Shell scripting in UNIX (AIX, Client-UX) , Autosys scheduler scripting
" Exposure to machine learning libraries
" Exposure to analytical tools in Java, Python, R
" Ability to generate, understand the output of SQL Explain Plan
" Good experience with service oriented architecture and REST, JSON, XML, SOAP
" Financial Services experience will be a plus
" Experience working with Agile Methodology
" Experience in using RTC, code quality tools, defect tracking tools
Job Requirements And Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or Programming
- 4-7 years of Hadoop experience and 5-10 years of overall IT experience