Job description
Hi,
We are from Nixsol India Pvt Ltd.
We have an opening for Oracle.
Mode: Contract to Hire, initially 6 months (extendable). You will be on the payroll of Nixsol India Pvt Ltd; after that, based on your performance, you will be converted to the client's payroll.
Roles and Responsibilities
Designing and implementing high-performance data ingestion pipelines from multiple sources using Azure Databricks and Apache Spark
Hands-on experience designing and delivering solutions on the Azure data analytics platform, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics
Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
Delivering and presenting proofs of concept of key technology components to project stakeholders.
Developing scalable and reusable frameworks for ingesting data sets
Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times.
Working with other members of the project team to support delivery of additional project components (API interfaces, Search)
Working within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
Experience using geospatial frameworks on Apache Spark and associated design and development patterns
Strong knowledge of Data Management principles.
Working with event-based / streaming technologies to ingest and process data.
Microsoft Azure Big Data Architecture certification (preferred)
Desired Candidate Profile
Experience: 5-8 years of relevant experience
Notice period: less than 30 days.
Perks and Benefits