Primary Skills: Azure, GCP, Build Cloud Data Analytics
Secondary Skills: Strong Analytical Skills, AWS
Job Location: Navi Mumbai
Posted Date: 384 days ago
Job Description
Roles and Responsibilities
The Data Engineer will play an important role in enabling the business for data-driven operations and decision making in an Agile, product-centric IT environment.
Responsibilities:
Design and build solutions to ingest data from a variety of sources into data and analytics platforms, including cloud platforms such as AWS, GCP, and Azure
Build data pipelines that clean, transform, and aggregate data from disparate sources, applying strong analytical skills to structured and unstructured datasets
Develop and optimize scalable data pipelines, architectures, and data sets, and build new API integrations
Build processes to support data mining, data transformation, data structures, data modelling, metadata, and workload management
Create custom software components and tools, using languages such as R and Python, to merge different systems together and develop a strong analytics infrastructure
Support Data Scientists in building analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
Enable delivery of business solutions on the analytics platform, including data security, governance, cataloging, preparation, automated testing, and data quality metrics.
Automate, optimize, migrate and enhance existing solutions.
Provide operational excellence, guaranteeing high availability and platform stability
Collaborate with analytics and business teams to improve data models, increasing data accessibility and fostering data-driven decision making.
Experience
Total experience of 4-9 years
4+ years of experience in data engineering, including data ingestion, preparation, provisioning, automated testing, and quality checks.
3+ years of hands-on experience with Big Data cloud platforms such as AWS and GCP, data lakes, and data warehouses
3+ years of experience with Big Data and analytics technologies; experience with SQL and writing code for the Spark engine in Python, Scala, or Java; experience with Spark, Scala, and Kafka
Experience in designing, building, and maintaining ETL systems like SAP BW
Experience with data pipeline and workflow management tools (such as Azkaban, Luigi, and Airflow)
Application development background, along with knowledge of analytics libraries, open-source natural language processing, and statistical and big data computing libraries
Familiarity with visualization and reporting tools such as Tableau, Salesforce Einstein Analytics, and QlikSense