Roles and Responsibilities
- Hands-on experience developing data pipelines using AWS serverless services such as Lambda, Glue ETL, Step Functions, SQS, SNS, and API Gateway is required.
- Experience writing ETL jobs using Python, PySpark, Scala, Java, Parquet, S3 file partitioning, Athena, DMS, Boto3, AWS Glue Crawler, and the Glue Data Catalog.
- Experience developing ETL for AWS Data Lake or Data Lakehouse architectures.
- Must have worked on at least one Data Lake/Data Warehouse project end to end.
- Ability to write DDL/DML/DQL in Oracle and PostgreSQL.
- Experience migrating large volumes of data to the AWS Cloud.
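As a brief illustration of the S3 file partitioning skill listed above, the sketch below builds a Hive-style partition prefix of the kind Athena and the Glue Data Catalog can read. The `partition_prefix` helper and the `orders` table name are hypothetical examples, not part of any specific project stack.

```python
from datetime import date

def partition_prefix(table: str, d: date) -> str:
    """Build a Hive-style S3 key prefix (year=/month=/day=) for a table.

    Athena and Glue crawlers recognize this layout and expose the path
    segments as partition columns, so queries can prune by date.
    """
    return f"{table}/year={d.year}/month={d.month:02d}/day={d.day:02d}/"

# Example: partition prefix for the 5th of January 2024.
print(partition_prefix("orders", date(2024, 1, 5)))
# orders/year=2024/month=01/day=05/
```

Writing Parquet files under such prefixes lets Athena skip irrelevant partitions, which matters at the data volumes this role involves.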
AWS Tools
AWS S3, Athena, Glue ETL, Lambda, API Gateway, CodeStar, CloudWatch, Step Functions, SNS (Simple Notification Service), Cloud9, Kinesis, CloudTrail, Audit Manager, Security Hub, CodeCommit, CodeBuild, CodePipeline, CodeDeploy, CloudShell, EFS, DMS, EC2
Desired Candidate Profile
4-9 years of experience
Immediate joiners preferred; must be able to join within 15-20 days
Perks and Benefits
Best in industry