Greetings from WB Solutions!
We have an urgent requirement for a "Sr Cloud Data Engineer" at St. Louis, MO - Long-Term Contract
Job Title: Sr Cloud Data Engineer
Location: St. Louis, MO
Duration: Long-term
JD:
Sr Cloud Data Engineer (GCP, Java, and NoSQL)
Description:
The client is looking for a creative, high-energy, diverse, and driven Data Engineer with hands-on development skills to work on a variety of meaningful projects with one of the leading financial services organizations in the US. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
The Senior Data Engineer will be responsible for analyzing existing on-prem databases and storage, and will play a hands-on role expanding and optimizing data quality, data pipelines, and architecture, as well as optimizing data flow and collection in the GCP ecosystem. The Data Engineer will support a group of Architects, Technical Leads, Software Engineers, and DBAs on data initiatives, and will ensure that data quality and delivery remain consistent and secure throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The Data Engineer will work with IDH staff, the Project Manager, and other members of the project team to meet data-related objectives.
The candidate must possess excellent written and verbal communication skills, with the ability to collaborate effectively with domain and technical experts on the team.
Roles & Responsibilities:
- Design and perform data pipeline implementation and production support tasks such as data migration, ETL, validation, and conversion.
- Evaluate and/or implement changes in production transactional systems to support ETL data pipelines as needed.
- Build or configure the infrastructure required for optimal ETL of data into an analytics environment from a variety of data sources using SQL and Cloud data technologies.
- Monitor and maintain live data ingestion, data pipelines, and ETL.
- Understand best practices for data separation, security, and disaster recovery across multiple data centers and Cloud regions.
- Provide insight into system data quality, and recommend and implement solutions to resolve data quality issues.
- Perform basic analysis to transform data into actionable information for use by the stakeholders.
- Evaluate data analysis and visualization tools and use these tools or assist analysts in developing recurring and ad-hoc reports/dashboards.
- Build, configure, or install analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Required Skills and Experience:
- Experience with implementation and design of non-relational NoSQL and graph databases such as Cassandra, Neptune, or Neo4j.
- Experience with open-source data pipeline and workflow management tools
- Experience with design and implementation of open-source databases
- Experience with big data tools such as Hadoop and/or Spark
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
- Experience with data warehouse or data lake creation
- Experience evaluating mainly open-source database and data pipeline tools, and designing data sources, custom loading programs, and/or pipelines
- Experience building and optimizing ETL data pipelines, architectures, and data sets in GCP, AWS, or open-source ecosystems
- Experience in production support of transactional data sources and ETL data pipelines
- Ability to design database tables and schemas to support analytics operations or facilitate ETL
- Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- Demonstrated capabilities using data analysis and visualization tools and methodologies.
- Experience with data management and cleansing of operational and analytical data stores, maintaining data quality
- You enjoy mentoring other engineers, having a voice in defining our challenging technical culture, and helping to build a fast-growing team (depending on level)
- Report any issues to client stakeholders and UST leadership
Tech Stack: NoSQL, GCP (Google Cloud Platform), BigQuery, ETL, Java, J2EE (pipeline experience would be a plus). Other tools: Jira, Aha!, Eclipse, Python scripting.