USEReady

Cloud/Hybrid Big Data Architect

  • Job Type: Full Time
  • Industry Type: IT Sector
  • Industry Location: Remote
  • Experience: NA
  • No. of Positions: 1
  • Primary Skills: Cloud, Big Data, Python, Data Visualization
  • Secondary Skills: Architect, Hadoop
  • Job Location: Remote, Texas
  • Posted Date: Posted today
Job Description



USEReady was ranked #113 in Inc. 5000, an exclusive list of America's fastest-growing private companies, and named a Red Herring Top 100 North America winner. The company was also announced as the 'Service Partner of the Year' by Tableau at the 2015, 2016, 2017, 2018 and 2019 annual Partner Summits. We also received 'Marketing Partner of the Year' 2018 from Tableau. Some of the key highlights of our company are as below:

* Ranked #113 in Inc. 5000

* Red Herring Top 100 North America Winner

* Tableau Gold Partner for 5 consecutive years

* Tableau Services Partner of the Year 2015, 2017, 2019, 2020

* Informatica Partner

* Received award for Best Marketing Partner with Tableau

* Tableau Services & Training and Alliance Partner nominee of the year 2016

* Alteryx, Collibra, Snowflake, AWS & Micros


Job Title: Cloud/Hybrid Big Data Architect

Job Location: New Jersey/Remote

Job Type: Full-time

Travel: 40%


Responsibilities:
Work within a Big Data and cloud consulting delivery technical team to drive clients' enterprise technology transformation for Big Data solutions on cloud (AWS, Azure, Google Cloud Platform, Snowflake, and other leading providers), on-premises, and hybrid environments. Lead technical solution architecture and implementation for end-to-end Big Data ecosystems or core components such as data migration and transformation pipelines, data integration and virtualization, cloud and Hadoop data lakes, data warehouses/marts, Spark, machine learning, data science, and BI.

 

 

 

Job Overview

We are looking for a savvy Data & Cloud Architect to join our team of Cloud & Big Data experts. The candidate must be self-directed and comfortable providing advisory, architecture, and hands-on engineering consultation for the Big Data needs of multiple teams, systems, and solutions. The right candidate will be excited by the prospect of optimizing or even re-designing our clients' data architecture to support the next generation of products and data initiatives.

 

Experience Requirements

* 5+ years of consulting experience and demonstrated client affinity

* 10+ years of proven experience in architecture and hands-on engineering of complex technical solutions on Big Data platforms for cloud (AWS, Google Cloud Platform, Azure, Snowflake) and on-premises deployments at large enterprises or multinational organizations, preferably in one or more industries such as Financial Services, Travel, Media, or Retail

* Extensive experience in data warehouse modeling techniques (Dimensional, Data Vault 2.0, etc.) for OLAP and OLTP systems

* 5+ years of SQL, ETL/ELT, and data transformation experience

* 5+ years of analytical, problem-solving, data analysis, and research experience

* Extensive experience building data integration pipelines for streaming, bulk, API, and structured/unstructured data sets across large enterprise environments

* 3+ years of development experience in the data science space, including machine learning and deployment of ML models as APIs and microservices

* Experience designing and implementing Big Data ecosystems with controls for complex risk, compliance, and security regulations; scalable/self-healing infrastructure; authentication/identity; auditing; crypto and key management; logging/monitoring; IT service management integrations; etc.

* Demonstrable ability to think outside the box and not be dependent on readily available tools

* Excellent communication, presentation, and interpersonal skills

 

 

 

Skills Required

* Expertise in one or more cloud platforms (certification preferred): AWS, Azure, Google Cloud Platform

* Expertise in Big Data tools: Hadoop, Spark, Kafka, Storm, etc.

* Expertise in one or more data warehouse tools: Snowflake, Redshift, BigQuery, etc.

* Expertise in data warehouse modeling techniques, including but not limited to Dimensional and Data Vault 2.0, for OLAP and OLTP systems

* Expertise in SQL and NoSQL databases: RDS, DynamoDB, MongoDB, etc.

* Expertise in data pipeline, data integration, data virtualization, and workflow management tools such as SnapLogic, Denodo, Talend, Informatica Cloud, etc.

* Machine learning and data science platforms and solutions: TensorFlow, AutoML, SageMaker, Alteryx, etc.

* One or more object-oriented or scripting languages: Python, Java, C++, Scala, R, etc.

* Infrastructure as Code: AWS CloudFormation, Terraform, Azure Resource Manager, etc.

* Experience designing and engineering cloud infrastructure, security, and connectivity controls for deploying and managing data systems in the cloud

 

 

 

 
