Position: Cloud Data Architect
Location: United States (Remote)
Hashmap, an NTT DATA Services Company, advises on, architects, selects, and implements solutions that help clients run their businesses efficiently, with a focus on data analytics in the cloud. Our service areas span IoT, solution architecture, data engineering, analytics, AI, ML, and DevOps.
Major Role Responsibilities:
- Leadership Qualities:
- Leading team members in developing applicable methodologies, tools, approaches, strategies, points of view, and accelerators to differentiate Hashmap in the data analytics and cloud market
- Delivering technical strategies aligned with business objectives
- Cloud Architecture:
- Must have sound cloud architecture knowledge to design robust solutions that encompass technology, strategy, culture, and process
- Must be able to provide fit-for-purpose solutions integrating various technology areas (e.g., data acquisition, transformation, tuning, governance, management, automation, and business intelligence topologies)
- Ability to understand current-state architectures and existing tech stacks; knowledge of best practices, cloud market trends, and the overall industry in order to provide thought leadership (seminars, whitepapers, etc.) and mentor the team to build the necessary competency
- Develop and execute a cloud migration plan for existing applications
- In-depth knowledge of cloud and on-premises computing platforms, their many services, and dimensions of scalability
- Proven experience assessing clients' workloads and technology landscapes for cloud suitability, and developing a business case and cloud adoption roadmap
Technical Skill Sets (Mandatory):
- Hands-on experience with Azure, AWS, or Google Cloud Platform and an excellent understanding of their ecosystems; certification as a cloud architect is preferred
- Expert-level knowledge of implementing cloud data warehouses such as Snowflake, AWS Redshift, Azure SQL Data Warehouse, or Google BigQuery
- Designing highly available applications with responsibility for cost, performance, security, and compliance
- Experience working with applications backed by cloud services
- Oversee cloud buildout based upon recommended or prescribed architecture
- Ensuring critical system security using best-in-class cloud security solutions
- Comfortable with working extensively in Linux as well as with cloud-vendor UIs
- Experience with all facets of data pipeline construction: ETL and real-time ingestion, data management, and consumption architectures and tooling (e.g., BI, microservices)
- Experience with Snowflake, Databricks, or Azure Synapse
- Able to integrate a combination of cloud-vendor and third-party services
- Expert-level knowledge of building data pipelines using a variety of technologies; experience with Azure Data Factory, Matillion, dbt (data build tool), Python, and Spark is a plus
- Candidates must have direct work experience, documented on their resume, developing and deploying multiple projects that implemented a full cloud data pipeline
- Strong knowledge of database development tools
- Proficiency in Python or Java/Scala programming is a plus
- Understanding of data engineering design patterns for data acquisition, transformation, and presentation across various paradigms (e.g., real-time, batch, NoSQL/SQL, SaaS offerings, and others)
- Architecture Knowledge
- Strong knowledge of databases
Knowledge of the following would be an asset:
- Open-source contributions of any sort
- Experience with BI tools such as Tableau, Spotfire, Power BI, Looker, etc.
- IoT understanding from connectivity and data management perspective
- Certifications (AWS/Azure/Google Cloud Platform) and prior consulting experience highly preferred
- Data cataloging and governance techniques
- Strong agile methodology knowledge