Major Duties & Responsibilities for Data Engineer-Cloud:
- Develop, construct, test and maintain ETL/ELT architectures
- Align architecture with business requirements
- Data acquisition and ingestion for data at rest or streaming data
- Develop data set processes
- Use programming languages and tools
- Identify ways to improve data reliability, efficiency and quality
- Conduct research for industry and business questions
- Use large data sets to address business issues
- Possibly deploy sophisticated analytics programs, machine learning and statistical methods
- Prepare data for predictive and prescriptive modeling
- Find hidden patterns using data
- Use data to discover tasks that can be automated
- Deliver updates to stakeholders based on analytics
Other Hashmap Responsibilities:
- Must be an excellent communicator, providing status updates and raising issues that might impact projects to project or account management
- While not engaged on client work, work on internal projects and learning exercises to:
  - Extend your technical knowledge and skills by working with the standard stack and new tools
  - Work on internal technology aimed at helping build new technical practices
- Participate in assigned training and obtain certifications as identified by the ITT department and/or the Hashmap Professional Growth Manager
Skill Sets for Role:
Soft Skills Required:
- High level of personal initiative and energy
- Very good verbal and written communication skills
- Capable of working autonomously with minimal to no daily oversight
- Strong experience leading a team to deliver high-quality software solutions
- Must raise issues to Hashmap or client management as required, clearly defining the issue and potential solutions
- Common sense approach to problems
- Capable of adapting written processes when required and capturing improvements to the process for later review
Technical Skill Sets (Mandatory):
- Advanced experience implementing cloud data warehouses; Snowflake, AWS Redshift, Azure SQL Data Warehouse, or Google BigQuery experience considered
- Advanced experience building data pipelines using a variety of technologies, e.g. Azure Data Factory, Matillion, dbt, Python, and Spark
  - Candidates must have direct work experience, documented on their resume, developing and deploying at least one project implementing a full cloud data pipeline
- Advanced understanding of SQL (at least 4 years)
- Must be well versed in software engineering principles, including object-oriented programming and SOLID
- Excellent development experience with Python (3-4 years)
- Strong ability to gather and track requirements
- Must have experience with DevOps, i.e. Git, CI/CD, and IaC
- Experience with Docker containers, including containerizing services and deploying containers in a cloud environment
Knowledge of the following would be an asset:
- Open source contributions of any sort
- Experience with BI platforms such as Spotfire, Tableau, PowerBI, Zoomdata, SuperSet, etc.
- Data cataloging and governance techniques
- Strong Agile methodology knowledge
Travel Expectations:
- Candidates must be available to travel up to two weeks per month on a regular basis, with occasional periods of two to three weeks per month based on specific client requirements.
Major Duties & Responsibilities for Data Engineer-ETL:
- Develop, construct, test and maintain ETL/ELT architectures in the cloud.
- Work with and assist the Architects to ensure development work aligns with business requirements.
- Data acquisition and ingestion for ETL data pipelines using frameworks such as HVR, Attunity, Debezium, etc.
- Utilize ETL tools, e.g. Informatica, DataStage, Fivetran, dbt, or Matillion, to build and deploy data ingestion and transformation pipelines in the cloud; one or more ETL tools will be used to construct cloud ETL pipelines.
- Use enterprise orchestration tools, e.g. CA7, Control-M, Airflow, etc., to schedule ETL jobs.
- Perform migration from ETL to ELT paradigm for data pipelines.
- Translate ETL/ELT source-to-target documentation and port pipelines from one ETL/ELT tool to another.
- Will be responsible for conversion and/or development of ETL/ELT packages using various available technologies.
- Identify ways to improve data reliability, efficiency and quality
- Deliver updates to stakeholders based on analytics
- Ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation, and problem resolution
- Analyze and translate functional specifications and change requests into technical specifications
- Provide recommendations to project teams on facets of a data engineering project such as security, lineage, data governance, and data quality.
Other Hashmap responsibilities:
- Must be an excellent communicator, providing status updates and raising issues that might impact projects to project or account management
- While not engaged on client work, work on internal projects and learning exercises to:
  - Extend your technical knowledge and skills by working with the standard stack and new tools
  - Work on internal technology aimed at helping build new technical practices
- Participate in assigned training and obtain certifications as identified by the ITT department and/or the Hashmap Professional Growth Manager
Skill Sets for Role:
Soft Skills Required:
- High level of personal initiative and energy
- Excellent verbal and written English communication skills
- Capable of working autonomously with minimal oversight
- Willing to raise issues to Hashmap or client management as required, clearly defining the issue and potential solutions
- Common sense approach to problems
- Capable of adapting written processes when required and capturing improvements to the process for later review
- Ability to adapt to a fast-paced environment with a high tolerance for change
- Comfortable working in a virtual environment (i.e. conference calls)
- Self-motivated, autonomous, curious, analytical, and results-oriented
- Demonstrated problem-solving skills
- Ability to influence internal and external teams
- Confident, with demonstrated strong leadership
Technical Skill Sets (Mandatory):
- Some experience with cloud data warehouses: Snowflake, AWS Redshift, Azure Synapse, or Google BigQuery (2-4 years)
- Experience implementing (developing and deploying) ETL solutions using technologies such as Informatica, Talend, Pentaho, DataStage, or SSIS, as well as more cloud-based tooling, e.g. AWS Glue, Azure Data Factory, Google Dataflow, dbt, Fivetran, or Matillion, targeting Oracle Exadata, Netezza, or Teradata and Snowflake, BigQuery, Redshift, or Synapse. One or more of these representative technologies (or a combination thereof) is required.
- Excellent proficiency in SQL (6-8 years), with advanced knowledge of performance tuning, troubleshooting, and constructing performant data pipelines
- Experience writing Python-based scripts where needed
- Good experience with data warehousing fundamentals, e.g. star schemas, slowly changing dimensions, change data capture, late-arriving foreign keys, data pipeline parallelism, and optimized target load strategies (4-8 years)
- Experience gathering and tracking requirements
- Agile methodology knowledge or participation in an Agile project
- Drive for continuous process improvement
- Experience with data catalogs, data lineage, and cloud-based data security frameworks, e.g. Alation, Amundsen, Privacera, Terraform, Collibra, etc.
- Experience with modern DataOps principles for developing data pipelines.
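To make one of the warehousing fundamentals above concrete, here is a minimal, purely illustrative sketch of a Type 2 slowly changing dimension update (close the current row, insert a new current row). All table and column names (`dim_customer`, `is_current`, etc.) are hypothetical, and SQLite stands in for the cloud warehouse:

```python
import sqlite3

# Hypothetical customer dimension with SCD Type 2 bookkeeping columns.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2023-01-01', NULL, 1)")

def scd2_update(cur, customer_id, new_city, change_date):
    """Expire the current row and insert a new current row (SCD Type 2)."""
    cur.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (change_date, customer_id, new_city),
    )
    if cur.rowcount:  # insert only when the attribute actually changed
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )

scd2_update(cur, 1, "Denver", "2024-06-01")
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
# rows -> [('Austin', 0), ('Denver', 1)]: history preserved, one current row
```

In practice the same pattern is expressed as a `MERGE` or dbt snapshot against the warehouse rather than row-by-row statements; this sketch only shows the bookkeeping logic.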
Knowledge of the following would be an asset:
- Open source contributions of any sort
- Knowledge of or experience with a variety of technologies, e.g. Matillion and dbt
- Nice to have: at least one project implementing a full cloud data pipeline, documented on the resume
- Data cataloging and governance techniques
Travel Expectations:
- Remote position - no travel is expected on a regular basis
*We have multiple openings at the Associate, Senior, and Lead levels.