- Commercial experience leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / NoSQL DBs)
- 3+ years of experience working on projects in the cloud, ideally AWS or Azure
- 5+ years of experience working with streaming architectures and technologies such as Kafka, Kinesis, Flink, or Confluent
- Experience with open source tools like Apache Airflow and Griffin
- Experience with DevOps and DataOps patterns and tools like Jenkins, Kubernetes, Docker, and Terraform
- Data Warehousing experience with cloud products like Snowflake, Azure DW, or Redshift
- Experience building operational ETL data pipelines across multiple sources and constructing relational and dimensional data models
- Experience with one or more ETL/ELT tools like Talend, Matillion, FiveTran, or Alooma
- Experience building automated data quality and testing into data pipelines
- Experience with AI, NLP, Machine Learning, etc. is a plus
- Strong development background with experience in at least two scripting, object-oriented, or functional programming languages, e.g. SQL, Python, Java, Scala, C#, or R
- Experience working on fast-paced projects in a consulting setting, often contributing to multiple projects at the same time
- Excellent interpersonal skills when interacting with clients in a clear, timely, and professional manner
- A deep personal motivation to always produce outstanding work for your clients and colleagues
- Excel in team collaboration and in working with others from diverse skill sets and backgrounds