Primary Skills: Amazon Web Services, Apache Hadoop, Apache Kafka, Apache Spark, Apache Airflow, Big data, Computer science
Secondary Skills: Data engineering, ETL, Google Cloud, RDBMS, Python, NoSQL
Job Location:
New York, New York
Posted Date:
Posted today
Job Description
We’re looking for premier Data Engineers to focus on data ingestion and transformation through web scraping.
In this role you will have the opportunity to:
Develop solutions that enable investment professionals to efficiently extract insights from data. This includes owning the ingestion (web scrapes, S3/FTP sync, sensor collection), transformations (Spark, SQL, Kafka, Python/C++/R), and interface (API, schema design, events)
Build tools and automation capabilities for data pipelines that improve the efficiency, quality and resiliency of our data platform
Drive the evolution of our data strategy by challenging the status quo and identifying opportunities to enhance our platform
Skills
Data engineering focus
Significant experience with data engineering for web scraping
Google Cloud experience
Big Data
Apache Airflow, AWS/Azure, Jupyter, Kafka, Docker, Nomad/Kubernetes, or Snowflake
Python
RDBMS, NoSQL, distributed compute platforms such as Spark, Dask or Hadoop