DataOps Engineer

Explorium is a cutting-edge data science company that has recently closed a Series B round, bringing its total funding to $50 million.

Explorium offers a first-of-its-kind data science platform powered by augmented data discovery and feature engineering. By automatically connecting to thousands of external data sources and leveraging machine learning to distill the most impactful signals, the Explorium platform empowers data scientists and business leaders to drive decision-making by eliminating the barriers to acquiring the right data and enabling superior predictive power.

Responsibilities:

  • Help us build one of the most complex knowledge systems in the world.
  • Work closely with engineering to build Explorium’s in-house data infrastructure, working on low-latency, high-throughput production systems.
  • Work closely with DevOps in managing infrastructure, from POC to production.

Requirements:

  • Deep understanding of the modern data stack: orchestration, distributed systems, and cloud infrastructure and services, mainly AWS and GCP.
  • Hands-on production experience with Elasticsearch - a must.
  • Hands-on experience monitoring production-critical systems.
  • Experience with data-intensive systems, with a focus on K8s, Kafka, Spark, and the like.
  • Ability to define and manage DevOps workflows.
  • Experience with Python applications - a plus.

Apply for this role
