Senior Big Data Developer

Department: Engineering
Location: Herzliya, Israel

Description

Upstream is looking for a Senior Big Data Developer with a strong technical background in data infrastructure to join our engineering department.

You will be part of a growing team, responsible for building robust, scalable and accessible data solutions.

As a Senior Big Data Developer, you will design, develop, and implement core components such as our data lake, data pipelines, and orchestration tools using modern big data architectures and frameworks.

The infrastructure you develop will enable data consumers across the company to produce data insights, implement ML models, and query and process our data in a scalable, efficient manner.

This role is full-time and is based in Herzliya, Israel.

Responsibilities

  • Build and expand our foundational data infrastructure, including our data lake, analytics engine, data streaming and batch processing frameworks.
  • Create robust infrastructure to enable automated pipelines that will ingest and process data into our analytical platforms, leveraging open-source, cloud-agnostic frameworks and toolsets.
  • Develop and maintain our data lake and data warehouse layouts and architectures for efficient data access and advanced analytics.
  • Build our ML platform and automate the machine learning lifecycle.
  • Drive business-wide ML projects from an engineering perspective.
  • Develop and manage orchestration tools, governance tools, data discovery tools, and more.
  • Work with other members of the engineering group, including data architects, data analysts, and data scientists, to deliver solutions using a use-case-based approach that drives the construction of technical data flows.

Requirements

  • 3+ years of experience designing and building data pipelines, analytical tools, and data lakes.
  • 5+ years of development experience using a general-purpose programming language (Java, Scala, Kotlin, Go, etc.).
  • Experience with the data engineering tech stack: ETL & orchestration tools (e.g. Airflow, Argo, Prefect) and data processing tools (e.g. Spark, Kafka, Presto).
  • Experience with Python is a must.
  • Experience working with open-source products - Big advantage.
  • Experience working in a containerized environment (e.g. Kubernetes) - Advantage.

Upstream is an equal opportunity employer. All candidates for employment will be considered without regard to race, color, religion, sex, national origin, physical or mental disability, veteran status, or any other basis protected by applicable federal, state or local law.