DevJobs

Infra Data Engineer

Overview
Skills
  • Java
  • Python
  • Scala
  • SQL
  • Kafka
  • Spark
  • NoSQL
  • Snowflake
  • Airflow

Artlist is where the creative arts meet technology. If there’s one thing we all have in common, it’s a love of music and film, which is why we build innovative products to help global brands and individual creators make amazing videos.

We do this by giving them the best music, footage, sound effects, and templates around. We also revolutionized the industry with a radical new music licensing model that has since become the global standard.

Artlist is now the go-to platform for over 26M users worldwide. They range from top-tier global brands like Google, Apple, Amazon, Microsoft, and Calvin Klein to social creators, video editors, and more.


Our products:

  • Artlist: an all-in-one platform for video creation, including high-quality and curated royalty-free music, SFX, footage, templates, plugins, and more.
  • Motion Array: the ultimate destination for creators, including high-quality video templates by the world’s top motion designers, presets, plugins, music, SFX, stock footage, graphics for design, motion graphics, and stock photos.


We are looking for an Infra Data Engineer to join our Data department in Tel Aviv.


As a Senior Infra Data Engineer, you’ll join our data engineering team, working alongside software engineers, backend engineers, frontend engineers, and the data team.

In this crucial role, you will take part in building our next-generation data platform that will advance Artlist as a data-driven company.

You will work with cutting-edge technologies to build a solid foundation for a data platform that will serve the company’s internal and external sources and ultimately enable business stakeholders to make decisions based on accurate, enriched data.


Wake up for this:

  • End-to-end development of the company’s data infrastructure.
  • Build and design high-performance, real-time/near real-time streaming data pipelines incorporating current and new data stack tools such as Airflow, AWS, Snowflake, Spark, Redis, and Kafka.
  • Build and design an autonomous platform that produces data served to multiple consumers, both third-party and internal.
  • Build and design a self-recovering platform that keeps downtime to a minimum.


Requirements:

  • 4+ years of experience in data engineering or backend engineering roles - must.
  • Proven experience in designing and building real-time event platform solutions - must.
  • Strong knowledge of backend programming languages such as Python, Java, Scala, or similar - must.
  • Deep knowledge and strong experience deploying distributed data technologies and systems (Spark, Kafka, Airflow, or similar) - must.
  • Strong and proven knowledge of working with data lakes, lakehouses, and data warehouses in the cloud (Snowflake or similar).
  • Advanced proficiency in working with SQL / NoSQL databases.
  • Strong experience with ETL/ELT processes, data ingestion, data transformation, data modeling, and monitoring.

