
Data Engineer

Skills
  • Python ꞏ 3y
  • Kafka ꞏ 2y
  • Spark ꞏ 1y
  • Linux ꞏ 2y
  • Grafana
  • Databricks
  • Dataproc
  • EMR
  • Prometheus
Job ID: 206771

Required Travel: Up to 25%

Managerial: No

Location: Israel - Raanana (Amdocs Site)

Who are we?

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers’ innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers’ migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com.

At Amdocs, our mission is to empower our employees to 'Live Amazing, Do Amazing' every day. We believe in creating a workplace where you not only excel professionally but also thrive personally. Through our culture of making a real impact, fostering growth, embracing flexibility, and building connections, we enable you to live a meaningful life while making a difference in the world.

In one sentence

The Amdocs Data and AI platform is looking for a platform data engineer specialist to join us in Raanana, Israel. In this role, you will join a team that develops new features for our product. The team is responsible for developing a data-based product that uses cutting-edge technologies: data lake platforms (Databricks, EMR, Dataproc), open table formats such as Iceberg and Delta, and big data technologies such as Spark and Python.

The team develops the data lake domain and data products using Python and Spark.

You’ll need the excellent technical skills mentioned below, along with strong communication skills.

The team holds open discussions, every voice counts, and we are open-minded about adopting new technologies. Anything is possible in this product and team.

What will your job look like?

  • Develop services for the data platform that are deployed on cloud Spark clusters, delivering deployable data pipelines to customers.
  • Work with other team members to develop, improve, maintain, and enhance the data lake domain and the data products in the Data and AI platform.
  • Design new requirements and take ownership from design through production.
  • Come to the office 3 times a week.

All you need is...

  • Mandatory - Python development specialist with at least 3 years of experience.
  • Mandatory - Spark development specialist with at least 1 year of experience.
  • Mandatory - At least 2 years of experience and deep knowledge of Kafka.
  • Mandatory - At least 2 years of experience working with Linux.
  • Considered a plus:
    • Knowledge of cloud Spark clusters - Databricks, EMR, Dataproc
    • Experience with Grafana and Prometheus

Why You Will Love This Job

  • You will be challenged to design and develop new software applications.
  • You will have the opportunity to work in a growing organization, with ever-expanding opportunities for personal growth.

Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.