Data Engineer

Skills
  • Python
  • SQL
  • RDBMS
  • CI/CD
  • AWS
  • Azure
  • GCP
  • Airflow
  • Non-relational Databases
  • Workflows
  • Data Factory
  • Data Governance
  • Data Modeling
  • Data Vault
  • Databricks Lakehouse
  • DataOps
  • Delta Lake
  • Star Schema
  • Unity Catalog
  • Apache Spark
  • Databricks
  • ELT
  • ETL
✨Data Engineer – Be’er Sheva Site (Client-side)✨
Cambium is looking for a Data Engineer to join an innovative project at our client’s site in Be’er Sheva.
This is a great opportunity to work with modern data technologies, collaborate with multidisciplinary teams, and make an impact on large-scale data environments.

What You’ll Do
Design, develop, and maintain data pipelines (ETL/ELT) for ingesting, transforming, and loading data from various sources (systems, APIs, files, etc.).

Work closely with business stakeholders (Finance, Operations, Procurement, Marketing) to translate business processes into scalable data models.

Build and optimize architectures supporting Data Warehouse and Data Lakehouse environments.

Develop transformation logic using SQL, Python, and Spark.

Ensure data quality, consistency, and performance through monitoring and automation.

Collaborate with BI, Data Science, and Data Governance teams to support analytics and machine-learning initiatives.

Promote engineering best practices (CI/CD, DataOps).


What You Bring
3+ years of experience as a Data Engineer or in a similar data-focused role.

Proven experience with Databricks or Apache Spark.

Advanced SQL skills and hands-on experience with ETL/ELT pipeline development.

Proficiency in Python.

Experience with relational and non-relational databases.

Strong analytical and problem-solving skills with attention to detail.

Excellent communication skills and ability to collaborate with non-technical teams.

Understanding of business domains such as Finance, Supply Chain, and Operations.


Nice to Have
Experience with Databricks Lakehouse (Delta Lake, Unity Catalog, Workflows).

Familiarity with Airflow or Data Factory.

Knowledge of Data Governance and Data Modeling (Star Schema, Data Vault).

Cloud certifications (Databricks / Azure / AWS / GCP).


Apply now → https://apply.cambium.co.il/