Kela is a defense technology startup pioneering next-generation command & control and data infrastructure systems. We empower defense organizations with robust, scalable, and secure platforms that integrate diverse sensors and AI to deliver real-time battlefield intelligence. Backed by leading investors, Kela combines Israeli innovation with global mission impact.
We are looking for a talented and experienced Senior Data Engineer. As a Senior Data Engineer, you will own the design and implementation of Kela’s data backbone, responsible for how mission-critical data is ingested, stored, processed, and made accessible for both operational and AI-driven use cases. You will design pipelines and data lake architectures that span multimodal sensor data (video, radar, RF, telemetry, etc.), enabling both real-time mission applications and large-scale AI experimentation.
This role is highly cross-functional: you will collaborate with algorithm teams, product, DevOps, and system engineers to ensure data infrastructure seamlessly supports battlefield deployments, MLOps workflows, and edge inference.
Responsibilities:
- Design, optimize, and maintain data structures and storage/access patterns across on-prem, edge, and Kubernetes-based deployments.
- Build scalable ELT/ETL pipelines for multimodal data, ensuring efficient throughput and resilience under bandwidth constraints.
- Architect and operate data lake solutions for storage and querying of large-scale, multimodal datasets (structured, semi-structured, unstructured).
- Build and maintain tooling for AI model training, evaluation, and deployment, including on-prem experimentation and edge inference pipelines.
- Partner with algorithms, product, and DevOps teams to establish robust data engineering standards across the company.
Requirements:
- 2-5 years of experience designing and scaling data lakes (Iceberg, Delta, or equivalent) and query engines (e.g., Snowflake, Athena, BigQuery).
- Expertise in designing and operating ELT/ETL pipelines.
- Strong proficiency in Python for production-grade pipelines and services.
- Strong SQL skills and experience with analytical and vector databases.
Preferred Qualifications:
- Familiarity with dbt for data transformations and modeling.
- Hands-on experience with Kubernetes and containerized deployments.
- Experience with MLOps frameworks (e.g., MLflow, Kubeflow, Airflow orchestration).
- Exposure to multimodal sensor data (EO/IR, radar, RF, geospatial, IMU, LiDAR).
- Background in defense, aerospace, or other mission-critical systems.