
Data Engineer

Skills
  • Kafka
  • NoSQL
  • Airflow
  • Dagster
  • ETL tools
  • Prefect
At Honeycomb, we’re not just building technology; we’re reshaping the future of insurance.

In 2025, Honeycomb was ranked by Newsweek as one of “America’s Greatest Startup Workplaces,” and Calcalist named it a “Top 50 Israel startup.”

How did we earn these honors?

Honeycomb is a rapidly growing global startup, generously backed by top-tier investors and powered by an exceptional team of thinkers, builders, and problem-solvers. Dual-headquartered in Chicago and Tel Aviv (R&D center), and with 5 offices across the U.S., we are reinventing the commercial real estate insurance industry, an industry long overdue for disruption. Just as importantly, we ensure every employee feels deeply connected to our mission and one another.

With over $55B in insured assets, Honeycomb operates in 18 major states, covering 60% of the U.S. population, and continues to expand its coverage.

If you’re looking for a place where innovation is celebrated, culture actually means something, and smart people challenge you to be better every day, Honeycomb might be exactly what you’ve been looking for.

What You’ll Do

We are expanding our data team and looking for an experienced Data Engineer to help architect and build data workflows and infrastructure. You’ll work closely with senior engineers and cross-functional stakeholders to ensure data flows reliably and is modeled for use across the organization.

This is a hands-on role with the opportunity to grow your skills in modern data engineering, cloud infrastructure, and data product development.

Responsibilities

  • Architecture Design: Design and implement information architecture that supports stakeholder needs (Finance, RevOps, AI, and BI).
  • Data Extraction: Extract, transform, and load (ETL) raw data into the appropriate databases for use by the platform.
  • Data Quality & Validation: Implement checks and alerts to ensure data consistency, completeness, and accuracy across systems.
  • Documentation & Best Practices: Participate in defining and maintaining internal standards, pipeline documentation, and data lineage tracking.
  • Performance Optimization: Identify opportunities to improve pipeline efficiency and reduce latency in data delivery.

Requirements

  • 4+ years of experience as a Data Engineer at a SaaS company.
  • Experience in pipeline design and data architecture.
  • Proficiency with ETL tools and data pipeline technologies (e.g., Apache Airflow, Prefect, Dagster).

Nice to Have

  • Knowledge of NoSQL databases.
  • Exposure to data warehousing concepts, event streaming (e.g., Kafka), or analytics engineering.