Data Pipeline Engineer

Overview

Miggo's Pipeline team builds the event-driven data foundation behind the company's Application Detection & Response platform. We transform massive application telemetry into real-time security value, powering detections, correlations, and automated insights across the product.

Skills

  • Kafka
  • Flink
  • PostgreSQL
  • ClickHouse
  • Temporal

What you'll do

  • Design and scale event-driven streaming systems using Kafka and Temporal to enable real-time security analysis and decision making.
  • Build rule-based and data-driven pipelines that turn application signals into actionable detections.
  • Collaborate with product, integrations, and detection teams to create new data flows that unlock innovative security capabilities.
  • Lead improvements in scalability, reliability, and observability to keep the pipeline ahead of growth and complexity.

What you'll bring

  • Deep experience with stream processing and distributed systems (Kafka, Flink, Temporal, or similar).
  • Expertise in data architecture and high-performance querying (ClickHouse, Postgres).
  • Proven ability to design and own complex systems with clarity and resilience.
  • Strong engineering fundamentals and curiosity about how data becomes security intelligence.
  • 7+ years of backend development experience with a focus on distributed systems.

It would be great if you also have

  • Background in security analytics or large-scale data systems.
  • Experience with multi-tenant, real-time, or high-throughput environments.