At Dream, we redefine cyber defense by combining AI and human expertise to create products that protect nations and critical infrastructure. This is more than a job; it's a Dream job. Dream is where we tackle real-world challenges, redefine AI and security, and make the digital world safer. Let's build something extraordinary together.
Dream's AI cybersecurity platform applies a new, out-of-the-ordinary, multi-layered approach, covering the ever-evolving security challenges across the entire infrastructure of the most critical and sensitive networks. Central to Dream's proprietary Cyber Language Models are innovative technologies that provide contextual intelligence for the future of cybersecurity.
At Dream, our talented team, driven by passion, expertise, and innovative minds, inspires us daily. We are not just dreamers; we are dream-makers.
The Dream Job:
We are on an expedition to find you: someone who is passionate about creating intuitive, out-of-this-world data platforms. You'll architect and ship our streaming lakehouse and data platform, turning billions of raw threat signals into high-impact, self-serve insights that protect countries in real time – all while building on top-of-the-line technologies such as Iceberg, Flink, Paimon, Fluss, LanceDB, ClickHouse, and more.
The Dream-Maker Responsibilities:
- Design and maintain agentic data pipelines that adapt dynamically to new sources, schemas, and AI-driven tasks
- Build self-serve data systems that allow teams to explore, transform, and analyze data with minimal engineering effort
- Develop modular, event-based pipelines across AWS environments, combining cloud flexibility with custom open frameworks
- Automate ingestion, enrichment, and fusion of cybersecurity data, including logs, configs, and cyber threat intelligence (CTI) streams
- Collaborate closely with AI engineers and researchers to operationalize LLM and agent pipelines within the Cyber Language Model (CLM) ecosystem
- Implement observability, lineage, and data validation to ensure reliability and traceability
- Scale systems to handle complex, high-volume data while maintaining adaptability and performance
- Own the data layer end-to-end including architecture, documentation, and governance
The Dream Skill Set:
- 5+ years of experience building large-scale distributed systems or platforms, preferably in ML or data-intensive environments
- Proficiency in Python with strong software engineering practices, plus familiarity with data structures and design patterns
- Deep understanding of orchestration systems (e.g., Kubernetes, Argo) and distributed computing frameworks (e.g., Ray, Spark)
- Experience with GPU compute infrastructure, containerization (Docker), and cloud-native architectures
- Proven track record of delivering production-grade infrastructure or developer platforms
- Solid grasp of ML workflows, including model training, evaluation, and inference pipelines
Never Stop Dreaming:
If this role doesn't fully match your skills but you're eager to grow and break glass ceilings, we'd love to hear from you!