About Classiq
Classiq Technologies is a quantum software company based in Tel Aviv. We provide a full-stack platform (IDE, high-level language, compiler, and OS) for designing, compiling, and running quantum algorithms. Backed by 70 patents and a recent $110M Series C funding round, our technology automates quantum programming and turns functional intent into executable quantum circuits. Our core compiler technology scales quantum algorithms from concept to implementation, making quantum software development faster and more practical.
The Role
We are seeking a Data Engineer with a passion for building data platforms from the ground up to join our Engineering team and help us develop and scale our data collection solutions. In this role, you will play a crucial part in establishing the infrastructure that makes quantum data available for machine learning, software engineering, and business intelligence initiatives. You’ll develop, deploy, and maintain the data pipelines and workflows required to collect, process, and visualize data across the organization.
Responsibilities:
- Lead the design and development of scalable, efficient data warehouse and business intelligence solutions aligned with organizational goals.
- Build and maintain data pipelines to support real-time and batch processing, ensuring efficient data storage, retrieval, and transformation.
- Design and optimize data models, schemas, and database structures for analytics and reporting.
- Leverage DevOps and infrastructure skills to build and support data systems.
- Implement monitoring and observability tools to ensure system reliability.
- Implement and maintain data quality checks and monitoring mechanisms to ensure accuracy and reliability across the data lifecycle.
Requirements:
- 5+ years of experience in data engineering, with strong expertise in building and maintaining data pipelines and data warehouses.
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and related services for data storage, processing, and orchestration.
- Solid understanding of ETL/ELT workflows, data modeling, and database design principles.
- Proficiency in Python for data processing and automation.
- Hands-on experience with DevOps tooling: CI/CD pipelines, Docker, Kubernetes, and infrastructure as code (Terraform).
- Strong background in SQL and experience with modern data storage systems (e.g., Snowflake or BigQuery).
- Familiarity with data quality frameworks, observability, and monitoring tools.
- Excellent problem-solving skills and the ability to optimize data workflows for both batch and real-time environments.
- Strong communication and collaboration skills, with the ability to work cross-functionally with developers, ML engineers, and data analysts.
- A self-starter mindset with enthusiasm for learning new technologies and improving existing systems.