Join Agora, a pioneering fintech startup headquartered in Tel Aviv, as we reshape the real estate investment landscape. Our investment management platform empowers real estate firms and investors by automating back-office operations, enhancing investor satisfaction, and driving growth.
We’re seeking our first Data Engineer to join the Revenue Operations team. This is a high-impact role where you’ll build the foundations of our data infrastructure — connecting the dots between systems, designing and maintaining our data warehouse, and creating reliable pipelines that bring together all revenue-related data. You’ll work directly with the Director of Revenue Operations and partner closely with Sales, Finance, and Customer Success.
This is a chance to shape the role from the ground up and create a scalable data backbone that powers smarter decisions across the company.
Role Overview
As the Data Engineer, you will own the design, implementation, and evolution of Agora’s data infrastructure. You’ll connect core business systems (CRM, finance platforms, billing systems) into a central warehouse, ensure data quality, and make insights accessible to leadership and revenue teams. Your success will be measured by the accuracy, reliability, and usability of the data foundation you build.
Key Responsibilities
Data Infrastructure & Warehousing
- Design, build, and maintain a scalable data warehouse for revenue-related data.
- Build ETL/ELT pipelines that integrate data from HubSpot, NetSuite, billing platforms, ACP, and other business tools.
- Develop a clear data schema and documentation that can scale as we grow.
Cross-Functional Collaboration
- Work closely with Sales, Finance, and Customer Success to understand their reporting and forecasting needs.
- Translate business requirements into data models that support dashboards, forecasting, and customer health metrics.
- Act as the go-to partner for data-related questions across revenue teams.
Scalability & Optimization
- Continuously monitor and optimize pipeline performance and warehouse scalability.
- Ensure the infrastructure can handle increased data volume and complexity as the company grows.
- Establish and enforce best practices for data quality, accuracy, and security.
- Evaluate and implement new tools, frameworks, or architectures that improve automation, speed, and reliability.
- Build reusable data models and modular pipelines to shorten development time and reduce maintenance.
Requirements
- 4–6 years of experience as a Data Engineer or in a similar role (preferably in SaaS, Fintech, or fast-growing B2B companies).
- Strong expertise in SQL and data modeling; comfort working with large datasets.
- Hands-on experience building and maintaining ETL/ELT pipelines (using tools such as Fivetran, dbt, Airflow, or similar).
- Experience designing and managing cloud-based data warehouses (Snowflake, BigQuery, Redshift, or similar).
- Familiarity with CRM (HubSpot), ERP/finance systems (NetSuite), and billing platforms.
- Strong understanding of revenue operations metrics (ARR, MRR, churn, LTV, CAC, etc.).
- Ability to translate messy business requirements into clean, reliable data structures.
- Solid communication skills — able to explain technical concepts to non-technical stakeholders.
What Sets You Apart
- You’ve been the “first data hire” before and know how to build from scratch (a plus, not a must).
- Strong business acumen with a focus on revenue operations.
- A builder mindset: you like solving messy data problems and making systems talk.
- Comfortable working across teams and translating business needs into data solutions.