DevJobs

Data Engineer

Overview
Skills
  • Python
  • SQL
  • Snowflake
  • dbt
  • Airbyte
  • BigQuery
  • Cursor
  • Fivetran
  • LLM
  • Rivery

About DoorLoop

DoorLoop is property management software built for speed, and the smart choice for people who take growth seriously. With offices in Miami, New York City, and Tel Aviv, we’re a global company helping property owners and managers move faster, scale smarter, and get real support, real fast.

We’re proudly People First. That’s why we’re a Certified Great Place to Work, recognized by Forbes as one of America’s Best Startup Employers in both 2024 and 2025, and rated highly on Glassdoor. We’re growing quickly and looking for a Data Engineer to join the team.

Mission

DoorLoop is looking for a Data Engineer to help build and scale our analytics data infrastructure. In this role, you will work closely with analysts and business stakeholders to design reliable data models and support the development of a centralized semantic layer used across the company.

You will play a key role in improving the structure, reliability, and usability of our data stack. This includes building and maintaining dbt models, supporting data pipelines, and ensuring analysts have access to clean, well-documented, and consistent data.

This role is ideal for someone who enjoys working at the intersection of data engineering and analytics: translating business needs into scalable data models and enabling teams to move faster with trusted data.

Responsibilities

  • Design and implement data models that support analytics across key business domains such as GTM, CX, and Finance
  • Build and maintain transformation workflows using dbt
  • Work closely with analysts to translate business questions into scalable and reusable data models
  • Help define and implement a structured semantic layer that enables consistent metrics across the company
  • Improve the reliability and clarity of the analytics data stack by centralizing logic into well-designed data models
  • Support the ingestion and transformation of data from various sources using tools such as Fivetran and Airbyte
  • Contribute to improving data quality, monitoring, and documentation practices
  • Help establish best practices for analytics modeling and data usage across teams
  • Actively leverage AI tools (e.g. Cursor, LLM-based assistants) to improve development speed, data modeling, and data workflows

Requirements

  • 2–4 years of experience in BI/data engineering, analytics engineering, or a similar role.
  • Strong SQL skills and experience working with modern data warehouses.
  • Experience building and maintaining data models for analytics.
  • Familiarity with modern data stack tools such as dbt, Snowflake/BigQuery, Fivetran/Rivery, or similar.
  • Experience collaborating with analysts or BI teams.
  • Familiarity with Python for data-related tasks (scripting, automation, or tooling).
  • Hands-on experience using AI tools (e.g. Cursor, LLMs) as part of day-to-day development workflows.
  • Strong problem-solving skills and the ability to work in evolving data environments.
  • Clear communicator who can work effectively with both technical and non-technical stakeholders.

How we use AI

We may use AI tools to help review resumes and applications, with human oversight at all times. Please review our privacy policy.
