About Us
TULU creates a new way of living that helps people reduce their cost of living in the city while fostering more responsible consumption patterns that accommodate the trend toward smaller living spaces.
TULU’s Usage Economy platform elevates products and brands to meet consumers' immediate needs using IoT-based smart rental units that are customized to provide the most necessary and useful items, while leveraging data to analyze behavioral patterns, optimize usage, refine inventory, and create a personalized experience for our users.
We are a global company operating in 30 cities across the US and Europe, already working with the largest landlords in the world in residential buildings, student housing, and offices, with 30,000+ households under management, over 100% QoQ growth, and a very exciting wait-list.
About the Role
This is a full-time on-site role for a Data Engineer with AI expertise. You'll take on a key leadership role within the data team, contributing significantly to the integration of AI tools.
The role includes leading projects and building the foundations and infrastructure for data pipelines, data structures, and AI within the company. You will also:
- Design, implement, and support a data warehouse and data pipelines using tools like BigQuery and dbt.
- Architect data models using SQL and ensure data integrity.
- Design, develop, and document automated reports/dashboards using Tableau to address reporting challenges and meet stakeholder information needs.
- Contribute to early decision-making, architecture, design, and technological research.
- Create data transformation and data ingestion infrastructures.
- Collaborate with cross-functional teams to ensure data needs are addressed.
- Implement and manage AI product development and infrastructure.
- Practical, hands-on experience with machine learning and deep learning is an advantage.
About You
- Degree in Computer Science or hands-on job experience in data analytics or analytics engineering.
- At least 3 years of professional experience building and maintaining production data systems in cloud environments such as GCP.
- Technical expertise with data models, data mining, and segmentation techniques.
- 2+ years of work experience using Git for version control.
- Enthusiasm for learning and adapting to the world of AI, and a commitment to exploring this field.
- Proficiency in Python, with the ability to effectively use it for data manipulation, analysis, and pipeline development in a large-scale data environment.
- Ability to understand business logic for data structuring while overcoming technical limitations.
- Extreme sense of ownership: leading design for new products and initiatives as well as integrating with established best practices.