About the Position
About Justt
Justt is a fintech company building the next generation of financial infrastructure, and we need forward-thinking engineers who are passionate about staying ahead of the technology curve. Our culture embraces innovation, continuous learning, and the adoption of emerging technologies to solve complex financial challenges.
The Role
As a Senior Data Engineer at Justt, you will architect, optimize, and maintain our data infrastructure while actively exploring and implementing cutting-edge data technologies. You'll have the freedom to experiment with new tools and approaches, pushing our data capabilities beyond conventional boundaries. This role balances maintaining our core infrastructure with constantly evolving our tech stack to keep our competitive edge in the fintech space.
Responsibilities
- Design, implement, and optimize data pipelines using dbt, Rivery, and Fivetran, with opportunities to evaluate and integrate newer pipeline technologies
- Enhance and maintain our Snowflake data warehouse architecture while exploring modern data lakehouse architectures and real-time processing capabilities
- Lead initiatives to modernize our data stack, including evaluating and implementing next-generation data technologies
- Collaborate with data scientists, analysts, engineers, and other stakeholders to understand and meet their data needs
- Implement robust data quality monitoring and testing frameworks
- Optimize data flows from various sources including MongoDB, PostgreSQL, and Google Sheets
- Work with AWS cloud infrastructure to ensure scalability and performance, with opportunities to explore multi-cloud strategies
- Develop and maintain data documentation and governance standards aligned with emerging best practices
- Participate in code reviews and mentor junior engineers on both existing and new technologies
- Troubleshoot and resolve complex data engineering issues
- Research and propose adoption of emerging technologies that could provide competitive advantages
- Lead proof-of-concept projects to validate new data technologies before wider adoption
Requirements
- 5+ years of experience in data engineering roles with a demonstrated interest in technology evolution
- Strong expertise in SQL and Python
- Experience with modern data warehousing platforms, particularly Snowflake
- Proficiency with dbt (data build tool) for transformation workflows
- Experience with orchestration and ETL/ELT tools such as Airflow, Rivery, and Fivetran
- Hands-on experience with MongoDB and PostgreSQL databases
- Strong understanding of data modeling concepts and techniques
- Proficiency with Git for version control
- Experience with AWS cloud services
- Knowledge of data governance and security best practices
- Excellent problem-solving and analytical skills
- Strong communication skills and ability to work effectively in a team environment
- Growth mindset and eagerness to learn new technologies
Technical Growth Areas We're Exploring
- Stream processing technologies (Kafka, Flink, Spark Streaming)
- Real-time analytics and data serving layers
- Data mesh architecture principles
- Vector databases and embedding technologies
- Semantic layers and metrics frameworks (MetricFlow, Cube)
- Feature stores for machine learning
- Data contracts and data product thinking
- Hybrid transactional-analytical processing (HTAP) systems
- Zero-ETL architectures
- Privacy-preserving computation techniques
Nice to Have
- Experience in the fintech industry
- Knowledge of real-time data processing frameworks
- Experience with data visualization tools
- Familiarity with containerization and orchestration technologies
- Experience with infrastructure as code (IaC)
- Understanding of machine learning workflows and MLOps
- Contributions to open-source data projects
- Experience implementing data mesh architectures