Be a member of an agile core team, contributing to the company’s core infrastructure
Design & implement scalable solutions for high-performance trading & research systems
Drive the implementation of distributed, highly parallelized, state-of-the-art Big Data processing pipelines that process massive amounts of data in both batch and near real-time
Manage the company’s production trading infrastructure
Work closely with our data scientists & researchers to develop ML models & pipelines
Promote new technologies and develop POCs to improve the company’s offering
Requirements:
4+ years of experience in software development
Hands-on experience in designing and implementing highly scalable distributed systems
Programming experience with multithreaded environments in Python
Extensive knowledge of Unix/Linux
Experience with large-scale NoSQL/SQL storage solutions (MongoDB/Druid/BigQuery/etc.)
B.Sc. in Computer Science, Engineering, or a related field (or equivalent experience)