About ThetaRay
ThetaRay is the leading provider of AI-based Big Data analytics. We are dedicated to helping financial organizations combat financial cybercrimes such as money laundering, fraud, and ATM attacks, which are used to finance terrorism, narcotics and human trafficking, sex slavery, and other malicious acts. Our IntuitiveAI solutions for AML, fraud, and ATM security can detect malicious behavior months before existing rule-based and standard AI solutions can.
IntuitiveAI replicates the decision-making capabilities of human intuition, one of the most accurate decision tools available. In doing so, it dramatically decreases the incidence of false positives, while increasing the number of true positives. We uncover the unknown unknowns. To learn more about ThetaRay and IntuitiveAI, visit www.thetaray.com.
About The Position
You will be responsible for designing and developing a scalable data processing and machine learning pipeline using the latest big data technologies, all within a fast-paced and agile environment.
This role is part of an innovative team that plays a pivotal role in enabling the company to scale and grow rapidly.
You'll work closely with a dynamic group of professionals who are dedicated to pushing the boundaries of what's possible in data processing and machine learning.
In addition to your technical prowess, you will need to showcase exceptional soft skills, including:
Rapid Self-Learning: Investigate new technological areas and gain deep insights through self-directed learning.
Analytical Problem-Solving: Demonstrate strong analytical problem-solving skills to support the development of scalable and sustainable solutions.
Ownership and Collaboration: Take ownership of product development across all lifecycle stages. This includes translating product requirements into actionable designs, hands-on development, unit testing, and addressing production challenges. You will also need to collaborate effectively with other departments and serve as a focal point for orchestrating department-wide processes within the company.
Requirements:
- Technical Proficiency: A minimum of 3 years of hands-on development experience, including expertise in developing data-centric products using Python; data processing and orchestration frameworks such as PySpark, Pandas, Hadoop, and Airflow; and container-based environments using tools such as Kubernetes (K8s) and Helm.
- Development Pipeline Skills: Proficiency in using development pipeline tools such as Git, Jenkins, Docker, and more.
- Agile Collaboration: Prior experience as a software developer in an Agile environment and a history of conducting thorough code reviews.
Nice-to-Have:
- Linux Familiarity: Experience working with Linux systems.
- Framework Development: Experience in crafting frameworks for developers or data engineers.
- Machine Learning Skills: Familiarity with machine learning frameworks such as scikit-learn and TensorFlow.
- Microservices Expertise: Experience developing microservices-based architectures.
- Mentoring and Leadership: Experience in mentoring junior developers and/or students, showcasing leadership abilities alongside technical skills.