System Development at Scale
Designing, developing, and maintaining high-scale data-driven systems.
Implementing optimizations and monitoring strategies to ensure performance and reliability.
Data Pipeline Solutions
Architecting, building, and troubleshooting end-to-end data pipelines.
Collaborative Growth
Working closely with data scientists and analysts to develop and refine threat intelligence solutions.
Participating in code reviews and knowledge-sharing sessions to continuously improve team output.
Machine Learning Pipelines
Building and managing ML pipelines for training, deployment, and serving various model types (e.g., supervised, unsupervised).
Ensuring model performance, scalability, and integration with broader data infrastructure.
Microservices Architecture
Developing, testing, and maintaining microservices-based systems that handle data and ML workflows.
Ensuring robust, scalable, and secure service designs using industry best practices.
Tooling
Staying updated on emerging technologies and frameworks to continuously improve the development process.
What You Need:
Experience in Big Data Solutions
Proven track record designing, building, and maintaining large-scale, distributed data systems.
Familiarity with data processing frameworks and architectures (e.g., Kafka, Temporal, Spark).
Microservices Expertise
Hands-on experience building and testing microservices systems (e.g., Docker, Kubernetes, ArgoCD).
Solid understanding of service-to-service communication, APIs, and security best practices.
5+ Years of Professional Experience
Background in software development and/or data engineering.
Proficiency in relevant programming languages (e.g., Python, Go, Java/Scala).
Strong Communication and Collaboration Skills
Proven ability to work effectively with cross-functional teams, including data scientists, analysts, and stakeholders.
Capacity to translate complex technical concepts into actionable insights for broader audiences.
Familiarity with Industry Best Practices
Expertise in observability: monitoring, logging, and alerting to ensure system reliability.
DevOps Expertise (Advantage)
Experience working with CI/CD pipelines, automation tools, and DevOps methodologies.
Experience leveraging tools like ArgoCD, PostgreSQL, Grafana, and AtlasGo to maintain best-in-class DevOps practices.