
Staff Backend Developer

Skills
  • Python ꞏ 6y
  • Kotlin
  • Scala
  • Rust
  • Java
  • Go
  • Kafka
  • Pandas
  • Flink
  • DynamoDB
  • MongoDB
  • PostgreSQL
  • Redis
  • RESTful API
  • CI/CD
  • GitHub
  • AWS
  • Docker
  • Kubernetes
  • RabbitMQ
  • Airflow
  • logging
  • metrics
  • WSGI
  • multiprocessing
  • Pydantic
  • threading
  • ASGI
  • gRPC
  • asyncio
  • GIL
  • FastAPI
  • distributed tracing
  • AWS SQS
  • Apache Spark
  • LiteLLM
  • video streaming protocols
  • audio streaming protocols
  • Daft
  • Dagster
  • Polars
  • Guidance
  • Instructor
  • LangChain

Overview

Loora is on a mission to revolutionize education and break down language barriers with its cutting-edge AI, building the first-ever personal AI English tutor. Loora offers its users an AI tutor that is always available to talk about whatever they want, gives immediate feedback on their English skills, and guides them on their journey to fluency.

We are seeking a Staff Backend Developer to join our growing core team!

Your Role

We are looking for a Staff Backend Developer to drive the architecture and development of our AI-powered language learning platform. In this role, you will design and implement backend systems that directly impact millions of learners worldwide. You will architect scalable services that power real-time AI conversations, ensuring low-latency performance and reliability while building the foundation for our next generation of language learning features.

Key Responsibilities

  • Design, develop, and maintain high-performance backend services and APIs (REST and gRPC) that power AI-driven conversational experiences.
  • Build and optimize asynchronous Python applications capable of handling real-time audio/text processing at scale.
  • Ensure seamless, low-latency integration between mobile and web clients and our AI backend platform.
  • Drive technical decisions on system architecture, focusing on low latency and fault tolerance.
  • Collaborate with AI/ML, mobile, DevOps, and product teams to deliver end-to-end solutions that delight our users.
  • Work within CI/CD workflows to ensure smooth deployments and maintain high code quality standards.
  • Establish engineering best practices and mentor team members to build a world-class engineering culture.
  • Optimize system performance and resource utilization while maintaining reliability SLAs for our growing user base.

Requirements

  • Minimum of 6 years of diverse Python development experience.
  • Deep expertise in Python concurrency and execution models (WSGI, ASGI, asyncio, multiprocessing, threading, GIL).
  • Proven track record building production-ready asynchronous Python applications serving high-volume traffic.
  • Strong experience with Pydantic and FastAPI in production environments.
  • Expertise in designing and implementing RESTful APIs and gRPC services, with a strong emphasis on versioning strategies and backward/forward compatibility.
  • Demonstrated ability to solve complex performance, scalability, and workload distribution challenges.
  • Proficiency with relational and non-relational database solutions (PostgreSQL, Redis, DynamoDB, MongoDB), including query optimization and data modeling.
  • Experience with event-driven architectures and message queuing systems (Kafka, RabbitMQ, AWS SQS).
  • Hands-on experience with Docker, a solid understanding of CI/CD pipelines and methodologies, and experience working with Kubernetes/AWS-based deployments.
  • Strong proficiency with AWS cloud services and cloud-native architectures.
  • Understanding of observability practices (distributed tracing, metrics, logging) and experience with monitoring tools.

Even Better If You Have

  • Proficiency in additional backend programming languages (Go, Rust, Java, Scala, Kotlin).
  • Experience with audio/video streaming protocols and real-time communication systems.
  • Experience with dataframe libraries (Pandas, Polars, Daft) for data processing and analytics.
  • Familiarity with LLM integration libraries (LiteLLM, LangChain, Guidance, Instructor) and AI model serving frameworks.
  • Background in building data-driven applications, pipelines, or ETL processes using frameworks like Apache Spark / Flink and orchestration tools such as Airflow or Dagster.
  • Contributions to OSS projects on GitHub.