DevJobs

Backend Software Engineer

Overview
Skills
  • Python ꞏ 5y
  • Kafka
  • PostgreSQL
  • Elasticsearch
  • GitHub Actions
  • GCP
  • Kubernetes
  • Docker
  • asynchronous execution
  • Object-Oriented Design
  • Databases
  • cloud services
  • Flux
  • GCS
  • Flagger
  • FastAPI
  • Litestar
  • MLflow
  • dagger.io
  • Prometheus
  • PyAV
  • Pydantic
  • AlloyDB
  • Ray
  • Timoni

Clarity is an AI cybersecurity startup protecting against deepfakes and the new social engineering and phishing attack vectors accelerated by the rapid adoption of Generative AI. Its patent-pending technology detects AI manipulations in video, images, and audio, and authenticates media with encrypted watermarking. Clarity's AI DeepFake Firewall integrates into new and existing workflows, enabling publishers and intelligence agencies to verify sensitive media, financial institutions to prevent fraud, and enterprises to filter manipulated media and create an AI-safe media environment — in a market growing at roughly 40% CAGR. Founded by AI and cybersecurity experts from Unit 8200, Stanford, and Israel's National Security Council, Clarity is advised and backed by global leaders in the AI and cyber industries.


Job Description

We are looking for a Backend Engineer to join our engineering team — someone with experience in event-driven distributed systems, a command of both relational and unstructured databases, and comfort working in a cloud-native environment. You'll join a team dedicated to designing and implementing a high-performance, event-driven, distributed system. Your role involves working independently and collaboratively on complex projects, enhancing our product's capabilities, and contributing to Clarity's growth and success in our effort to defend against deepfakes and social engineering.


Responsibilities May Include

  • Working on and enriching backend microservices, from researching areas of concern to implementing the final solutions
  • Implementing ETL processes to move and transform data between systems and databases
  • Implementing pipelines to support machine learning initiatives
  • Maintaining a strong focus on code quality through best practices, testing, logging, and metrics


Requirements

  • 5+ years of professional experience in backend development of high-scale distributed systems
  • Proficient Python skills, including object-oriented design and asynchronous execution
  • Proven knowledge of and experience with databases
  • Experience building high-scale, event-driven systems with Kafka
  • Proficiency with Docker, Kubernetes, and cloud services (GCP, AWS)
  • Strong teamwork skills and the ability to collaborate and communicate effectively


Our Stack

  • Python (FastAPI, Litestar, Pydantic, PyAV, Ray, MLflow)
  • PostgreSQL, Elasticsearch, Kafka (Confluent)
  • GCP, AlloyDB (Postgres), Kubernetes (GKE), GCS
  • Prometheus, Flux, Timoni, Flagger
  • GitHub Actions, dagger.io