
Senior Data Engineer

Skills
  • Python · 5y
  • Java · 5y
  • Flask
  • Kafka
  • MySQL
  • Jenkins
  • GitHub Actions
  • Snowflake
  • AWS EC2
  • AWS Lambda
  • Docker
  • Kubernetes
  • Airflow
  • Terraform
  • Dropwizard
  • RDS
  • Spring
  • FireHose
  • Datadog
  • AWS Step Functions
  • AWS SQS
  • Prometheus
  • Pulumi
  • AWS EKS

Who are we?

Skai (formerly Kenshoo) is a leading omnichannel marketing platform that leverages advanced AI and machine learning to deliver intelligent, data-driven solutions for performance media, enabling smarter decision-making, increased efficiency, and maximized returns for businesses around the world. Its partners include Google, Meta, Amazon, Microsoft, and more. Approximately $7 billion in ad spend is managed on the Skai™ platform every year.

Established in 2006, we’re 700 employees strong. We work in a hybrid model, with a great mix of home and office days.


What will you do?

Join a team that builds and operates the core data services behind Skai’s products.

As a Senior Data Engineer, you will design, build, and maintain:

  • Data pipelines and data services
  • Microservices and internal APIs
  • The infrastructure and tooling that keep our data systems reliable, observable, and scalable


This is a hands-on role that combines Data Engineering, Backend Development and DevOps.

It’s ideal for someone who enjoys working across layers: from schema design and data flows, through microservice code, to Kubernetes, CI/CD and observability.


Responsibilities:

  • Design and maintain robust infrastructure for large-scale data processing and streaming systems.
  • Develop automation and deployment processes using CI/CD pipelines.
  • Build and operate Kubernetes-based environments and containerized workloads.
  • Collaborate with data engineers to optimize performance, cost, and reliability of data platforms.
  • Design and develop REST API microservices.
  • Troubleshoot and resolve complex issues in production and staging environments.
  • Drive initiatives that enhance observability, scalability, and developer productivity.
  • Lead by example - share knowledge, mentor teammates, and promote technical best practices.


Requirements:

  • 5 years of experience as a Data Engineer, Backend Developer, or DevOps Engineer.
  • 5+ years of experience with Python/Java microservices (Flask, Spring, Dropwizard) and component testing.
  • Deep understanding of Kubernetes, Docker, and container orchestration.
  • Hands-on experience with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
  • Proven experience with Snowflake, MySQL, RDS, or similar databases.
  • Familiarity with streaming systems (e.g., Kafka, FireHose), databases, or data pipelines.
  • Self-learner, proactive, and passionate about improving systems and automation.
  • Strong communication skills and a collaborative, team-oriented mindset.


Advantages:

  • Experience with Kafka, Airflow, or other data processing tools.
  • Knowledge of Terraform, Pulumi, or other IaC frameworks.
  • Familiarity with Datadog, Prometheus, or other observability tools.
  • Experience with AWS (Lambda, EKS, EC2, Step Functions, SQS).
  • Experience working with or building AI-driven tools.
