
Senior Data Engineer

Skills
  • SQL
  • Python
  • Kafka
  • OOP
  • CI/CD
  • Airflow ꞏ 3y
  • Terraform
  • Dataflow ꞏ 3y
  • Pub/Sub ꞏ 3y
  • Apache Beam ꞏ 3y
  • BigQuery ꞏ 3y
  • Cloud Composer ꞏ 3y
  • Infrastructure as Code
  • Data modeling
  • Data observability tools
  • Dataplex
  • Looker
  • Confluent
  • Vertex AI
Description

Tango is a successful market leader: a live-streaming platform with 450+ million registered users, in an industry projected to reach $240 BILLION within the next couple of years.

The B2C platform, built on best-in-class global video technology, allows millions of talented people around the world to create their own live content, engage with their fans, and monetize their talents.

Tango's live-streaming platform was founded in 2018 and is powered by 500+ global employees operating in a culture of growth, learning, and success!

The Tango team is a vigorous cocktail of hard workers, creative brains, energizers, geeks, overachievers, athletes, and more. We push the limits to bring our app from “one of the top” to “the leader”.

The best way to describe Tango's work style is that we don't use the word “impossible”. We believe that success is a thorny path paved with sleepless nights, corporate parties, tough releases, and, of course, our users' smiles (and as we are a LIVE app, we truly get to see our users all around the world smiling right in front of us in real time!).

Do you want to join the party?

Responsibilities

  • Design, implement, and maintain robust data pipelines and ETL/ELT processes on GCP (BigQuery, Dataflow, Pub/Sub, etc.).
  • Build, orchestrate, and monitor workflows using Apache Airflow / Cloud Composer.
  • Develop scalable data models to support analytics, reporting, and operational workloads.
  • Apply software engineering best practices to data engineering: modular design, code reuse, testing, and version control.
  • Manage GCP resources (BigQuery reservations, Cloud Composer/Airflow DAGs, Cloud Storage, Dataplex, IAM).
  • Optimize data storage, query performance, and cost through partitioning, clustering, caching, and monitoring.
  • Collaborate with DevOps/DataOps to ensure data infrastructure is secure, reliable, and compliant.
  • Partner with analysts and data scientists to understand requirements and translate them into efficient data solutions.
  • Mentor junior engineers, provide code reviews, and promote engineering best practices.
  • Act as a subject matter expert for GCP data engineering tools and services.
  • Define and enforce standards for metadata, cataloging, and data documentation.
  • Implement monitoring and alerting for pipeline health, data freshness, and data quality.


Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
  • 6+ years of professional experience in data engineering or similar roles, with 3+ years of hands-on work in a cloud environment, preferably on GCP.
  • Strong proficiency with BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Composer (Airflow).
  • Expert-level Python development skills, including object-oriented programming (OOP), testing, and code optimization.
  • Strong data modeling skills (dimensional modeling, star/snowflake schemas, normalized/denormalized designs).
  • Solid SQL expertise and experience with data warehousing concepts.
  • Familiarity with CI/CD, Terraform/Infrastructure as Code, and modern data observability tools.
  • Exposure to AI tools and methodologies (e.g., Vertex AI).
  • Strong problem-solving and analytical skills.
  • Ability to communicate complex technical concepts to non-technical stakeholders.
  • Experience working in agile, cross-functional teams.


Preferred Skills (Nice To Have)

  • Experience with Google Cloud Platform (GCP).
  • Experience with Dataplex for data cataloging and governance.
  • Knowledge of streaming technologies (Kafka, Confluent).
  • Experience with Looker.
  • Cloud certifications (Google Professional Data Engineer, Google Cloud Architect).