
Data Engineer

Overview
Skills
  • Python ꞏ 4y
  • SQL
  • Kubernetes
  • Docker
  • Airflow
  • Ansible
  • Terraform
  • Algorithms ꞏ 4y
  • Data Structures ꞏ 4y
  • ELT Pipelines ꞏ 3y
  • ETL ꞏ 3y
  • Cloud Providers ꞏ 2y
  • Software Design ꞏ 1y
  • Architecture ꞏ 1y
  • Distributed Systems
  • GitFlow
  • Google Coding Style Guidelines
  • Google Documentation Style Guidelines
  • Data Visualization Services
  • Large-Scale Infrastructure
  • Storage Architecture
  • Streaming Technologies
  • TDD

Tango is a successful market leader: a live-streaming platform with 450+ million registered users, in an industry projected to reach $240 billion in the next couple of years.

Built on best-in-class global video technology, the B2C platform allows millions of talented people around the world to create their own live content, engage with their fans, and monetize their talents.


Tango was founded in 2018 and is powered by 350+ global employees operating in a culture of growth, learning, and success!

The Tango team is a vigorous cocktail of hard workers, creative brains, energizers, geeks, overachievers, athletes, and more. We push the limits to take our app from “one of the top” to “the leader”.


The best way to describe Tango's work style is that we never use the word “impossible”. We believe that success is a thorny path paved with sleepless nights, corporate parties, tough releases, and, of course, our users' smiles (and as we are a LIVE app, we truly get to see our users all around the world smiling right in front of us in real time!).


We are looking for a Data Engineer to join our team. You will be responsible for designing and building our data and data-pipeline architecture, as well as our data warehouse. The ideal candidate is a data enthusiast who enjoys building data systems from the ground up and maintaining their integrity. You will be the go-to person for anything related to understanding data: exploration, pipelines, analytics, and the connection between business and data.


Do you want to join the party?


Responsibilities:

  • Design, code, and optimize scalable data pipelines
  • Design and build core storage and the surrounding infrastructure
  • Participate in or lead design reviews with peers and stakeholders to decide among available technologies
  • Review code developed by other developers and provide feedback to ensure best practices (e.g., style guidelines, code check-in, accuracy, testability, and efficiency)
  • Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on hardware, network, or service operations and quality


Requirements:

  • Bachelor’s degree or equivalent practical experience
  • 4 years of experience with software development in Python or other programming languages, and with data structures/algorithms
  • 3 years of experience testing, maintaining, or launching ETL and/or ELT pipelines, and 1 year of experience with software design and architecture
  • Practical knowledge of SQL scripting
  • 2 years of experience working with one of the major cloud providers (GCP and/or AWS)


Would be a plus:

  • Master's degree or PhD in Computer Science or related technical field
  • Practical experience containerizing and orchestrating pipelines with Docker, Airflow, Kubernetes (K8s), and/or alternatives
  • Practical experience with organizing CI/CD of the pipelines with Terraform, Ansible or alternatives
  • Practical understanding of Google Coding Style Guidelines, Google Documentation Style Guidelines, TDD approach and GitFlow best practices
  • Knowledge of data visualization services
  • Experience developing large-scale infrastructure or distributed systems, and experience with streaming technologies and storage architecture


#LI-Onsite
