
Data Engineer

Skills
  • Python
  • SQL
  • Airflow
  • AWS Athena
  • Cron
  • Cursor
  • GitHub Copilot
  • Google BigQuery
  • Rivery
Description

Minute Media is a leading global sports content and technology company, known for brands like Sports Illustrated, 90min, and FanSided. We deliver authentic stories and innovative media solutions to hundreds of millions of fans worldwide.

We are seeking a talented Data Engineer to join our BI & Data team in Tel Aviv. You will play a pivotal role in building and optimizing the data infrastructure that powers our business. In this mid-level position, your primary focus will be on developing a robust single source of truth (SSOT) for revenue data, along with scalable data pipelines and reliable orchestration processes. If you are passionate about crafting efficient data solutions and ensuring data accuracy for decision-making, this role is for you.

Responsibilities

Pipeline Development & Integration

  • Design, build, and maintain robust data pipelines that aggregate data from various core systems into our data warehouse (BigQuery/Athena), with a special focus on our revenue Single Source of Truth (SSOT); a minimal pipeline sketch follows this list.
  • Integrate new data sources (e.g. advertising platforms, content syndication feeds, financial systems) into the ETL/ELT workflow, ensuring seamless data flow and consolidation.
  • Implement automated solutions for ingesting third-party data (leveraging tools like Rivery and scripts) to streamline data onboarding and reduce manual effort.
  • Leverage AI-assisted development tools (e.g., Cursor, GitHub Copilot) to accelerate pipeline development.
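
To illustrate the kind of pipeline work described above, here is a minimal, hypothetical Airflow sketch of a daily job that ingests third-party revenue data and loads it into BigQuery. The DAG name, schedule, and task functions are placeholders assuming Airflow 2.4+; they are not taken from Minute Media's actual stack.

```python
# Hypothetical daily revenue-ingestion DAG; all names are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_revenue():
    """Pull yesterday's revenue records from a third-party reporting API (stub)."""
    # In practice this would call the advertising platform's API (or a tool like
    # Rivery would handle ingestion) and stage the results for loading.
    ...


def load_to_bigquery():
    """Load the staged records into the revenue SSOT table in BigQuery (stub)."""
    ...


with DAG(
    dag_id="revenue_ssot_daily",            # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",                   # every morning at 06:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_revenue", python_callable=extract_revenue)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    extract >> load  # load runs only after extraction succeeds
```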

Optimization & Reliability

  • Optimize ETL processes and SQL queries for performance and cost-efficiency – for example, refactoring and cleaning pipeline code to reduce runtime and cloud processing costs.
  • Develop modular, reusable code frameworks and templates for common data tasks (e.g., ingestion patterns, error handling) to accelerate future development and minimize technical debt.
  • Orchestrate and schedule data workflows to run reliably (e.g. consolidating daily jobs, setting up dependent task flows) so that critical datasets are refreshed on time.
  • Monitor pipeline execution and data quality daily, quickly troubleshooting issues or data discrepancies to maintain high uptime and trust in the data (see the sketch below).
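
As a sketch of the kind of daily data-quality check this list describes, the snippet below verifies that yesterday's revenue rows landed in the warehouse using the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders, not Minute Media's real schema.

```python
# Hypothetical freshness check for the revenue SSOT table; identifiers are placeholders.
from datetime import date, timedelta

from google.cloud import bigquery

REVENUE_TABLE = "my-project.analytics.revenue_ssot"  # placeholder table identifier


def check_revenue_freshness() -> None:
    """Fail loudly if yesterday's revenue rows are missing from the SSOT."""
    client = bigquery.Client()
    yesterday = date.today() - timedelta(days=1)

    query = f"""
        SELECT COUNT(*) AS row_count
        FROM `{REVENUE_TABLE}`
        WHERE event_date = @run_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", yesterday)]
    )
    rows = list(client.query(query, job_config=job_config).result())

    row_count = rows[0]["row_count"]
    if row_count == 0:
        # In a real pipeline this would notify an alerting channel rather than raise.
        raise RuntimeError(f"No revenue rows found for {yesterday} in {REVENUE_TABLE}")

    print(f"{row_count} revenue rows present for {yesterday}")


if __name__ == "__main__":
    check_revenue_freshness()
```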

Collaboration & Documentation

  • Work closely with analysts and business stakeholders to understand data requirements and ensure the infrastructure meets evolving analytics needs (such as incorporating new revenue streams or content cost metrics into the SSOT).
  • Document the data architecture, pipeline processes, and data schemas in a clear way so that the data ecosystem is well-understood across the team.
  • Continuously research and recommend improvements or new technologies (e.g. leveraging AI tools for data mapping or anomaly detection) to enhance our data platform’s capabilities and reliability, ensuring our data ecosystem remains a competitive advantage.

Requirements

  • 4+ years of experience as a Data Engineer (or in a similar data infrastructure role), building and managing data pipelines at scale, with hands-on experience in workflow orchestration and scheduling (Cron, Airflow, or built-in scheduler tools).
  • Strong SQL skills and experience working with large-scale databases or data warehouses (ideally Google BigQuery or AWS Athena).
  • Solid understanding of data warehousing concepts, data modeling, and maintaining a “single source of truth” for enterprise data.
  • Demonstrated experience in data auditing and integrity testing, with the ability to build 'trust dashboards' or alerts that prove data reliability to executive stakeholders.
  • Proficiency in a programming/scripting language (e.g. Python) for automating data tasks and building custom integrations.