WSC Sports, the pioneer in AI-powered sports content technology, empowers more than 460 clients worldwide to connect with their fans through AI-tailored sports content experiences. WSC Sports’ platform automates the creation, management, and distribution of content, enabling sports rights holders to expand reach, grow fan bases, and unlock revenue opportunities across digital platforms.
Why WSC Sports:
You’ll work in an awesome environment alongside some of the most innovative people in the industry, using cutting-edge technologies and tools (video editing, Gen AI, data, etc.). At WSC Sports, you have the opportunity to directly influence the products and tools used by our clients, including sports giants such as the NBA, the Bundesliga, LaLiga, and ESPN – and that’s just the beginning of what WSC Sports has to offer! Join us and be a part of the best team in tech as we Fuel the Fandom worldwide.
What you’ll do:
- Design, build, and optimize large-scale data pipelines and workflows for both batch and real-time processing.
- Architect and maintain Airflow-based orchestration frameworks to manage complex data dependencies and data movement.
- Develop high-quality, maintainable data transformation and integration processes across diverse data sources and domains.
- Lead the design and implementation of scalable, cloud-based data infrastructure ensuring reliability, performance, and cost efficiency.
- Drive data modeling and data architecture practices to ensure consistency, reusability, and quality across systems.
- Collaborate closely with Product, R&D, BizDev, and Data Science teams to define data requirements, integrations, and delivery models.
- Own the technical roadmap for key data initiatives, from design to production deployment.
What you’ll need:
- 6+ years of experience as a Data Engineer working on large-scale, production-grade systems.
- Proven experience architecting and implementing data pipelines and workflows in Airflow, with hands-on, design-level proficiency.
- Strong experience with real-time or streaming data processing (Kafka, Event Hubs, Kinesis, or similar).
- Advanced proficiency in Python for data processing and automation.
- Strong SQL skills and a deep understanding of data modeling, ETL/ELT frameworks, and data warehouse (DWH) methodologies.
- Experience with cloud-based data ecosystems (Azure, AWS, or GCP) and related services (e.g., Snowflake, BigQuery, Redshift).
- Experience with Docker, Kubernetes, and modern CI/CD practices.
- Excellent communication and collaboration skills with experience working across multiple stakeholders and business units.
- A proactive, ownership-driven approach with the ability to lead complex projects end-to-end.