Student Data Engineering – Network Insights Platform

Overview
Skills
  • Python
  • SQL
  • XML
  • NumPy
  • Pandas
  • MariaDB
  • MongoDB
  • Linux
  • Airflow
  • CSV
  • JSON
  • Apache Superset
About The Role

Join our Network Insight (NI) development team working on a large-scale data processing and analytics platform used to analyze the physical, logical, and performance layers of the network. You’ll contribute to developing data pipelines, KPI analytics, and visualization components while gaining hands-on experience with real production data engineering and backend development.

What You’ll Work On

Data Engineering & Backend Development

  • Develop and maintain data collection components (MongoDB, MariaDB, CSV/JSON/XML)
  • Implement ETL/ELT logic and KPI calculations in Python and SQL
  • Work with data warehouse structures and repository patterns
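To illustrate the kind of KPI calculation in Python and SQL the bullets above describe, here is a minimal, self-contained sketch. The `link_samples` table and the throughput KPI are hypothetical examples, not part of the posting; the real platform's schema and databases (MariaDB, MongoDB) are not described here, so SQLite stands in for the relational store.

```python
import sqlite3

# Hypothetical schema: per-link throughput samples (illustrative only,
# not taken from the posting).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE link_samples (link_id TEXT, mbps REAL)")
conn.executemany(
    "INSERT INTO link_samples VALUES (?, ?)",
    [("A", 100.0), ("A", 300.0), ("B", 50.0)],
)

# Example KPI: average throughput per link, computed in SQL and
# collected into a Python dict for downstream use.
kpi = dict(
    conn.execute(
        "SELECT link_id, AVG(mbps) FROM link_samples GROUP BY link_id"
    )
)
print(kpi)  # {'A': 200.0, 'B': 50.0}
```

In practice the same pattern scales up: extraction pulls raw records from the collection layer, SQL aggregates them into KPIs, and Python code loads the results into warehouse tables.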

Pipeline Orchestration & Monitoring

  • Develop and maintain Apache Airflow DAGs for scheduling and orchestration
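An Airflow DAG of the kind mentioned above might look like the following sketch. Everything here is hypothetical (the DAG id, task names, and callables are invented for illustration), and it assumes Airflow 2.x, where `PythonOperator` lives in `airflow.operators.python` and the `schedule` argument replaces the older `schedule_interval`.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def collect():
    # Placeholder for a data-collection step (e.g. pulling records
    # from MongoDB/MariaDB or parsing CSV/JSON/XML files).
    pass


def compute_kpis():
    # Placeholder for a KPI-computation step in Python/SQL.
    pass


# Hypothetical daily pipeline: collect raw data, then compute KPIs.
with DAG(
    dag_id="ni_daily_kpis",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    collect_task = PythonOperator(task_id="collect", python_callable=collect)
    kpi_task = PythonOperator(task_id="compute_kpis", python_callable=compute_kpis)

    collect_task >> kpi_task  # run collection before KPI computation
```

A DAG file like this is configuration-as-code: Airflow's scheduler imports it, builds the task graph, and handles retries, scheduling, and monitoring around the plain Python callables.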

Analytics & Visualization

  • Contribute to Apache Superset dashboards and reports
  • Develop new metrics and insights across physical, logical, and performance layers
  • Support users in building custom queries and visualizations

What We’re Looking For

Required:

  • Currently pursuing a degree in Computer Science, Software Engineering, Information Systems Engineering, Industrial Engineering & Management, or a related field
  • Programming fundamentals and experience with Python
  • Basic understanding of SQL and relational databases
  • Familiarity with Linux environments
  • Curiosity and eagerness to learn new data engineering and analytics technologies

Nice to Have:

  • Experience with data processing or analytics projects
  • Knowledge of MongoDB, MariaDB, or other databases
  • Familiarity with Apache Airflow or similar orchestrators
  • Experience with Python data libraries (Pandas, NumPy)
  • Understanding of networking concepts (OTN, L2/L3 services, tunnels, KPIs)

What You’ll Learn

  • End‑to‑end data engineering using Python, SQL, and Airflow
  • Large-scale data ingestion, transformation, and KPI computation
  • Working with structured and unstructured datasets across multiple DBs
  • Data warehouse modeling and query optimization
  • Building dashboards and analytical views using Apache Superset
  • Handling production pipelines and debugging distributed systems
  • Integrating AI‑driven capabilities for analytics and automation

Responsibilities

  • Implement features across data pipelines, transformations, and analytics components
  • Write clean, maintainable Python and SQL code
  • Create and maintain charts, dashboards, and visual insights in Superset

Please Note:

All qualified applicants will receive consideration for employment without regard to race, age, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, disability, or any other characteristic protected by applicable law.
Ribbon Communications