
Data Engineer

Skills
  • Python
  • SQL
  • AWS
  • GCP
  • Docker
  • Kubernetes
  • Airflow
  • BigQuery
  • DBT

Join us in shaping the future of online protection.


Guardio protects millions of people from the threats they never see coming, like phishing scams, financial fraud, and the small cracks in online security that cause big problems. As a B2C company, we build smart, intuitive tools that help individuals and small businesses stay safe, effortlessly. We’re a fast-growing team of over 100, united by a shared goal: to make digital life safer for everyone. Our culture moves quickly but stays human, rooted in transparency, feedback, and collaboration. And yes, we have a good time while we’re at it, with team dinners and traditional karaoke nights.

If you’re ready to work on meaningful challenges, grow fast, and help shape what online safety looks like for the next million users, you’re in the right place.



So, what's the job?


We are seeking an experienced Data Engineer to join Guardio’s Data Team. In this role, you will take ownership of building and optimizing the infrastructure, tools, and data processes that empower the entire organization. You'll collaborate with cross-functional teams to ensure reliable data pipelines, efficient integrations, automation, and accessibility of high-quality data to drive business growth and operational excellence.



You will:


  • Take technical ownership of data infrastructure components, including ingestion pipelines, data transformations, and storage solutions
  • Design and implement scalable data architectures that support analytics, reporting, and automation across various domains (User Acquisition, Growth, etc.)
  • Develop and maintain integrations with internal systems and third-party APIs to collect, transform, and serve data across platforms
  • Collaborate with stakeholders to define data requirements, troubleshoot issues, and ensure data quality and integrity
  • Monitor, debug, and optimize pipeline performance, ensuring system reliability and minimal downtime
  • Develop tools and automation to streamline data workflows, enhance team productivity, and reduce manual effort



Sounds great! Am I the right fit?


Well, our guess is you have a good chance of being that person if you check as many of these boxes as possible:

  • 3+ years of experience as a Data Engineer
  • Strong understanding of data pipeline design, ETL/ELT workflows, APIs, and data integration patterns
  • High proficiency in SQL and experience with data warehouse modeling
  • Solid programming skills in Python for building pipelines and automation
  • Experience with cloud platforms such as GCP or AWS
  • Familiarity with modern data stack tools and platforms such as BigQuery, DBT, and Airflow
  • Experience working with Kubernetes and Docker (an advantage)
  • Strong problem-solving skills and the ability to work effectively with cross-functional teams
