Who we are
Fido empowers millions across Africa to take control of their finances with ease. As a leader in cutting-edge financial technology, Fido clears the way for building credit, securing instant loans, making smart investments, and obtaining tailored insurance. No banker’s hours, no hidden fees—just endless opportunities.
From city centers to rural communities, Fido is breaking barriers and creating financial freedom, providing access to innovative tools and services that foster growth and empowerment. By leveraging advanced technology, Fido is shaping a future of opportunity and financial inclusion across the continent.
Join the team and be a part of leading this transformative change, driving impact where it matters most.
What you will do
- Build data pipelines that collect and transform data to support ML models, analysis and reporting.
- Work in a high-volume production environment, making data standardized and reusable from architecture to production.
- Work with off-the-shelf tools including DynamoDB, SQS, S3, Redshift, Snowflake and MySQL, often pushing them past their limits.
- Work with an international, multidisciplinary team of data engineers, data scientists and data analysts.
Who you are
- At least 5 years of experience in data engineering / software engineering in the big data domain.
- At least 5 years of coding experience with Python or equivalent.
- SQL expertise, working with various databases (relational and NoSQL), data warehouses, external data sources and AWS cloud services.
- Experience in building and optimizing data pipelines, architecture and data sets.
- Experience with ML pipelines and MLOps tools.
- Familiarity with the data engineering tech stack: ETL tools, orchestration tools, microservices, Kubernetes, Lambdas.
- End-to-end experience: owning features from the idea stage through design, architecture, coding, integration and deployment.
- Experience working with cloud services such as AWS, Azure or Google Cloud.
- B.Sc. in computer science or an equivalent STEM field.