Who We're Looking For - The Dream Maker
We are looking for a hands-on data engineer who will build and develop our entire data engineering capability and infrastructure.
This role requires a combination of technical expertise, problem-solving skills, and strong communication abilities.
Your Arena
- Data Standardisation:
- Ensure that data is standardized, and lead efforts on that initiative (including historical data)
- Create live documentation for the existing data schema (main entities and each field)
- Ensure that the data schema is consistent, and participate in design reviews that involve database design
- Oversee the adoption of a common data access layer for our main DB entities (disputes, customer data, integrations, billing)
- Add and maintain a validation layer to prevent divergence from the schema and its closed sets of values
- Data Correctness & Integrity:
- Set up processes to ensure data correctness and validity based on business requirements
- Set up processes to detect data duplication in our database, and work with business stakeholders to remediate it
- Data Optimization and Operations:
- Monitor and operate database performance, index management, alerts, and scaling on a daily basis
- Data Engineering Technologies and Roadmap:
- Analyze and choose the right database and related technologies for our evolving needs, including data warehouses, data lakes, graph databases, relational and non-relational databases, and ETL tools.
- Be a knowledgeable source to consult on database design, optimization, and operations for our existing and future database technologies: MongoDB, DynamoDB, Rockset, PostgreSQL, Snowflake.
- Oversee the ETL and ELT processes used across the company, both inside R&D and in other departments (Operations, Marketing), using AWS Glue, Hightouch, and Airbyte.
- Design data tiering practices that balance the tradeoffs between cost, performance, flexibility, and query capability
- Data Security, Access Control and Regulation:
- Support our efforts to map out PII data in our database and limit access to it
- Implement encryption at rest and other practices to protect our databases
- Execute data deletion tasks, such as removal of personal data and customer (shop/account) data, as part of our SOC2/GDPR compliance
- Business Intelligence:
- Choose and configure a BI system that makes all of our data accessible in a single system to various stakeholders and departments in the company (R&D, Product, Operations, Finance, Sales, Marketing, Customer Success, etc.)
- Create ad-hoc and periodic reports and dashboards for stakeholders, on demand.
Requirements:
What It Takes:
- Proficiency in programming languages (Node.js/JavaScript, Python, Java)
- Experience with big data technologies
- Knowledge of database management systems, both relational (e.g., PostgreSQL) and non-relational (e.g., MongoDB, DynamoDB)
- Familiarity with data integration and ETL tools, such as Hightouch, Airbyte, and AWS Glue
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
- Familiarity with AWS ecosystem
- Data Cleaning / Quality Engineering
- Data Monitoring (including Infrastructure)
- Strong project management and organizational skills
Our Story
Chargeflow is a leading force in fintech innovation, tackling the pervasive issue of chargeback fraud that undermines online businesses. Born from a deep passion for technology and a commitment to excel in eCommerce and fintech, we've developed an AI-driven solution aimed at combating the frustrations of credit card disputes. Our diverse expertise in fintech, eCommerce, and technology positions us as a beacon for merchants facing unjust chargebacks, supported by a unique success-based approach.
Propelled by a recent $14 million funding round led by OpenView Venture Partners and key fintech investors, Chargeflow has embarked on a product-led growth journey. Today, we represent a tight-knit community of passionate individuals and entrepreneurs, united in our mission to revolutionize eCommerce and fight against chargeback fraud, marking us as pioneers in protecting online business revenues.