Description
We are seeking a talented Data Engineer.
The ideal candidate is self-motivated, a strong multi-tasker, and a proven team player.
You will be responsible for designing, developing, managing, and maintaining our data platform, including our Data-Lakehouse (S3 & Delta Lake / Iceberg & ClickHouse), ETL processes, and orchestration tools (Spark & Temporal Workflow).
What You Will Do
- Develop a scalable data platform integrating multiple sources for easy access.
- Design and enhance data tools (orchestration, governance, Data-Lakehouse, BI, etc.).
- Ensure smooth operation of data systems for analysts, scientists, and engineers.
- Optimize data pipelines (ingestion, processing, and output) in a microservices environment.
Requirements
Must Have:
- 2+ years of experience in a data engineering-related position.
- SQL expertise, including working with various databases, data warehouses, third-party data sources, and AWS cloud services.
- Proficient in Python, including Object-Oriented Programming (OOP) and Big Data processing.
- Experience in building, designing, and optimizing data pipelines.
- Self-driven, can-do attitude.
Nice To Have:
- Experience with Spark (big advantage).
- Experience with Open Table Format (Delta Lake / Iceberg).
- Experience with ClickHouse.
- Experience with Temporal Workflow.
- Familiarity with ERP systems.
- Familiarity with supply chain systems.