Commit Data is looking for a highly skilled Big Data Engineer to join our growing Data Group.
As a Big Data Engineer, you will build data-driven solutions with cutting-edge tools and large-scale data on AWS and other cloud infrastructures.
Job Responsibilities:
- Lead data solutions design and development for our various clients and projects.
- Design solutions end to end: understand client needs, model the data, choose the right tools, and define the interfaces/dashboards.
- Develop data pipelines, data lakes, and DWHs using Python, Spark, EMR, Glue, Athena, Airflow, dbt, and other leading technologies.
Requirements:
- 2+ years of relevant experience as a Big Data Engineer – a must.
- Experience with Python-based data pipelines and ETL/ELT tools.
- High proficiency in SQL – a must.
- Experience with the Microsoft Azure environment and tools:
  - Azure DevOps
  - Data Factory (ETL tool)
  - SQL Server (DWH)
- Databricks – a very big advantage.