
Data Engineer

Overview
Skills
  • Python ꞏ 5y
  • SQL
  • Power BI
  • Tableau
  • Snowflake
  • ETL ꞏ 5y
  • BigQuery
  • Redshift
  • Bedrock
  • BigQuery ML
  • Gemini
  • Looker
  • Q
  • QuickSight
  • SageMaker
  • Vertex AI
Commit Data is looking for a highly skilled Data Engineer to join our growing Data Analytics Department.

As a Data Engineer in a multi-cloud company, you will build data-driven solutions for our customers using cutting-edge data tools and large-scale data on AWS/GCP/Azure.

Job Responsibilities:

  • Lead the design and development of data solutions for our various clients and projects.
  • Design each solution by understanding the customer's needs, modeling the data, choosing the right tools, and defining the interfaces/dashboards.
  • Develop data pipelines, data lakes, DWHs, AI/ML models, dashboards, and reports using advanced tools and leading technologies.

Requirements:

  • 5+ years of relevant experience as a Data Engineer – a must.
  • Experience with Python-based data pipelines/ETLs and other ETL/ELT tools (such as Glue, Rivery, Data Factory, dbt).
  • High proficiency in SQL – a must.
  • Experience designing and developing DWHs in the cloud – Redshift/Snowflake/BigQuery – a must.
  • Experience with BI & visualization tools such as Tableau, QuickSight, Power BI, or Looker.
  • Knowledge of and experience with AI/ML (working with tools like SageMaker, Bedrock, Q, BigQuery ML, Vertex AI, or Gemini) – a big advantage.
  • Strong analytical and problem-solving skills with attention to detail.
  • Strong self-learning skills.
  • Fluent English.