Requirements
* University degree in IT combined with at least 13 years of relevant professional IT experience;
* At least 5 years of experience in relational database systems applied to data warehousing, including data warehouse design and architecture;
* At least 5 years of experience in code-based data transformation tools such as dbt (data build tool) and Spark;
* At least 5 years of experience in SQL, data integration, and ETL/ELT tools;
* Hands-on experience as a Data Engineer on a modern data platform, and with data analytics techniques and tools;
* At least 3 years of experience in Python and in orchestration tools such as Airflow or Dagster;
* At least 3 years of experience in data modelling tools, as well as online analytical processing (OLAP) and data mining tools;
* Experience with data platforms such as Fabric, Talend, Databricks, and Snowflake;
* Experience with containerised application development and deployment tools such as Docker, Podman, and Kubernetes;
* Excellent command of the English language.