Requirements
* University degree in IT or a relevant discipline, combined with a minimum of 17 years of relevant working experience in IT;
* Minimum 5 years of in-depth experience with Azure Data Lake Storage and Oracle databases;
* Minimum 5 years of proven expertise in developing data pipelines from REST APIs and in data integration tools and languages (such as Azure Synapse, PySpark, Python, SQL, KNIME);
* Minimum 5 years of proven expertise in processing JSON and GIS data;
* Minimum 2 years of experience with Microsoft Fabric, including Microsoft Fabric OneLake;
* Experience designing incremental loads, change data capture (CDC) processes, and automated schema evolution;
* Experience with CI/CD pipelines;
* Experience working within an Agile/Scrum framework;
* Excellent knowledge of working with REST APIs;
* Ability to implement robust data quality checks, logging, and monitoring in ETL processes;
* Ability to document ETL workflows, metadata, and technical specifications clearly and consistently;
* Familiarity with DevOps and version control best practices;
* The following certification is required: Microsoft Certified: Azure Data Engineer Associate;
* The following certifications are considered an asset: Microsoft Certified: Azure Solutions Architect Expert, Microsoft Certified: Azure Developer Associate, Microsoft Certified: Azure Database Administrator Associate;
* Excellent command of the English language; knowledge of French will be considered an asset.