The following tasks will be performed by the external service provider:
• Design and implement ETL processes
• Design and develop reports and dashboards
• Provide advice on DB design and performance
• Design and implement data governance procedures (including access, data quality, …)
• Create data architecture models; optimise data flows and integrate solutions
• Create and maintain Enterprise Data Warehouses (EDW) and complex Business Intelligence solutions (Data Lakes / Data Lakehouses)
• Conduct research to improve solutions and provide advice about new approaches and technologies (Data Virtualization, Data Mining, Linked Data, Artificial Intelligence, Big Data)
• Gather business requirements related to reports and dashboards
• Write technical documentation
• Participate in meetings with other teams
• Perform data quality tests
KNOWLEDGE AND SKILLS
The following skills and knowledge are required for the performance of the above-listed tasks:
• Good knowledge of data architecture frameworks
• Excellent knowledge of data warehouse design and architecture
• Very good knowledge of near-real-time data warehousing and/or change data capture technologies
• Excellent knowledge of relational and non-relational database systems
• Very good knowledge of data architecture best practices
• Excellent knowledge of business intelligence technologies, including ETL, online analytical processing (OLAP), analytics, reporting, etc.
• Knowledge of data modelling, graphical representation, abstraction and design
• Service- and client-oriented mindset
• Ability to solve problems, recognise opportunities, analyse options, draw conclusions and promote approaches
• Very good analytical capabilities and high-quality drafting skills
• Strong communication skills, both verbal (e.g. presentations, technical and non-technical workshops) and written (e.g. technical documents, reports)
• Leadership and conciliation skills, facilitation of group sessions
• Ability to organise and supervise the work of others
• Ability to apply high quality standards and to work efficiently and quickly
SPECIFIC EXPERTISE
• The following specific expertise is mandatory for the performance of the tasks:
o At least 7 years of specific expertise working with data warehouses (min. competence level 4)
o At least 7 years of specific expertise in relational and non-relational database systems, data modelling tools and SQL (min. competence level 4)
o At least 7 years of experience as a Business Intelligence developer, working with ETLs, data processing, and the design and implementation of reports and dashboards (min. competence level 4)
o At least 3 years of experience with Power BI or Qlik Sense (min. competence level 4)
o At least 5 years of specific expertise working with Python, of which at least 3 years on data extraction and transformation tasks (min. competence level 4)
o At least 3 years of experience working with Python-based open-source orchestration tools (e.g. Dagster, Apache Airflow) (min. competence level 3)
o At least 3 years of experience with DevOps tools (e.g. Docker, Kubernetes, ArgoCD) (min. competence level 2)
o At least 5 years of experience with data quality management (min. competence level 4)
• The following specific expertise is an asset (advantage) for the performance of the tasks:
o Experience with Microsoft Fabric.
o Experience with SAP Business Objects.
o Experience with AI/ML-powered analytics, including automated insights, natural-language query and predictive capabilities.
o Knowledge of new technologies, including cloud-based business intelligence solutions, Linked Data, Artificial Intelligence.
6. CERTIFICATIONS & STANDARDS
Certifications related to the business intelligence tools and/or the specific expertise mentioned above will be considered for this request. Project-management-related certificates are an advantage.
Project duration: 1100 days
Initial contract duration: 220 days