Position Title: ETL Expert
Location: Brussels, Belgium
Duration: 6 Months
Languages: English
Work mode: Hybrid
EU Candidates Only
Job Description:
We currently have a vacancy for an ETL Expert fluent in English to offer his/her services as an expert based in Brussels, Belgium. The work will be carried out either on the company’s premises or on site at the customer’s premises. In the context of the first assignment, the successful candidate will be integrated into the company’s Development team, which will closely cooperate with a major client’s IT team on site.
Your tasks
• Design, implement, and optimize ETL processes using PL/SQL, Python, and possibly SAS for data ingestion, transformation, and integration across various sources (Surveillance, CUP-MIS, CBAM);
• Install, configure, and maintain middleware platforms, including Tanzu, Docker, Kubernetes, and other containerized environments for secure and scalable data services;
• Administer and optimize Oracle Database systems, including schema management, indexing, performance tuning, and data security;
• Design, configure, and maintain data processing frameworks ensuring high-performance analytics and secure data handling;
• Automate data processing tasks using Python, shell scripting, and possibly SAS macros, improving operational efficiency and reducing manual intervention;
• Implement data integration solutions, including API-based ingestion, secure file transfers, and scheduled data synchronization;
• Establish and maintain data exchange platforms using API gateways and Kafka, ensuring secure, scalable, and efficient data flows.
Requirements
• University degree in IT or a relevant discipline, combined with a minimum of 15 years of relevant working experience in IT;
• Excellent experience in designing, implementing, and optimizing complex ETL processes using PL/SQL, Python, and possibly SAS;
• Excellent experience with secure installation, configuration, and management of middleware solutions (Tanzu, Docker, Kubernetes) and API integration (API gateways, Kafka) to support scalable, automated data services;
• Excellent experience managing Oracle Database environments, including advanced schema design, performance tuning, data encryption, and backup/recovery tailored to high-volume data environments;
• Excellent experience in secure data processing, including GDPR compliance, data anonymization, encryption, and secure data exchange between DataLab and other platforms;
• Experience in API-based data integration, secure file transfers, and real-time data streaming using Kafka for cross-platform data exchange;
• Experience in SAS environments, including CASLib for in-memory data processing, and in optimizing complex data transformations for analytics;
• Experience in designing and managing RESTful and SOAP web services, with hands-on experience in API gateways, versioning, and security;
• Experience with load balancers and reverse proxies for high availability, traffic distribution, and performance optimization of web and API services;
• Experience in designing, documenting, and maintaining scalable data architectures, including API specifications, security models, network diagrams, and data flow mappings;
• Experience in automating data processing and system deployments using scripting (Python, Bash);
• Experience in authentication protocols and identity services including Kerberos, OAuth, LDAP, and Active Directory;
• Experience in implementing secure user authentication, role-based access control, and integrating enterprise identity providers for centralized access management;
• Experience in technical documentation, including architecture diagrams, process documentation, and compliance reporting;
• Excellent command of the English language.