We're looking for a Senior Data Engineer to help build and scale the data foundations behind a fast-growing digital and innovation environment. You'll work in a modern hybrid setup with data coming from sensors, drones, geospatial sources, and analytical platforms. Your work directly enables advanced analytics, AI use cases, and real-time insights for mission-critical operations.
This role suits someone who enjoys designing robust data pipelines, deploying ML models, and shaping a high-quality data ecosystem that is reliable, scalable, and production-ready.
What you'll do
* Design and build end-to-end data pipelines and ETL flows for diverse structured and unstructured data sources.
* Develop performant data processing in Python and (Py)Spark.
* Bring AI/ML models into production together with data scientists.
* Work in a hybrid on-prem/cloud environment and implement containerized solutions (Docker, Kubernetes).
* Deliver clean, maintainable, production-ready code with clear documentation.
* Set up monitoring, logging, and lineage using modern tooling (Airflow, OpenLineage, Fluentd, Marquez, Collibra).
* Support unit, performance, and system testing for data workflows.
* Maintain metadata and ensure strong governance around datasets and pipelines.
* Collaborate closely with innovation, data and engineering teams to continuously improve data quality and architecture.
What are we looking for?
* 5+ years' experience as a Data Engineer in complex data environments.
* Strong knowledge of Python, (Py)Spark, and distributed processing.
* Experience putting AI/ML models into production and automating data processes.
* Experience working with PostgreSQL / pgvector and other relational or vector databases.
* Hands-on experience with monitoring, logging, and lineage tools (Airflow, OpenLineage, Fluentd, Marquez, Collibra).
* Experience with containerized solutions (Docker, Kubernetes).
* Experience building scalable REST APIs in Python (FastAPI, Flask).
* Strong problem-solving mindset, able to help build scalable and sustainable data solutions.
* Experience with cloud platforms (AWS, Azure, GCP).
* Experience in Agile/DevOps teams and with version control (Git).
* Solid knowledge of databases, data structures, and ETL concepts.
* Native-level Dutch (C2).
What do we offer?
Location: Brussels (hybrid, min. 1 day/week on-site)
Contract: Freelance or Permanent
Duration: 05/01/ – /07/2026