You will work on a large-scale data integration programme that modernises how critical asset data is exchanged between core systems in the energy domain.
The systems you help build and evolve handle information that is essential for correct consumption calculations, network cost settlement, and the proper functioning of the energy market.
You will join a senior engineering team focused on stream-based data processing and modern integration patterns. The role is suited to developers who enjoy deep technical challenges, distributed systems, and platforms where correctness and data integrity are non-negotiable.
Your responsibilities include:
* Designing and developing Java-based services for high-throughput data exchange.
* Implementing and maintaining stream-processing solutions using Kafka Streams.
* Working with both the Kafka Streams DSL and the Processor API (see the sketch after this list).
* Applying deep knowledge of Kafka internals to ensure reliability and correctness.
* Participating in pair programming and collaborative development practices.
* Ensuring data consistency, ordering, and exactly-once processing semantics.
* Writing and maintaining unit and integration tests.
* Collaborating with team members in an Agile delivery context.
* Supporting CI/CD pipelines and containerised deployments.
* Communicating complex technical concepts clearly to both technical and non-technical stakeholders.
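For context on the DSL and Processor API mentioned above, here is a minimal, illustrative Kafka Streams topology; topic names such as "asset-readings" and "validated-readings" are hypothetical placeholders, not names from this programme.

```java
// Illustrative sketch only: contrasts the declarative DSL with the
// record-at-a-time Processor API. Topic names are hypothetical.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class TopologySketch {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // DSL style: high-level, declarative operators.
        builder.stream("asset-readings", Consumed.with(Serdes.String(), Serdes.String()))
               .filter((key, value) -> value != null && !value.isBlank())
               .mapValues(String::trim)
               .to("validated-readings", Produced.with(Serdes.String(), Serdes.String()));

        // Processor API style (left as a comment): per-record control and
        // direct state-store access, reached from the DSL via process(),
        // with MyValidator as a hypothetical processor:
        // builder.stream("asset-readings").process(MyValidator::new, "my-store");

        System.out.println(builder.build().describe());
    }
}
```

In practice the DSL covers most routing and transformation work, while the Processor API tends to appear where custom state handling or fine-grained per-record control is needed.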
What are we looking for?
* You have at least 5 years of professional experience developing Java-based applications.
* You have a minimum of 3 years of hands-on experience with Kafka Streams.
* You have deep, demonstrable knowledge of Kafka internals, including partitions, replicas, and in-sync replicas (ISR); consumer groups and offset management; and exactly-once semantics (see the configuration sketch after this list).
* You have experience with pair programming.
* You are able to explain complex technical topics clearly and convincingly.
* You have experience working in Agile development environments.
* You are fluent in Dutch at CEFR level C2 (hard requirement).
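As a rough illustration of the internals knowledge listed above, here is a configuration sketch for exactly-once processing; the application id and bootstrap address are placeholders, not project specifics.

```java
// Illustrative sketch only: enabling exactly-once semantics in Kafka Streams.
import java.util.Properties;
import org.apache.kafka.streams.StreamsConfig;

public class ExactlyOnceConfigSketch {

    public static Properties streamsConfig() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "asset-data-exchange"); // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder

        // Exactly-once v2 wraps consuming, state-store updates, and producing
        // in a single Kafka transaction, so each input record affects the
        // committed output exactly once.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG,
                  StreamsConfig.EXACTLY_ONCE_V2);

        // Exactly-once implies an idempotent producer with acks=all: a write
        // is acknowledged only once the full in-sync replica set (ISR) has
        // it. Stated explicitly here to connect the config to the internals.
        props.put(StreamsConfig.producerPrefix("acks"), "all");
        return props;
    }
}
```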
Nice-to-Have
* Production experience with stateful stream processing, including windowing, joins, session windows, interactive queries, and RocksDB (see the sketch after this list).
* Confluent Certified Developer for Apache Kafka (CCDAK) certification.
* Experience with CI/CD pipelines, such as GitHub Actions, Azure DevOps, or Jenkins.
* Experience with Docker and Kubernetes.
* Experience with relational databases and SQL (e.g. Oracle).
* Experience with unit testing frameworks.
* Familiarity with the Azure cloud platform.
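As a sketch of the stateful features in the first bullet of this list, here is a hypothetical session-windowed count; the topic and store names are placeholders.

```java
// Illustrative sketch only: a per-key session-windowed count backed by a
// named state store (RocksDB by default), queryable via interactive queries.
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.SessionWindows;

public class SessionWindowSketch {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        builder.stream("meter-events", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
               // A session for a key closes after 30 minutes of inactivity.
               .windowedBy(SessionWindows.ofInactivityGapWithNoGrace(Duration.ofMinutes(30)))
               // Naming the store makes the aggregate reachable through
               // interactive queries; the default backing store is RocksDB.
               .count(Materialized.as("meter-session-counts"));

        System.out.println(builder.build().describe());
    }
}
```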
What do we offer?
Location: Brussels (Belgium)
Contract: Freelance or Permanent
Start date: 25 February 2026
End date: 25 February 2028