For a customer in Brussels, we are looking for 2 Data Engineers for a long-term assignment to support an increased workload on their open-source data stack.
RESPONSIBILITIES:
- Design, build, and maintain robust data pipelines for batch and analytical workloads.
- Develop data extractions and transformations using Airflow and dbt.
- Work with PostgreSQL for structured data storage and data ingestion.
- Use DuckDB for efficient analytical querying and local data processing.
- Design and manage Delta tables as part of a modern data lakehouse architecture.
- Contribute to finalizing and industrializing the data lakehouse setup in collaboration with Targetfire.
- Ensure data quality, reliability, and observability across pipelines.
- Document data models, pipelines, and operational procedures.
PROFILE:
- Strong knowledge of SQL, including inserting, updating, and transforming data in relational databases.
- Hands-on experience building data pipelines with Airflow.
- Experience with data transformation and modeling, ideally using dbt.
- Solid understanding of relational databases, particularly PostgreSQL.
- Experience working with containerized applications on Kubernetes or OpenShift.
- Familiarity with data lake and lakehouse concepts.
- Knowledge of data quality, monitoring, and observability practices.
LEVEL: Medior/Senior
START DATE: ASAP (February)
CONTRACT: Freelance
LANGUAGES: Fluent English
INTERVIEW PROCESS: 1 stage