We are looking for a talented and experienced Data Engineer to join our team dedicated to data processing, transformation, and management. In this freelance, full-time role, you will be responsible for designing, building, and optimizing scalable data pipelines and architectures that support business intelligence, analytics, and operational reporting.
This is a hybrid position based in the Brussels metropolitan area, offering flexibility for partial remote work.
Key Responsibilities:
* Design, develop, and maintain robust, scalable, and secure data pipelines (batch and real-time).
* Build and optimize ETL/ELT processes to ensure high-quality, reliable, and accessible data.
* Develop and maintain data models (conceptual, logical, and physical) aligned with business requirements.
* Design and enhance data warehouse and data lake architectures (e.g., Medallion architecture, star/snowflake schema).
* Ensure data quality, integrity, governance, and compliance with best practices.
* Collaborate closely with business stakeholders, data analysts, and BI developers to translate business needs into technical data solutions.
* Monitor and optimize the performance of databases and data workflows.
* Support data analytics and reporting initiatives by providing clean, structured datasets for visualization tools.
Required Qualifications:
* Proven experience in Data Engineering and large-scale data processing environments.
* Strong expertise in data modeling and database design.
* Solid experience with ETL/ELT frameworks and tools.
* Hands-on experience in building and optimizing data warehouses.
* Strong understanding of data analytics principles and data preparation for reporting and dashboards.
* Proficiency in SQL and Python.
* Experience with relational and/or distributed databases (e.g., PostgreSQL, SQL Server, Oracle, Snowflake).
* Experience with big data and/or cloud data platforms such as:
* Microsoft Azure (Azure Data Factory, Synapse, Fabric, Data Lake)
* AWS (Redshift, Glue, S3)
* Google Cloud (BigQuery, Dataflow)
* Familiarity with data orchestration tools (e.g., Airflow).
* Experience with version control (Git) and CI/CD practices is a plus.
Nice to Have:
* Experience with BI and visualization tools such as Power BI, Tableau, or Looker.
* Knowledge of streaming technologies (e.g., Kafka).
* Experience with data governance frameworks and data quality tools.
* Understanding of DevOps and DataOps practices.
* Knowledge of Agile/Scrum methodologies.
Profile:
* Strong analytical mindset and problem-solving skills.
* Ability to work autonomously in a freelance arrangement.
* Excellent communication skills and ability to collaborate with cross-functional teams.
* Fluent in English; French or Dutch is a plus.