We’re working with a small-scale tech company based in Ghent that is searching for an experienced Data Engineer to build scalable, reliable data platforms.
Design, build, and maintain scalable data pipelines and data platforms
Develop and optimize ETL/ELT workflows for ingesting and transforming data
Implement and maintain cloud-based data solutions on AWS, Azure, or GCP
Develop and maintain data models and data transformation layers
Ensure data quality, governance, and compliance within data pipelines
Optimize performance, scalability, and cost-efficiency of data infrastructure
Integrate data from multiple internal and external sources
Contribute to monitoring, testing, and reliability of data pipelines
~ 3–5+ years’ experience in Data Engineering
~ Proficiency in Python, Java, or similar programming languages
~ Experience building data pipelines using Spark, Hadoop, or similar big data frameworks
~ Experience working with cloud data platforms (AWS, Azure, or GCP)
~ Experience building batch and/or real-time data pipelines
~ Knowledge of data modeling and warehouse design
~ Experience supporting AI/ML data workflows
~ Familiarity with modern data architectures (Data Mesh / Data Fabric) is a plus
Collaborative and innovative engineering environment