Role: Freelance Databricks Engineer
Location: Brussels (2 days per month onsite, remainder remote)
Contract Length: 12 months
Rate: €950 per day
Client: Sector-leading banking organisation
Overview
We are partnering with a leading banking client in Brussels to engage an experienced Databricks Engineer for a high-impact data transformation programme. This role will focus on building and optimising scalable data pipelines and enabling advanced analytics and AI use cases on a modern data platform.
Key Responsibilities
* Design, build, and optimise data pipelines using Databricks and Apache Spark
* Develop and maintain scalable data solutions supporting analytics, reporting, and AI initiatives
* Work with large, complex datasets in a secure and regulated banking environment
* Collaborate closely with data engineers, data scientists, product teams, and business stakeholders
* Implement best practices for data quality, governance, and performance optimisation
* Support the evolution of the data platform, including architecture and tooling decisions
* Enable data and AI use cases through efficient data modelling and processing
* Contribute to CI/CD pipelines and automation for data workflows
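By way of illustration only (not part of the role description), the data-quality responsibility above often comes down to gate checks run before a table is published. This is a minimal, hypothetical sketch in plain Python; the column names and the 5% threshold are invented, and in practice such a check would typically run as a PySpark job or a Delta Live Tables expectation:

```python
# Hypothetical data-quality gate: reject a batch if too many values
# in a critical column are missing. All names/thresholds are illustrative.
from typing import Any, Iterable, Mapping


def null_rate(rows: Iterable[Mapping[str, Any]], column: str) -> float:
    """Return the fraction of rows where `column` is absent or None."""
    rows = list(rows)
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)


def passes_quality_gate(rows, column, max_null_rate=0.05) -> bool:
    """True if the batch is clean enough to publish downstream."""
    return null_rate(rows, column) <= max_null_rate


# Example batch with one missing balance out of three records.
sample = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": "A2", "balance": None},
    {"account_id": "A3", "balance": 250.5},
]
```

With this sample, `passes_quality_gate(sample, "balance")` is False (a third of the balances are missing, well above the 5% threshold), so the pipeline stage would halt rather than propagate bad data.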
Key Requirements
* Strong hands-on experience with Databricks in production environments
* Proven expertise in Apache Spark (PySpark/Scala)
* Solid experience building and maintaining data pipelines (ETL/ELT)
* Experience with cloud platforms (Azure preferred; AWS/GCP also relevant)
* Strong SQL and data modelling skills
* Experience working in complex, regulated environments (banking or financial services preferred)
* Understanding of data governance, security, and compliance requirements
* Ability to work independently in a remote-first setup while collaborating with distributed teams
Nice to Have
* Experience with Delta Lake and modern data lakehouse architectures
* Exposure to ML/AI workflows on Databricks
* Familiarity with orchestration tools (e.g., Airflow, Azure Data Factory)
* Experience with DevOps practices (CI/CD, version control, infrastructure as code)
Working Setup
* Remote-first role with 2 days per month onsite in Brussels
* Long-term contract with a leading financial institution
* Opportunity to work on large-scale, business-critical data initiatives