Data Architect
We are looking for an experienced Data Architect to support the transformation of a large legacy data environment into a modern, scalable, cloud-based data platform. This role plays a key part in shaping data architecture strategy and involves working closely with technical and business stakeholders, on behalf of an IT consultancy engaged with a European institution.
Key Responsibilities
* Define, maintain, and evolve a data architecture strategy that supports both Business Intelligence and AI-driven workloads.
* Design and deliver vendor-neutral cloud architectures aligned with open-source principles.
* Lead the design of a modern, scalable data platform, replacing a complex legacy system through a phased migration approach.
* Ensure architectural decisions align with data governance standards and the organisation’s cloud adoption strategy.
* Establish and oversee data management standards covering data quality, security, monitoring, and platform reliability.
* Ensure compliance with regulatory and audit requirements.
* Provide technical leadership, guidance, and mentoring to data analysts and data engineers.
* Support change management by guiding users and stakeholders through the migration journey.
* Produce and maintain detailed documentation of data architecture, platforms, and data assets.
* Assist with deployment, configuration, validation, and testing activities.
* Collaborate with other project teams and contribute to cross-functional meetings.
Required Skills and Experience
* Proven experience migrating legacy data platforms to modern cloud-based, open-source solutions, ideally using a Data Lakehouse approach.
* Strong expertise in designing scalable, flexible cloud data architectures.
* Hands-on experience leveraging AI-powered assistants such as Amazon Q to support innovative data platform design.
* Solid understanding of Kubernetes and containerised environments.
* Experience with relational databases such as PostgreSQL and Oracle, and non-relational technologies including Elasticsearch and MongoDB.
* Strong background in ETL and ELT pipelines using tools such as Spark, dbt, and Trino.
* Experience with orchestration tools like Airflow, Dagster, or Luigi.
* Good knowledge of data governance frameworks and tools such as DataHub or OpenMetadata, as well as data quality, security, access control, and compliance.
* Experience integrating systems via RESTful APIs.
* Familiarity with DevSecOps practices and CI/CD pipelines for data platforms.
* Strong communication skills, able to explain complex data architecture concepts to both technical and non-technical audiences.
* Experience using data and architecture modelling tools.
Contract Details
* Day Rate: Flexible
* Length: Long-term
* Language: English
* On-site Requirement: 1 to 2 days per week
If you are a Data Architect who enjoys working on complex transformation programmes and shaping future-proof data platforms, this is a strong long-term opportunity to make an impact.