Requirements
What do you need to succeed in this position?
* Master’s degree in IT and at least 13 years of IT experience (or Bachelor’s degree and at least 17 years of IT experience).
* Experience migrating legacy data systems (e.g., SAP Data Services, SAS Data Integration) to a modern cloud-based, open-source data platform (preferably a Data Lakehouse).
* Excellent knowledge of designing scalable, flexible, modern cloud-based and open-source data architectures.
* Experience with AI-powered assistants (like Amazon Q) for designing innovative data solutions.
* Strong hands-on experience with Kubernetes.
* Previous experience with relational and non-relational database systems (e.g., PostgreSQL or Oracle; Elasticsearch or MongoDB).
* Experience with ETL/ELT processes and related data ingestion and transformation tools (like Spark, dbt, Trino).
* Proficiency in data pipeline orchestration tools (like Airflow, Dagster, Luigi).
* Knowledge of data governance frameworks and tools (like DataHub, Open Metadata).
* Familiarity with data quality management, data security, access control and regulatory compliance.
* Proficiency with system-to-system integration via RESTful APIs.
* Experience with DevSecOps practices and tools related to data pipelines, including CI/CD for data infrastructure.
* Good knowledge of data modelling tools.
* Advanced English (C1) communication skills (written and spoken).