About the Opportunity:
One of our clients, a fast-growing MedTech innovator in the connected-health space, is seeking a Data Engineer to drive the development of scalable, compliant, and insight-driven data infrastructure supporting both commercial and clinical operations.
This is a hands-on role responsible for designing data pipelines, building robust models, and enabling self-service analytics that accelerate research, product development, and decision-making across multiple business units.
Key Responsibilities:
ETL & Data Ingestion
* Design and monitor pipelines ingesting data from multiple internal and third-party systems (e.g., CRM, ERP, e-commerce, telephony, and flat-file sources).
* Orchestrate workflows using AWS services such as Glue, Step Functions, and Airflow.
* Ensure data accuracy, manage schema evolution, and maintain full auditability.
Data Modeling & Normalization
* Build dimensional models and maintain a unified data layer across patient, order, and device domains.
* Implement privacy-by-design practices, including PII redaction and GDPR compliance.
* Version and document all models to support transparency and reproducibility.
Reporting & Analytics Enablement
* Develop and maintain dashboards for business and clinical teams.
* Enable self-service analytics through reusable components and scheduled reporting.
* Maintain a central data catalog and train stakeholders on data usage and governance.
AI/ML & Research Support
* Prepare and package curated datasets for R&D initiatives.
* Prototype and productionize machine-learning features, ensuring regulatory compliance.
* Collaborate with external partners under secure data-sharing protocols.
Business Unit Collaboration
* Serve as the primary data liaison for Marketing, Sales, Finance, and Regulatory.
* Deliver ad-hoc data requests in a timely and controlled manner.
* Proactively surface insights and recommend data-driven improvements.
Qualifications:
Core Data Engineering
* 3+ years building production ETL pipelines (dbt, Airflow, Glue, Spark); strong SQL and Python skills
Modeling & BI
* Dimensional modeling, Tableau or equivalent, and data governance best practices
Observability
* Experience with pipeline monitoring tools and automated testing frameworks
APIs & Scripting
* REST/GraphQL integration and secure authentication (OAuth 2.0)
Version Control
* Git-based CI/CD workflows and an infrastructure-as-code mindset
Soft Skills
* Excellent English communication, documentation discipline, and stakeholder management in regulated settings
Nice to Have: AWS SageMaker, Feature Store, or HealthLake; Amplitude or similar analytics schema; Salesforce SOQL; R for statistical analysis; ISO 13485 or SOC 2 controls.
What’s Offered:
* €65,000 - €75,000 base salary
* Meal vouchers
* Company phone with mobile subscription
* External training opportunities
* Family-wide DKV hospitalization insurance
* 13th month salary
* Performance-based bonus
* Up to 2 days of teleworking per week
* Flexible working hours