Three converging regulations define what compliant health AI looks like in Germany. Here is what they require, and how Loretta meets each one.
The GDNG, the EHDS, and the EU AI Act rest on a shared principle: health AI must be sovereign, auditable, and fair. Each adds its own specific obligations.
The GDNG (Gesundheitsdatennutzungsgesetz, Germany's Health Data Use Act) creates the legal framework for using health data for research, quality assurance, and AI training in Germany. It requires pseudonymisation, purpose limitation, and a legitimate basis for every processing operation.
It strongly favours privacy-preserving architectures such as federated learning, where patient data stays within institutional boundaries.
Loretta meets this with federated learning: data never leaves your institution, models travel to the data rather than the reverse, and every training run produces a full audit trail.
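To make the pattern concrete, here is a minimal federated-averaging sketch, illustrative only and not Loretta's implementation; the model, site data, and hyperparameters are hypothetical. Each site trains on records that stay on its own premises, and only model parameters are shared and aggregated.

```python
# Minimal federated-averaging sketch: each site trains on its own records,
# and only model parameters leave the institution (hypothetical example,
# not Loretta's production code).
import numpy as np

def local_update(weights, features, labels, lr=0.1, epochs=5):
    """One site's training step on data that never leaves the premises."""
    w = weights.copy()
    for _ in range(epochs):
        preds = features @ w                               # simple linear model
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_round(global_weights, sites):
    """Aggregate site updates; only parameters, never patient data, are shared."""
    updates = [local_update(global_weights, X, y) for X, y in sites]
    sizes = np.array([len(y) for _, y in sites], dtype=float)
    # Weighted average of parameters, proportional to each site's sample count.
    return np.average(updates, axis=0, weights=sizes)

# Two hypothetical hospitals with synthetic data.
rng = np.random.default_rng(0)
sites = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(2)]
weights = np.zeros(3)
for _ in range(10):
    weights = federated_round(weights, sites)
print("aggregated model parameters:", weights)
```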
The EHDS (European Health Data Space) establishes an EU-wide framework for primary and secondary use of health data. It promotes interoperability through standardised data exchange formats, requires audit trails for data access, and sets up national Health Data Access Bodies.
Article 54 explicitly prohibits secondary uses such as insurance underwriting and coverage decisions, and strict purpose limitation applies to every access.
Loretta provides a standards-based data layer with structured export formats and tamper-proof processing logs for every data access event.
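One common way to make a processing log tamper-evident is to have each entry commit to the hash of the previous entry, so any retroactive edit breaks the chain. The sketch below illustrates that idea with hypothetical field names; it is not a description of Loretta's log format.

```python
# Hash-chained (tamper-evident) access log sketch: each entry commits to the
# previous one, so altering history invalidates every later hash.
import hashlib
import json
import time

def append_entry(log, actor, action, resource):
    """Append an access event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "ts": time.time(),
        "actor": actor,
        "action": action,
        "resource": resource,
        "prev": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Recompute every hash; returns False if any entry was altered."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or recomputed != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, actor="researcher_42", action="read", resource="cohort/diabetes")
append_entry(log, actor="pipeline_7", action="export", resource="fhir/Observation")
print("log intact:", verify(log))
```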
Under the EU AI Act, health AI systems are generally classified as high-risk, whether under Annex III or, when embedded in regulated medical devices, under Annex I. High-risk status brings requirements for conformity assessment, data governance, explainability, human oversight, risk management, and post-market monitoring.
Providers must maintain a quality management system, keep technical documentation complete, and report serious incidents within 15 days.
Loretta's causal models provide interpretable recommendations, with built-in monitoring for model performance and outcome fairness. The platform is designed from the ground up for high-risk classification requirements.
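Post-market monitoring typically includes input drift detection. The following sketch uses the population stability index (PSI) to compare live inputs against the training-time distribution; the threshold and data are illustrative assumptions, not Loretta defaults.

```python
# Minimal sketch of post-deployment drift monitoring using the population
# stability index (PSI) between a reference window and recent inputs.
import numpy as np

def population_stability_index(reference, current, bins=10):
    """Compare binned distributions of one feature between two windows."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_pct = np.histogram(current, bins=edges)[0] / len(current)
    # Avoid division by zero and log(0) for empty bins.
    ref_pct = np.clip(ref_pct, 1e-6, None)
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))

rng = np.random.default_rng(1)
reference = rng.normal(loc=0.0, scale=1.0, size=5_000)  # training-time distribution
live = rng.normal(loc=0.4, scale=1.2, size=1_000)       # shifted production inputs

psi = population_stability_index(reference, live)
print(f"PSI = {psi:.2f}")
if psi > 0.2:  # common rule-of-thumb threshold for actionable drift
    print("drift detected: flag for human review")
```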
Key compliance milestones under the GDNG, the EHDS, and the EU AI Act are phasing in, and some obligations already apply. The table below maps each regulation's core requirements to the Loretta capability that addresses them.
| Regulation | Requirement | Loretta Capability |
|---|---|---|
| GDNG | Pseudonymisation and data protection safeguards (§6 GDNG, GDPR Art. 9) | Federated learning: models train locally, only encrypted parameters are aggregated |
| GDNG | Purpose limitation and data minimisation | Role-based access control with per-operation audit logging |
| GDNG | Legitimate basis for each processing operation | Configurable consent and legal basis mapping per data type |
| EHDS | Cross-border interoperability | HL7 FHIR-native data layer with standardised export formats |
| EHDS | Audit trail for secondary data use | Immutable processing logs with cryptographic verification |
| EHDS | Anonymisation and pseudonymisation | Built-in differential privacy and k-anonymity guarantees (see the sketch after this table) |
| EU AI Act | Explainability and human oversight | Causal models with interpretable intervention recommendations |
| EU AI Act | Risk management and post-market monitoring | Continuous model performance monitoring with drift detection |
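To illustrate the k-anonymity guarantee referenced above, a release check can verify that every combination of quasi-identifiers occurs at least k times before a dataset is made available for secondary use. The column names and the value of k below are hypothetical.

```python
# Hypothetical k-anonymity check: every combination of quasi-identifiers must
# appear at least k times before a dataset is released for secondary use.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k=5):
    """Return True if every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

records = [
    {"age_band": "40-49", "postcode_prefix": "10", "diagnosis": "E11"},
    {"age_band": "40-49", "postcode_prefix": "10", "diagnosis": "I10"},
    {"age_band": "50-59", "postcode_prefix": "20", "diagnosis": "E11"},
]
# False: the ("50-59", "20") group contains only one record.
print(is_k_anonymous(records, ["age_band", "postcode_prefix"], k=2))
```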
A practical guide to meeting GDNG requirements when deploying AI in German health organisations.