GDPR for health SaaS: Article 9 without the panic

Oscar Rovira
6 May 2026
4 min

Why Article 9 matters

The General Data Protection Regulation (GDPR) reserves a stricter regime for special categories of personal data — the well-known Article 9. Health data is one of them: any information relating to the physical or mental health of a natural person, including the provision of healthcare services, which reveals information about their health status.

An aesthetic clinic that receives a WhatsApp message like "I have rosacea, is laser an option?" is receiving — whether they intend to or not — a piece of health data. If that message passes through an automated system (an AI agent, a CRM tool, an autoresponder template), the system is processing Article 9 data. The legal bar moves up a notch.

Three questions you must be able to answer

Before signing with any SaaS vendor that touches patient conversations, ask three concrete questions. If they can't answer clearly, don't sign.

1. Where is the data stored?

GDPR allows processing inside the European Economic Area without friction. Outside — United States, India, China — the transfer must satisfy specific mechanisms (Standard Contractual Clauses, adequacy decisions, etc.). In 2020 the CJEU invalidated the EU–US Privacy Shield (Schrems II); the 2023 Data Privacy Framework reopened the flow but with reinforced obligations.

Acceptable answer: "Servers in Frankfurt, Dublin or Paris, with EEA-based providers." Unacceptable answer: "In the cloud," with no further detail.
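In practice, data residency is something you can enforce in code rather than trust to a contract alone. A minimal sketch, assuming AWS-style region codes: the allowlist and the function name `assert_eea_storage` are illustrative, not a legal determination of which regions qualify.

```python
# Hypothetical sketch: fail at startup if data would be stored outside the EEA.
# Region codes follow AWS naming; the allowlist is illustrative only.

EEA_REGIONS = {
    "eu-central-1",  # Frankfurt
    "eu-west-1",     # Dublin
    "eu-west-3",     # Paris
}

def assert_eea_storage(region: str) -> None:
    """Raise if the configured storage region is outside the EEA allowlist."""
    if region not in EEA_REGIONS:
        raise ValueError(
            f"Storage region {region!r} is outside the EEA allowlist; "
            "a transfer mechanism (SCCs, adequacy decision) would be required."
        )

assert_eea_storage("eu-central-1")  # Frankfurt: passes silently
```

Checking this once at boot, rather than per request, means a misconfigured region can never silently process a single message.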

2. Is there a signed DPA with subprocessors?

When a clinic uses a SaaS, the clinic is the controller and the SaaS is the processor. A signed Data Processing Agreement (DPA) is required. If the SaaS in turn uses other services (Resend for email, OpenAI for the language model, AWS for hosting), those are subprocessors and each one needs its own DPA in the chain.

Acceptable answer: the vendor sends you their standard DPA, lists subprocessors publicly, and notifies you in advance of any change. Unacceptable answer: "we'll handle it on our side".
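A public subprocessor list is easy to keep honest if it lives in code. A minimal sketch, assuming a registry like the one below — the subprocessor names come from the article, but the `dpa_signed` statuses are invented purely to show the check working:

```python
from dataclasses import dataclass

@dataclass
class Subprocessor:
    name: str
    purpose: str
    dpa_signed: bool

# Illustrative registry: names match the article's examples,
# DPA statuses are invented for demonstration only.
SUBPROCESSORS = [
    Subprocessor("AWS", "hosting", dpa_signed=True),
    Subprocessor("Resend", "transactional email", dpa_signed=True),
    Subprocessor("OpenAI", "language model", dpa_signed=False),
]

def missing_dpas(registry: list[Subprocessor]) -> list[str]:
    """Return the names of subprocessors without a signed DPA in the chain."""
    return [s.name for s in registry if not s.dpa_signed]
```

Running `missing_dpas` in CI turns "every subprocessor has a DPA" from a claim into a test that blocks deployment when the chain is broken.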

3. What happens when you request erasure?

Article 17 (right to erasure) is a real right, not a decorative one. If a client of yours asks you to erase all their data, you must be able to do it — and the SaaS must be able to do it on the systems it manages. If a vendor tells you "conversations stay in history, they can't be deleted individually", you have a problem.
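What "being able to do it" looks like in a processor's own systems can be sketched in a few lines. Everything here is hypothetical — the store layout, the subprocessor fan-out, and the function name `erase_client` are invented for illustration, not a description of any particular product:

```python
# Hypothetical Article 17 flow: erase a client's records from the processor's
# own store, then notify each subprocessor, and return an auditable receipt.

from datetime import datetime, timezone

DB = {
    "conversations": {"client-42": ["I have rosacea, is laser an option?"]},
    "contacts": {"client-42": {"phone": "<redacted>", "name": "<redacted>"}},
}

SUBPROCESSORS = ["hosting", "email", "llm"]

def erase_client(client_id: str) -> dict:
    """Delete all records for client_id and return a receipt for the audit log."""
    tables_erased = []
    for table, rows in DB.items():
        if rows.pop(client_id, None) is not None:
            tables_erased.append(table)
    # In practice each notification is an API call governed by that
    # subprocessor's DPA; here it is simulated.
    return {
        "client_id": client_id,
        "tables_erased": tables_erased,
        "subprocessors_notified": list(SUBPROCESSORS),
        "erased_at": datetime.now(timezone.utc).isoformat(),
    }
```

The receipt matters as much as the deletion: when a data protection authority asks, "we erased it" needs a timestamped record, not a recollection.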

The cost of doing it right

A GDPR-compliant system for the health sector isn't more expensive than a non-compliant one — it requires more rigour in the setup. You can have an elegant, cheap AI agent up in a day, but if it processes data in the United States without a valid legal basis, at some point the data protection authority or a client will ask, and you won't have an answer.

The real investment is in the initial phase: three weeks to validate the legal flow, sign DPAs, configure retention, and document legal bases. From there, operating cost is the same.

What we offer at Auratech

Our AI-agent product for clinics complies with Article 9 because it was designed that way from day one: servers in Frankfurt, standard DPA signed with each clinic, public subprocessor list, and data subject rights (access, rectification, erasure, objection) operable from the first month. It's not an add-on; it's a requirement of the product itself.

If you're evaluating vendors, we're happy to receive your compliance questionnaire and answer it in writing, before any demo.
