
Picture this: you’re reviewing a vendor contract for a shiny new generative AI tool, and your DPO pings you: “Does this comply with GDPR?” You pause. Because when it comes to GenAI and data protection, the answers aren’t always obvious.
Generative AI is rewriting the rules on how we handle personal data – quite literally. Models can ingest, memorise, and reproduce personal information with surprising ease. And yet, the law they're bumping up against – the GDPR – predates them. So what’s an in-house lawyer to do?
Let’s unpack where the friction lies – and what you can do about it.
1. GenAI and personal data: the sneaky overlap
GDPR applies when personal data is processed. Simple. But with GenAI, it’s not always clear when that happens.
- Training data: If personal data is used to train a GenAI model (especially scraped from public sources), the GDPR kicks in.
- Outputs: If a prompt returns information that identifies someone – bingo, personal data.
- Inference: Even if outputs seem anonymised, there’s still risk if an individual’s identity can be inferred by combining them with other available data.
The kicker? Many GenAI providers don’t disclose what training data they’ve used. So assessing data protection risks becomes a guessing game.
2. Lawful basis: consent isn’t the easy fix
Under GDPR, you need a lawful basis for processing personal data. But here’s where GenAI gets sticky:
- Consent: Nearly impossible to obtain from every data subject whose information was scraped from the internet.
- Legitimate interest: Risky, unless you can clearly show a balance between business use and individual rights.
- Contractual necessity: Doesn’t apply to training models, only to outputs tied to a service.
Even where providers claim to anonymise training data, the Article 29 Working Party’s guidance on anonymisation makes clear that data remains personal data if reidentification is reasonably possible.
3. Data subject rights: easier said than done
Let’s say someone invokes their right to be forgotten. Or asks to access the data a GenAI model holds on them. Sounds fair – but…
- Access: Most GenAI tools can’t isolate what they “know” about a specific person.
- Erasure: Forgetting is hardwired into the GDPR – but not into machine learning models. Once personal data has shaped a model’s weights, you can’t always delete it retrospectively.
- Correction: If a model hallucinates false info about someone, how do you correct it?
The result? A growing tension between GDPR rights and GenAI architecture.
4. The controller-processor puzzle
Under GDPR, the controller decides why and how data is used; the processor acts on their behalf. But with GenAI, roles blur.
- Who’s who? A company deploying a GenAI tool could be a controller for inputs and outputs – but what about the training stage?
- Joint controllers? Where vendor and customer jointly determine the purposes and means of processing, you might both be on the hook.
This matters because responsibilities – and liabilities – flow from your role.
5. So, what should in-house counsel do?
Let’s be honest: you’re probably not going to reinvent GDPR for the AI age. But you can still protect your business.
- Audit your GenAI use: Map where personal data could be involved – in prompts, outputs, or vendor tools.
- Ask your vendors tough questions: What data was used to train this model? Where is it stored? Can data be deleted or excluded?
- Update your privacy notices: Be transparent with users if personal data could be processed via GenAI tools.
- Push for privacy by design: Advocate for models that support data minimisation and subject rights from the outset.
- Stay close to evolving guidance: Regulators are watching. Keep an eye on the EDPB and national authorities’ take on GenAI.
Final thought
GDPR wasn’t written with GenAI in mind – but it’s still the law of the land. Until guidance evolves, in-house lawyers must operate in a grey zone, balancing innovation with compliance.
And remember: if your business is using GenAI, you are responsible for ensuring it plays by the rules – even if the tech is new, the obligations aren’t.
the plume press
THE NEWSLETTER FOR IN-THE-KNOW IN-HOUSE LAWYERS
Get the lowdown on legal news, regulatory changes and top tips – all in our newsletter made especially for in-house lawyers.