“The Data You Share Is Never Really Yours”: An Interview with a Former Tech Policy Advisor

Dr. Annika Hoffman spent seven years as a senior policy advisor to the European Data Protection Board before moving to academia. She now leads the Digital Rights Research Centre at Utrecht University. We spoke for ninety minutes over video call. She was direct in a way that regulators rarely are while still in post.


Let’s start with the obvious question. Did GDPR work?

It depends entirely on what you think it was supposed to do. If the goal was to force companies to write privacy policies and show cookie banners — then yes, absolutely, it worked. Every website you visit now has a popup you click through without reading. Mission accomplished, I suppose.

If the goal was to give individuals meaningful control over how their data is collected and used — which is what the regulation actually says — then the record is much weaker. The enforcement has been slow, the fines have been insufficient as a deterrent for companies operating at scale, and the consent mechanisms we ended up with are largely theater.

Theater in what sense?

Think about what informed consent requires. It requires that you understand what you are agreeing to, that you have a genuine choice, and that refusing consent does not come at a significant cost. None of those conditions are meaningfully met by a cookie popup. The average privacy policy takes eighteen minutes to read. Nobody reads it. The opt-out is three layers deep in a settings menu. And if you say no to tracking on a major platform, your experience degrades or the service is withheld entirely.

“Consent in digital environments is a legal fiction. We wrote the law assuming a level of comprehension and agency that doesn’t reflect how human beings actually interact with technology. That was a failure of imagination on our part.”

What should have been done differently?

We should have regulated the use of data, not the collection of it. The distinction matters enormously. Right now, the law says companies must tell you what they are collecting and get your permission. But what they do with it — how long they keep it, who they sell it to, how it is combined with other data sets, what decisions it informs — remains largely unregulated once consent is obtained.

You consent to give a fitness app your heart rate. That data is then sold to a data broker, combined with your purchase history, your location data, and your browsing behavior, and used to produce an insurance risk profile that you never see and cannot challenge. The consent you gave covered none of that. But it was technically legal.

Is there a technology that particularly concerns you right now?

Facial recognition, without question. We have normalized a surveillance capability in public spaces that would have been unthinkable fifteen years ago. There are cities in Europe where your face is being captured and matched to a database every time you walk past a camera. The legal basis for this is, in most cases, entirely unclear.

But I’d also say I’m watching the development of inference technology very closely. This is the ability to infer things you have not disclosed — your political views, your sexual orientation, your likelihood of developing a particular illness — from data that appears unrelated. You didn’t tell the algorithm you were depressed. But your scrolling patterns, your sleep times, your purchase behavior, and your location history collectively predicted it with 83% accuracy. Consent frameworks are not built for this at all.
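To make the mechanics concrete, here is a deliberately simplified sketch in Python. Everything in it is invented for illustration: the feature names, the weights, and the data are synthetic, and it stands in for no real product or study. What it shows is the general pattern Hoffman describes: a standard classifier recovering an attribute the user never disclosed from signals that were each harmless to share on their own.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical behavioral signals, none of them "health data" on their own:
# late-night scrolling minutes, sleep-onset variability, purchase frequency,
# distinct locations visited per week. All values here are synthetic.
rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 4))

# An undisclosed attribute that happens to correlate with those signals.
# The weights are invented; the point is that the label itself was never
# collected, only inferred.
latent = X @ np.array([0.9, 0.7, -0.4, -0.6]) + rng.normal(scale=0.8, size=n)
y = (latent > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"inference accuracy: {model.score(X_test, y_test):.0%}")

The specific accuracy figure is an artifact of the synthetic data; what matters is that no step in the pipeline required asking for, or consenting to, the sensitive attribute itself.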

What would you tell an ordinary person who wants to protect themselves?

Honestly? There is a limit to what individual action can achieve against structural problems. That said:

  1. Use a password manager and enable two-factor authentication on every account that matters. This is basic hygiene.
  2. Use a privacy-focused browser and search engine for your default browsing. Firefox with uBlock Origin, DuckDuckGo — these are easy changes with real impact.
  3. Be selective about which apps you install and what permissions you grant. Most apps do not need access to your contacts, location, and microphone simultaneously.
  4. Opt out of data broker databases where your jurisdiction gives you the right to do so.

But the more important answer is: vote, engage with the democratic process, and support organizations that litigate on digital rights. The GDPR, for all its limitations, was passed because civil society made it politically costly not to act. That is still how change happens.

Last question — is there any reason for optimism?

There is always reason for optimism if you take the long view. The fact that we are having this conversation publicly — that data rights are a mainstream political issue — is genuinely new. Five years ago this was a niche concern for technologists and privacy lawyers. Now it appears in party manifestos. That is real progress, even if the legislation hasn’t kept pace yet.

I also think the AI Act in Europe is more significant than people give it credit for. It is the first attempt anywhere in the world to regulate AI systems by risk category and require transparency about training data. It is imperfect and it will be contested. But it establishes a principle that these systems can be governed. That principle matters.


Dr. Annika Hoffman’s book, The Consent Economy, is published by MIT Press.
