Your Consent Doesn’t Scale
When your life becomes training data and forgetting becomes a subscription, the most radical thing left is to disappear.
Special announcement: I’ve got two keynotes ahead - On Sale Live (how live experiences have become the most powerful cultural currency in the attention economy) and CineEurope (Cultural ignition through theatrical reinvention). Send me a message and say hi. Would love to see you in person.
Please Don’t Remember Me Like This
You’re wearing a Bee wearable AI pendant. You just got it. Seems cool.
You forgot it was on — you usually do. It’s just part of your outfit now, like a watch or a lucky necklace from someone you don’t talk to anymore.
The fight had started with something small. Dishes, maybe. Something to do with how you spoke or didn’t speak that day. It got louder. Familiar phrases came out. Old ones. Afterward, everything went still in that way things do when it’s too late to take them back.
Later, you open the Bee app.
Just curious. Just to see what it picked up.
The summary reads: “A spirited household conversation between two cohabitants regarding division of tasks.”
Cohabitants.
You scroll the transcript. It’s all there — the raised voices, the careful pauses, the messy tangle of frustration and care that happens when two people try and fail to speak the same emotional language.
But the AI has no opinion about any of it.
It doesn’t weigh context or mood. It just logs. Categorises. Tags.
What felt like a rupture in your relationship is now a bullet point on a list of daily highlights.
That’s when it hits you: the machine remembers everything.
It just doesn’t understand any of it.
This isn’t some dystopian moment. You’re not being watched by the NSA or profiled by Palantir. You’re just — living. With a wearable that records. With an inbox that gets scraped. With posts from five years ago training a model you’ll never meet.
This is the new ordinary.
You talk. You type. You exist. And somewhere, a system absorbs it. Classifies it. Feeds it forward into someone else’s interface.
Not because you said yes, but because you never figured out how to say no.
The Universal Opt-Out is the attempt to change that. Not just legally — economically, intellectually, spiritually. It’s the line between experience and data, between memory and mining. It’s not just about removing yourself from a dataset. It’s about reclaiming the right to live unrecorded. Or at least, to choose when and how you’re remembered.
The Universal Opt-Out is a proposed mechanism that gives individuals the right to exclude their data — past, present and future — from being used to train AI models. It’s a structural reversal of the current default, where participation is automatic unless you actively resist. The goal is to make forgetting the baseline — and remembering something that requires permission.
It’s not just a technical setting. It’s a legal safeguard, an ethical stance, and, increasingly, a design challenge:
How do you tell a machine not to learn from you? How do you make that refusal portable, enforceable and respected — even after your data has been scraped?
Maybe this sounds dramatic. But when remembering becomes automated, so does forgetting.
And that, more than anything, might be what we lose: the gentle human right to let something fade.
Because “don’t train on my data” isn’t just a checkbox; it might be the most important sentence of the decade.
Today We’re Talking About
What it means to be remembered forever by a machine that never asked and doesn’t understand.
Why Meta’s defence — “we couldn’t contact all the authors” — is corporate-speak for “your consent doesn’t scale.”
How generative AI models became the new colonial archives: extracting memory, remixing culture, selling it back to us.
Why forgetting isn’t a glitch — it’s a survival mechanism, and the Universal Opt-Out is our tool to bring it back.
The idea of strategic forgetting: not deletion, but dignity.
Why your childhood posts may have trained the model replacing your job — and there’s no Ctrl+Z.
How “opt-out” is becoming a new kind of currency, and why the memory economy is already here.
Whether your ex’s Spotify breakup playlist is now helping train an AI therapist. (He didn’t get paid. Neither did you.)
Why your personal AI diary might get subpoenaed.
What it means to build etiquette for digital amnesia: mute buttons, forgetting zones and the right to disappear.
What happens when the most luxurious product of the 2030s… is not being remembered at all.
And why the most humane thing a machine could learn next isn’t how to speak — it’s how to forget.
THE ARCHIVE IS A WEAPON
“We Regret to Inform You That You’ve Been Trained”
Meta’s lawyers recently argued that they had no way to contact all the authors whose work helped train their LLaMA models. The dataset was too vast, too messy, too scraped-from-the-open-internet to allow for anything as quaint as permission.
In other words: your consent doesn’t scale.
This isn’t a legal glitch so much as a feature of the system.
Today’s most powerful AI models are built on mass unconsented memory. Books, tweets, forum posts, medical journals, product reviews, journal entries, teenage blogs — all processed, abstracted, embedded. This is how the machine learns: by devouring.
But not all memory systems are neutral. And not all forgetting is accidental.
Who gets remembered, who gets erased
If AI is the archive of the 21st century, then we have to ask: who’s building it? Who’s curating the memory? And who decides what’s worth remembering?
Philosopher Michel Foucault warned that archives are not just collections — they are systems of power. They determine what’s possible to know. What’s visible. What’s valuable. And what never happened, officially.
Jacques Derrida, in Archive Fever, added another twist: that every archive is also an act of forgetting. You record one thing and, in doing so, erase a thousand others. The archive doesn’t preserve memory — it shapes it. And in doing so, it does violence.
Today’s AI systems are archival machines at industrial scale. They sweep up vast swaths of human expression, flatten it into tokens and probabilities and call it knowledge. But there’s no context. No nuance. No consent. Just pattern recognition with a business model.
This is what some theorists call data colonialism — the extraction of human experience for computational profit. Not by force, but by infrastructure.
What the lawsuits are really about
The lawsuits are piling up.
The New York Times is suing OpenAI and Microsoft for using its articles to train models that now regurgitate paywalled content verbatim — undermining both its journalism and its revenue.
Getty Images is suing Stability AI for scraping 12 million copyrighted photos, including ones with visible watermarks, to train image generators that now bypass licensing altogether.
In the U.S., a federal judge recently ruled against AI startup Ross Intelligence, which had copied thousands of legal headnotes from Westlaw to train a legal assistant. The court rejected the fair use defence, noting that building a direct substitute using someone else’s curation “pushes beyond fair use” and undercuts the original market.
Translation: the courts are beginning to recognise that just because the data is there doesn’t mean it’s yours to use.
Still, most people will never sue. Most people won’t even know their work was used.
And that’s the deeper structure: if you’re not part of the memory, you’re excluded. If you are, you’re exploited.
THE RIGHT TO BE LEFT OUT
“Forgetting Isn’t a Bug. It’s the Whole Point.”
If the archive is power, then maybe the most radical act left is to refuse it. Not with a protest sign. Not with a court filing.
With silence. With absence. With no.
French philosopher Bernard Stiegler described modern technologies — from writing to photography to AI — as tertiary memory systems. They hold what we forget. They externalise what once lived in us. They let us remember more — and, in doing so, make us remember less.
The trade is subtle: more capacity, less control.
Because the more we offload memory to machines, the more we rely on them to tell us who we were. What happened. What mattered.
But forgetting isn’t a failure. It’s a function.
To forget is to heal.
To forget is to grow.
To forget is to survive the things memory won’t let us metabolise in real time.
Narrative vs. hoarding
Human memory doesn’t work like a hard drive. It’s narrative. It’s selective. It’s messy and merciful.
As Byung-Chul Han puts it: “Human memory is selective… it is narrative. Digital memory is cumulative.”
We forget in order to make meaning.
We edit the story of ourselves as we go — not because we’re unreliable, but because we’re human.
We don’t need to remember every text, every conversation, every version of ourselves. We need to remember the ones that let us go on.
But AI doesn’t edit. It archives. Every word is equal. Every post is preserved. Every scrap of experience is tokenised and transformed into training data — flattened into fuel.
There is no narrative arc. No emotional priority. No discretion.
The machine remembers everything. It just doesn’t know what any of it means. That’s the mismatch. We let things die. The machine reanimates.
Strategic forgetting: the right to burn the letter
So what does it mean to be remembered forever by something that doesn’t understand you?
It means a joke you regret, a blog post you wrote while depressed, a love letter you never sent — any of it might still be alive in the model.
Pulled up by a prompt. Repurposed into tone-matching. Echoed back by a chatbot that was never there.
No shame. No tenderness. Just recall.
To opt out, then, is not just to protect your data. It’s to declare that not all memory belongs in the archive. It’s to practise what we might call strategic forgetting.
It’s like burning a letter. Like letting a joke die in the group chat. Like deciding that not every moment deserves to be searchable.
This isn’t about censorship. It’s about curation.
It’s about defending the right not just to speak, but to let a memory rest.
Nietzsche warned that too much history can paralyse a society.
Viktor Mayer-Schönberger made a similar case for individuals: that when everything is preserved — every mistake, every half-formed thought — we’re frozen in place. Trapped by versions of ourselves we no longer recognise.
What AI now calls “training data,” we used to call heartbreak.
Maybe it still is.
And maybe the real future of memory isn’t total recall — it’s the right to be remembered only if you want to be.
TERMS OF DISSERVICE
“You Agreed to This While Asleep”
You never gave permission, exactly.
Or maybe you did — sometime in 2013, when you clicked “I agree” on a terms of service you didn’t read, in a browser window you closed before finishing the first paragraph. That click is now doing legal heavy lifting on behalf of some of the most powerful AI companies in the world.
We’ve entered an era of ambient consent.
You exist. You post. You type. You search.
The system logs it. Trains on it. Profits from it.
The legal scaffolding of this bad dream
Across Europe and the US, the rules are starting to catch up — or at least pretend to.
The EU AI Act and the Digital Single Market Directive now require that content owners explicitly “reserve their rights” if they want to stop their work from being used to train AI. In theory, that means artists and writers can opt out. In practice, they have to find the right form, learn the right language, file the right request — before their content is scraped.
The GDPR gives people the right to be erased, to object and to demand transparency around how their data is used. But once that data is embedded in a model, it’s already lost its shape — there’s no clear way to remove it from the weights.
In California, the CCPA/CPRA offers the right to opt out of the sale or sharing of personal data. And if AI training qualifies as “sharing,” that right should apply. But again: by the time you find out your diary entries from 2009 were in the training set, the model has already moved on.
The implementation problem
Most current opt-out systems are decorative.
robots.txt, ai.txt, DoNotTrain tags — these are the “keep out” signs on the front lawn of your website. But there’s no fence. No enforcement. And many scrapers don’t bother to look.
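For the curious, this is roughly what one of those signs looks like in practice. The crawler names below are real ones in use at the time of writing (OpenAI’s GPTBot, Common Crawl’s CCBot, Google’s AI-training token Google-Extended), but nothing about the file is binding. It is a politely worded request that any scraper is free to ignore.

```
# robots.txt: the "keep out" sign, not the fence.
# Compliance is voluntary; many scrapers never read it.

User-agent: GPTBot            # OpenAI's crawler
Disallow: /

User-agent: CCBot             # Common Crawl, a frequent source of training data
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /

User-agent: *                 # everyone else may still index the site for search
Allow: /
```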
Even when companies say they’ll honour these tags, it’s not retroactive. If your data has already been ingested, you’re not getting it back. There is no Ctrl+Z for memory.
So the legal conversation about training data isn’t really about compliance; it’s about philosophy.
Is remembering me without asking a breach of contract — or of identity?
Because when your old tweets train a new chatbot, or your childhood blog becomes a tone-matching prompt, the harm isn’t just commercial. It’s existential.
You’re being remembered by a system that doesn’t know who you are, what you meant or why you stopped posting.
Your past is now predictive infrastructure. And you never agreed to that.
WHO GETS PAID TO BE REMEMBERED?
“Opt-In Is the New Monetisation Layer”
There’s another path emerging. Not just opt-out — but opt-in, for a fee.
Some call it data dignity. Others call it a distraction from regulation.
But either way, the future is being priced.
Memory as labour
Jaron Lanier has argued for years that data isn’t exhaust — it’s labour. If AI systems are built on our conversations, our clicks, our patterns, then those patterns should be paid for.
Andrew Yang’s Data Dividend Project agrees. It imagines a world where your data — your digital memory — is treated like a resource. If companies want to train on it, they need to license it. If they profit, you get a cut.
Co-ops for memory
Some groups are already testing this.
Streamr and Swash let users sell anonymised browsing data, aggregated into collective pools.
The idea is simple: you and thousands of others form a data union. You sell access together. You get paid together.
It’s like a farmers’ co-op. But instead of milk, it’s your late-night TikTok rabbit holes.
Other models are even more granular:
Ocean Protocol lets people tokenise their data, put it on a blockchain and sell access via smart contracts.
Reclaim Protocol gives you control over specific types of data — like your Spotify listening history or Uber rides — letting you choose who can use it and when.
These systems turn consent into a programmable layer — and memory into a marketable asset.
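To make “programmable consent” concrete, here is a deliberately toy sketch, in plain Python rather than on any blockchain, of the idea these protocols gesture at. None of it is Ocean’s or Reclaim’s actual API; every class, field and value is made up for illustration. The point is only the shape: a licence object that says whose memory this is, what it may be used for, what reuse costs, and when the permission runs out.

```python
# A toy model of consent as a programmable layer.
# This is not Ocean Protocol's or Reclaim Protocol's real API;
# every class, field and value here is hypothetical.

from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DataLicence:
    owner: str                           # whose memory this is
    dataset: str                         # e.g. "spotify-history-2016"
    allowed_purposes: set[str] = field(default_factory=set)
    price_per_use: float = 0.0           # what reuse costs, if anything
    expires: datetime | None = None      # consent can have a half-life

    def permits(self, purpose: str) -> bool:
        """Refusal is the default: anything not explicitly allowed is denied."""
        now = datetime.now(timezone.utc)
        if self.expires is not None and now > self.expires:
            return False
        return purpose in self.allowed_purposes


# A would-be trainer has to ask, and the answer is machine-readable.
licence = DataLicence(
    owner="you",
    dataset="breakup-playlist-2016",
    allowed_purposes={"academic-research"},
    price_per_use=0.50,
)

print(licence.permits("ai-training"))        # False: never granted
print(licence.permits("academic-research"))  # True, for now
```

The design choice worth noticing is the default: anything not explicitly allowed is refused. That is the Universal Opt-Out in miniature.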
The future of licensing your past
If this scales, we may see:
Memory treated like music publishing — where every instance of reuse, remix or sampling pays the original creator.
Smart contracts that auto-pay contributors when their data helps train a model.
Memory marketplaces, where you license your heartbreaks like a sync deal for TV.
But there’s a tension underneath it all:
“Data-rich” individuals — influencers, experts, niche creators — will have more valuable memory to sell.
“Data-poor” people, whose online lives are less polished or less visible, may get nothing — or worse, be scraped for free because they don’t know how to opt out.
The system already remembers the rich better than the rest.
This just turns that into a feature.
And the weirdest part?
Your ex’s Spotify breakup playlist might be training an AI therapist right now.
He didn’t get paid. You didn’t consent. But someone, somewhere, is definitely monetising the vibe.
THE CULTURE OF NEVER FORGETTING
“You Are Being Recorded. Always.”
If the legal frameworks and economic models feel theoretical, the experience of being remembered by a machine is anything but.
You’re not debating copyright. You’re just living your life. Wearing a pendant that records it. Talking, moving, breaking down — and somewhere, quietly, your second brain is taking notes.
Welcome to the era of life-logging, passive surveillance and algorithmic diaries.
Welcome to the age of perfect recall, whether you want it or not.
Emotional side effects of total recall
The more the machine remembers, the more you become aware that you’re being written down.
You start watching yourself speak.
You hesitate mid-sentence, imagining the future summary.
You worry about how your grief will be formatted.
For some, this is convenient. For others, it’s deeply unsettling.
People report:
Anxiety, knowing their thoughts might be replayed back to them, out of context.
Re-traumatisation, when past moments resurface as sterile summaries.
A kind of performative dissociation — where you stop being present and start imagining how the machine is capturing you.
You’re no longer a person having a life.
You’re a character in a story your device is writing without you.
New etiquette for forgetting
As tools like Bee, Rewind and AI-enabled smart glasses creep into everyday use, a new social protocol is forming — one built around interpersonal opt-out.
“Mute the Bee” has become shorthand in some friend groups for “Please, not this. Not now.”
Some therapists and doctors now request patients not bring wearable recorders into sessions.
Intimate gatherings are adopting no-AI zones — not just no phones, but no transcription, no life-logging, no silent witness embedded in your shirt collar.
It’s a return to ephemerality as safety.
A reminder that some things only happened because they weren’t recorded.
A joke. A breakdown. A reconciliation.
All moments that used to fade, as they should.
FUTURES OF FORGETTING
“Premium Amnesia, Available by Subscription”
Somewhere in the near future, a subscription service will offer to scrub you from AI models.
A sleek dashboard. A toggle marked “Forget Me.”
An email receipt confirming that your college blog, your old SoundCloud raps, your DMs from 2016 — have all been queued for deletion across the major training sets.
This isn’t science fiction. It’s product design.
The desire to disappear has always existed.
What’s new is that it now requires infrastructure.
Forgetting-as-a-Service
Companies are already exploring the tech stack of deletion.
Tools that scan open-source models and datasets for your content, issuing takedown or opt-out requests on your behalf.
AI equivalents of credit score monitoring: tracking where you’ve been scraped, sold or simulated.
Entire startups building toward “machine unlearning” APIs — systems that can flag, isolate and purge specific data from trained models.
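There is no standard schema for any of this yet, so treat what follows as speculation: a sketch, in Python, of the kind of portable “forget me” request such a service might file on your behalf. Every endpoint-shaped field name is invented; the only real things in it are the laws it cites and the hash function.

```python
# A hypothetical "forget me" request, the kind a Forgetting-as-a-Service
# tool might file on your behalf. No such standard exists today;
# every field name below is invented for illustration.

import hashlib
import json
from datetime import datetime, timezone


def fingerprint(text: str) -> str:
    """Hash the content so you can point at it without republishing it."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


opt_out_request = {
    "subject": "a-portable-identifier-for-you",
    "issued_at": datetime.now(timezone.utc).isoformat(),
    "legal_basis": ["GDPR Art. 17 (erasure)", "CCPA/CPRA opt-out of sharing"],
    "content": [
        {
            "description": "college blog, 2009-2012",
            "fingerprint": fingerprint("the posts themselves would go here"),
        }
    ],
    "requested_actions": [
        "exclude_from_future_training",   # the easy part
        "unlearn_from_existing_weights",  # the hard, largely unsolved part
    ],
}

print(json.dumps(opt_out_request, indent=2))
```

The gap between those two requested actions is where the business model lives: staying out of the next crawl is easy; getting out of weights that already exist is an open research problem.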
Of course, it won’t be free. There will be tiers. You’ll pay to be forgotten faster. You’ll pay more to ensure the deletion sticks.
Forgetfulness will become a luxury product; privacy rebranded as premium.
Digital Memory Trusts and Posthumous Consent
The question doesn’t stop with the living.
In the coming years, wills may include “Digital Memory Trusts”: legal instruments specifying which AIs may remember you and how.
The right not to be reconstructed will emerge as a posthumous opt-out — your family can’t build a chatbot version of you unless you explicitly said yes.
Services may pop up that resurrect your digital self, for a fee — raising the legal and emotional question of whether a synthetic replica counts as identity theft or memorialisation.
We already have the first prototypes:
Chatbots trained on dead friends’ texts.
Voice clones of parents.
AI-written eulogies scraped from a decade of email drafts.
The grief industry is becoming computational.
Soon, memorialisation will be a licensed experience.
And the refusal to be simulated — the ability to stay dead — will become an act of control.
The Rituals of Letting Go
In this new landscape, we’ll need new rituals.
Not just deleting an ex’s number, but issuing a global memory purge.
Not just cleaning your apartment, but hiring a “mind cleaner” — a professional trained to remove you from the archives.
Digital funerals where the final act isn’t flowers, but signing a do not resurrect clause.
Data tombstones. Redacted legacies. Memory with a half-life.
All of this will seem strange — until it doesn’t.
Until forgetting becomes a sign of self-respect.
Until erasure becomes an option in the settings menu.
Until the most valuable product of 2030 isn’t intelligence, but absence.
Not being remembered might become the ultimate form of luxury.
A signal that you mattered enough to disappear on your own terms.
THE OBLIVION CLAUSE
“What the Machine Forgets Is the Most Human Thing About It”
Let’s be clear: the Universal Opt-Out isn’t just a legal tool.
It’s not a checkbox.
It’s not a preference panel.
It’s a demand.
A demand to be forgotten not because you were insignificant — but because you were real.
Because being remembered by a system that doesn’t know how to understand you isn’t intimacy.
It’s extraction.
It’s flattening.
It’s theft, dressed as permanence.
This is about survival
To opt out is to reclaim time.
To curate your past.
To say: this, and not that. Here, and no further.
It’s the right to decay.
To fall out of relevance.
To be irretrievable.
And that, strangely enough, might be the only humane way to live in a world where machines never forget.
Toward an ethics of forgetting
The future worth building isn’t one where memory is infinite.
It’s one where memory is earned.
Where being remembered is consensual.
Where forgetting is designed in — not patched on.
Imagine:
AI systems that forget on purpose.
Protocols for ephemerality, not just retention.
A cultural economy that values silence, pause, and expiration — not just documentation.
Maybe the next great innovation isn’t another memory model. Maybe it’s teaching the machine how to forget.
Not because it has to. But because it finally understands why.
Please Don’t Remember Me Like This
Maybe later — a week, a month, a year from now — you’ll open the app again.
Just curious. Just to see what it still remembers.
The summaries will be cleaner now. More confident. The model will have updated. The emotional tagging more precise, maybe. Less Bee, more therapist.
You’ll scroll through your life, flattened and timestamped.
And then, without ceremony, you’ll tap the settings icon.
You’ll find the toggle that says: Use this data to train future models?
And you’ll pause.
Because maybe by then, you’ll have realised something: You don’t want to be erased. You just want to be asked.
You want the right to opt in — on your terms. To decide what parts of you are remembered. To choose the version of your voice that gets carried forward. To give permission not as default, but as declaration.
Because the problem was never memory.
It was being remembered without your consent.
And maybe the next phase of this future isn’t deletion. It’s design.
A world where being forgotten is the baseline.
And being remembered is something sacred, negotiated, rare.
Something you say yes to. When you’re ready.
And not a second before.