Facebook Just Won a Competition to Read Your Mind. Literally. Hollywood is Next.
Meta’s TRIBE AI can predict how your brain will react to a scene it’s never seen, a breakthrough that could rewrite how films, TV, and streaming are made, marketed, and consumed.
Before we get to it, a programming note: Attention Economy Part 2 is coming later this week, so stay tuned.
Also, I’ve got a few public appearances on the horizon so if you’re there or around, say hi!
Opening Keynote at Content Canada in Toronto (3 Sept): Diving into Canadian content’s global future.
Speaking at the LA Tourism Market Forum (9 Sept), delivering The Attention Economy Changes Everything: LA in the Age of Infinite Distraction.
Opening Keynote at CineLATAM in Miami (15–18 Sept): Discussing why Latin America remains the region where theatrical cinema still feels like shared electricity.
Now, on to today’s shockingly real story. Here’s what we’re talking about:
Facebook just built an AI that can read your mind and proved it by winning a global neuroscience challenge.
Its billion-parameter TRIBE model predicts how your brain will respond to films and TV, even ones it’s never seen.
Forget focus groups, this is predictive neuro-engagement mapping, where your neural patterns become the dataset.
Once tied to consumer devices, it could guide how studios greenlight, edit, and market content in real time.
The leap from lab to Hollywood is shorter than you think, and the first movers will own the mental runway hits take off from.
Subscribers also get:
The short list of companies most likely to weaponise this first.
The hardware already in your home that could power it.
My M&A watchlist of neuro-tech startups that could be snapped up to make it all happen.
Somewhere in Paris, a billion-parameter transformer is watching Friends reruns and trying to guess exactly how your brain would light up when Ross says, “We were on a break.”
It’s called TRIBE, short for TRImodal Brain Encoder. It’s made by Facebook. And it just won the 2025 Algonauts Challenge, which is basically the Eurovision Song Contest for brain-reading AIs: except instead of singing, contestants feed hours of human fMRI scans into machines to see who can predict the squiggly patterns in your cortex when you watch TV.
The goal: build AI that can predict how the human brain responds to complex, real-world media, the holy grail for both neuroscience and, as you’re about to see, the entertainment industry.
In practice, this means hours of squiggly, colour-coded 3D brain maps captured as volunteers watch movies: a literal record of what lit them up, bored them, or made them cringe.
TRIBE doesn’t just get it right for the shows it’s seen before; it nails entirely new scenes. Out-of-distribution mind-reading. This is Minority Rerun.
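For the curious: encoding-model contests like Algonauts are typically scored on how well the predicted brain activity correlates with the measured activity, voxel by voxel. Here’s a toy illustration in plain Python — the numbers are invented and this is only the scoring idea, not Meta’s code:

```python
# Toy illustration (not Meta's code): encoding-model challenges are
# typically scored by how well predicted brain responses correlate
# with the measured ones, one voxel at a time.

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / ((vx * vy) ** 0.5)

# measured fMRI signal for one voxel across 5 time points (made up)
measured  = [0.1, 0.4, 0.9, 0.3, 0.2]
predicted = [0.2, 0.5, 0.8, 0.4, 0.1]  # the model's guess for that voxel

score = pearson(measured, predicted)
print(round(score, 3))  # → 0.938
```

A score of 1.0 would mean the model tracks that voxel perfectly; average this over tens of thousands of voxels and you have a leaderboard. TRIBE’s trick is that the score stays high even for footage the model never trained on.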
The Big Idea
Facebook — sorry, Meta — now has a research-grade model that can take the pixels, the soundtrack, and the script from a show or film and tell you how your brain will respond before you’ve even hit play.
Forget likes, clicks, or “Top 10 in Your Country Today”: this is predictive neuro-engagement mapping. Your skull is no longer a walled garden. Et voilà, it’s just another dataset.
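If you’re wondering what “take the pixels, the soundtrack, and the script and predict a brain” even looks like, here’s a deliberately cartoonish sketch. Every name and number below is invented; roughly speaking, the real thing swaps the fakes for pretrained video, audio, and language encoders and a billion learned parameters, not a random linear map:

```python
# Cartoon version of a TRIBE-style pipeline: encode each modality,
# fuse the features, predict one activation value per brain voxel.
import random

random.seed(0)

N_VOXELS = 4   # a real model predicts tens of thousands of voxels
FEAT_DIM = 2   # per-modality feature size (real encoders: hundreds+)

def fake_features(scene, modality, dim=FEAT_DIM):
    """Stand-in for a pretrained video/audio/text encoder.

    Deterministic pseudo-features, so the 'model' gives the same
    answer for the same scene every time.
    """
    rng = random.Random(sum(map(ord, scene + modality)))
    return [rng.uniform(-1, 1) for _ in range(dim)]

# the read-out a real model would learn from hours of fMRI recordings
weights = [[random.uniform(-1, 1) for _ in range(3 * FEAT_DIM)]
           for _ in range(N_VOXELS)]

def predict_brain_response(scene):
    # 1. encode each modality  2. concatenate  3. linear read-out per voxel
    fused = (fake_features(scene, "video")
             + fake_features(scene, "audio")
             + fake_features(scene, "text"))
    return [sum(w * f for w, f in zip(row, fused)) for row in weights]

# works even for a scene the "model" has never seen before
response = predict_brain_response("Ross says we were on a break")
print(len(response))  # one predicted activation per voxel
```

The hard engineering lives in steps 1 and 3: foundation-model encoders for each modality, and a read-out trained on hours of a volunteer’s brain scans. The sketch above just shows why a trained read-out generalises: any new scene becomes features, and features become a predicted brain.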
Hollywood, Meet Your New Focus Group
TRIBE-class models can:
Tell a showrunner exactly which moments will spike attention, drop it, or melt it into confusion.
Forecast whether that 7-minute drone shot of a beige Airbnb will soothe or sedate your target audience.
Decide which cut of your trailer maximises curiosity without tripping the “skip ad” reflex.
No more awkward test screenings in Burbank. Just upload your cut, get a neural heatmap, and brace yourself when Scene 17 lights up no one’s amygdala.
The Future Studio Notes
“The left temporoparietal junction isn’t firing enough in Scene 17. Can you make the goat explode sooner?”
“We’re losing orbitofrontal cortex activity in the final act. Add a shirtless Chris Hemsworth.”
“This subplot tested well with hippocampal recall pathways in 18–24s. More flashbacks.”
Shockingly, this is not speculative. This is exactly the kind of prescriptive power TRIBE unlocks once the lab-grade brain scans get replaced by cheaper biometric proxies, all of which your phone, TV, or headset will be quietly collecting anyway.
Why This Is a Big Deal for Film/TV/Streaming
Creative Darwinism: Scripts and cuts survive based on predicted neural ROI, not gut feel.
IP Moats: If you own the neuro-signature of “Marvel-level audience engagement,” you own a data moat no rival can clone.
Cultural Weaponisation: Streaming platforms could tune content to cognitively addict specific demographics. (Imagine TikTok’s algorithmic grip but at the level of synaptic firing.)
The Punchline
If this scales, the most powerful person in Hollywood won’t be a director, showrunner, or head of marketing. It’ll be the Chief Neural Engagement Officer, the one who decides exactly how to light, pace, score, and cut your story so the collective audience brain never drifts toward TikTok.
And when that happens, Facebook won’t just know what you like, but what you will like, and how to rewire your attention so you like it harder.
So enjoy your blissful ignorance while it lasts. Your thoughts are still your own… mostly. And somewhere, a studio exec will call it “audience-led storytelling” while quietly A/B testing your hippocampus.
And here’s where it gets uncomfortable: I’ve mapped out the companies, devices, and deals that could turn TRIBE’s mind-reading into Hollywood’s next competitive weapon, and the shortlist of who’s likely to move first.
Subscriber-Only: The Next Three Moves in Neuro-Optimised Entertainment
1. The First Movers to Watch
These are the players with the data stack, hardware footprint, and market power to operationalise TRIBE-class models fastest: