Scientists can now caption your thoughts. What could go wrong?
In a world where your phone already knows what you're about to type and your smart fridge judges your midnight snacking, Japan has decided to up the stakes. Enter "mind-captioning," the latest feat from researcher Tomoyasu Horikawa, author of a study published 5 November in the journal Science Advances—a man who apparently looked at brain scans and thought, "What if these were subtitles?"
Horikawa, working at NTT's Communication Science Laboratories, has built a system that turns complex mental images into full English sentences, even though the participants themselves spoke Japanese. Yes: the AI is officially more bilingual than you during vacation season.
How to translate a thought
The process sounds a bit like teaching a robot to gossip. Six volunteers sat in a scanner while watching 2,180 silent video clips—people running, things falling, animals behaving badly, the usual internet stuff. Their brain activity was logged meticulously.
Then:
- Large language models converted the videos' captions into numerical sequences.
- Smaller "decoder" models learned to match brain activity to these sequences.
- The decoders were then tested on completely new videos.
- A final algorithm stitched together the words that best matched the neural patterns.
Somehow, through this multi-layered magic trick, the system learned to generate full descriptions of what the participants were imagining, even without tapping into the brain's language circuits. As Horikawa put it, the method works "even when someone has damage around that language network."
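For readers who want a feel for how those four steps fit together, here is a deliberately simplified Python sketch. It is not Horikawa's code: the brain data and caption embeddings are random placeholder arrays, the decoder is plain ridge regression, and a nearest-neighbour lookup over a tiny candidate list stands in for the study's word-by-word sentence optimization. All shapes, names, and candidate captions are illustrative assumptions.

```python
# Minimal sketch of a "mind-captioning"-style pipeline (not the study's actual code).
# Placeholder random arrays stand in for fMRI recordings and language-model embeddings
# so the script runs standalone.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)

n_train, n_test = 2000, 5      # training / held-out video clips (placeholder sizes)
n_voxels, n_dims = 1000, 256   # fMRI voxels and caption-embedding dimensions

# 1) Placeholder data: brain responses to videos and the embeddings of their captions.
#    In the real study these come from fMRI scans and a large language model.
brain_train = rng.standard_normal((n_train, n_voxels))
caption_emb_train = rng.standard_normal((n_train, n_dims))
brain_test = rng.standard_normal((n_test, n_voxels))

# 2) Train linear "decoders" that map brain activity to caption-embedding features.
decoder = Ridge(alpha=1.0)
decoder.fit(brain_train, caption_emb_train)

# 3) Decode embedding features from brain activity evoked by completely new videos.
decoded_emb = decoder.predict(brain_test)

# 4) Toy "sentence stitching": pick, from a small candidate pool, the description whose
#    embedding best matches each decoded vector. (The paper optimizes word sequences
#    iteratively; nearest-neighbour search merely stands in for that step here.)
candidate_captions = ["a dog runs across a field", "a man bites a dog",
                      "something falls off a table", "a cat knocks over a cup"]
candidate_emb = rng.standard_normal((len(candidate_captions), n_dims))  # placeholder

best = cosine_similarity(decoded_emb, candidate_emb).argmax(axis=1)
for clip, idx in enumerate(best):
    print(f"clip {clip}: {candidate_captions[idx]}")
```

With real data, the candidate pool and placeholder embeddings would be replaced by a language model's actual sentence representations, which is what lets the decoded output read like a caption rather than a list of keywords.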
So if you can picture it, the AI can—eventually—write it. Whether that's incredible or mildly alarming depends on how comfortable you are with the phrase "brain-reading."
A thought translator with… conditions
The system is nowhere near ready to narrate your daydreams for Netflix. Horikawa himself says it's "not so accurate for practical use," and that while people worry it might expose private thoughts, "the current approach cannot easily read a person's private thoughts."
Also, it has been tested only on normal video scenes. Nobody knows what would happen if the mental content got weird. A dog biting a man? Fine. A man biting a dog? An open question.
The hope: helping people speak without speaking
Despite its quirks, experts are genuinely excited about the assistive potential.
Psychologist Scott Barry Kaufman thinks the study "paves the way for profound interventions" for people who cannot communicate verbally—from those with aphasia to ALS patients to non-verbal autistic individuals. But he also warns that "we have to use it carefully and make sure we aren't being invasive."
Because once a machine can caption thoughts, someone is eventually going to try livestreaming them.
The worry: mental privacy might become vintage
Ethics scholars are, predictably, concerned. Marcello Ienca calls mind-captioning "the ultimate privacy challenge," while others worry that our neural data—already a goldmine of personal details—could expose everything from early dementia to private anxieties.
Łukasz Szoszkiewicz warns that neuroscience is "moving fast" but that protections for mental privacy "can't wait." His proposed rulebook is not subtle: treat neural data as sensitive by default, demand explicit consent, and ensure the user controls when their brain is—or is not—being read.
In other words: a mental airplane mode, urgently required.
Welcome to the era of brain subtitles
Mind-captioning is still in its infancy. It needs vast amounts of data, fully cooperative participants, and a very patient AI that doesn't mind decoding thousands of silent dog videos. But its existence marks a new line in the sand—a moment where reading the human mind shifted from sci-fi speculation to early-stage prototype.
It could help millions communicate. It could revolutionize neuroscience. It could also, if left unchecked, become the most invasive technology ever invented.
For now, relax. No one can read your private thoughts.
Unless you're thinking really loudly.
