The Race to Create a Mind-Reading Algorithm
We’re getting closer to mind-reading algorithms that could understand what we’re thinking as we think it, translating thoughts into words. The research has major implications for people who have lost the ability to speak, but also for how society interacts with technology as a whole.
According to new research presented at the recent Society for Neuroscience conference, activity in a brain region called the supramarginal gyrus could help scientists translate people’s inner thoughts.
Caltech researcher Sarah Wandelt’s team set out to capture this “inner speech,” which the study defines as “engaging a prompted word internally” without moving any muscles.
The scientists presented the study’s participant, a man identified as FG, with a word, asked him to “say” the word in his head, and then had him speak the word out loud. At every stage, the electrodes implanted in FG’s brain recorded activity used to train machine-learning algorithms to recognize the pattern that corresponds to each word.
The team ran through a total of six words in English and Spanish, plus two made-up “nonsense” words. When they asked FG to think of the words again, the computer could predict which word he had in mind with up to 91 percent accuracy.
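The article doesn’t describe the team’s actual decoding pipeline, but the basic recipe (record neural activity for each trial, then train a classifier to separate the word classes) can be sketched in a few lines. Here is a minimal, hypothetical version using simulated firing-rate features and scikit-learn; the trial and channel counts are assumptions, not figures from the study.

```python
# Hypothetical sketch of word decoding from neural features.
# The real study's pipeline isn't described here; we simulate firing-rate
# features for 8 word classes and train a simple linear classifier.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_words = 8        # six real words plus two nonsense words
n_trials = 40      # repetitions per word (assumed)
n_channels = 96    # recording electrodes (assumed)

# Simulate per-trial firing rates: each word gets its own mean activity pattern.
word_patterns = rng.normal(0.0, 1.0, size=(n_words, n_channels))
X = np.vstack([
    pattern + rng.normal(0.0, 1.5, size=(n_trials, n_channels))
    for pattern in word_patterns
])
y = np.repeat(np.arange(n_words), n_trials)

# Cross-validated decoding accuracy; chance for 8 classes is 12.5 percent.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.1%} (chance: {1 / n_words:.1%})")
```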
As the number of possible word choices grows, the odds that the computer will pick the right one shrink. This means that scaling the program up to handle a larger vocabulary will be challenging. The researchers also need to make sure they can reproduce their results in other subjects. This is a difficult task because everyone’s brain is unique, and it’s tough to place the implant in precisely the same spot in different people. – Inverse
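To put that scaling problem in numbers: with N equally likely words, a decoder guessing at random is right only 1/N of the time. Eight words put chance at 12.5 percent, so 91 percent is far above baseline, but that baseline collapses as the vocabulary grows:

```python
# Chance accuracy for an N-way word classifier is 1/N, so the baseline a
# decoder must beat shrinks fast as the vocabulary expands.
for n_words in (8, 50, 1_000, 170_000):  # 170,000 is a rough English vocabulary size
    print(f"{n_words:>7} words -> chance accuracy {1 / n_words:.4%}")
```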
The number of words the machine-learning algorithm can recognize today is small. But as its vocabulary expands, the system could benefit people who can hear and think but have lost the ability to speak, such as stroke survivors or patients with advanced Lou Gehrig’s disease.
Ultimately, this research is central to the development of advanced brain-computer interfaces, like the ones enabling the Cyborg Olympics.
A brain-computer interface (BCI) is a direct communication pathway between a human brain and an external device. It allows a person to send commands to a device or system using their thoughts, bypassing the need for muscles to produce a physical action. BCIs can be used to control computers, prosthetics, or robots, and they have the potential to let people with paralysis or other conditions that impair movement or speech communicate and interact with the world around them.
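As a rough illustration of that definition, a BCI boils down to a loop: acquire a window of neural signal, decode the user’s intent, and send a command to a device, with no muscles involved. The sketch below is hypothetical; the signal source, decoder, and device interfaces are stand-ins, not any real vendor’s SDK.

```python
# Hypothetical BCI control loop: read a window of neural signal, decode an
# intent, and forward it to a device. All names here (NeuralSignalSource,
# IntentDecoder, Device) are illustrative stand-ins, not a real SDK.
import time
from typing import Protocol

class NeuralSignalSource(Protocol):
    def read_window(self) -> list[float]: ...   # e.g. 100 ms of channel data

class IntentDecoder(Protocol):
    def decode(self, window: list[float]) -> str: ...  # "left", "right", "rest", ...

class Device(Protocol):
    def execute(self, command: str) -> None: ...  # cursor, prosthetic, speech synth

def run_bci(source: NeuralSignalSource, decoder: IntentDecoder, device: Device) -> None:
    """Continuously translate brain activity into device commands."""
    while True:
        window = source.read_window()      # acquire raw neural activity
        command = decoder.decode(window)   # classify the user's intent
        if command != "rest":              # ignore idle periods
            device.execute(command)        # act with no muscle movement needed
        time.sleep(0.1)                    # ~10 Hz update rate (assumed)
```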
Although BCIs are still in the early stages of development, companies like Neuralink, Kernel, and Synchron are developing their own implants and high-tech devices, like this helmet that can read brain activity using near-infrared light.
BCIs are still a far-future concept, nowhere near practical for everyday use. However, they present us with a fascinating transhumanist vision of the future: one where brain-computer interfaces augment our brains’ capacity, let us interact with our computers and devices via thought, and ultimately bring us closer to the tech that we use most.