Scientists unveil yet another AI that converts your thoughts into sentences

WHY THIS MATTERS IN BRIEF

Being able to use brain machine interfaces to communicate with one another, as well as control machines, will revolutionise society in a range of ways.

 


Having a machine read your dreams and thoughts and decode them into images, video, or text, or being able to use them to control a fleet of F-35 fighter jets, control prosthetic limbs, or even play games telepathically, all used to be something you could only see in a science fiction movie. But they all became science fact a while ago, and in time the quality of their outputs is going to improve dramatically, something that will herald a breakthrough not only in human-machine communication but also in human-human communication in the form of telepathy and hive minds.

 


 

This week scientists in the US announced they’ve developed another Artificial Intelligence (AI) system that can translate a person’s thoughts into text by analysing their brain activity. It’s the latest in an increasingly long line of developments, one that recently also saw scientists decode even monkeys’ thoughts into text.

The researchers, from the University of California, San Francisco, developed the AI to decipher up to 250 words in real time from a set of between 30 and 50 sentences. The algorithm was trained using the neural signals of four women who already had electrodes implanted in their brains to monitor epileptic seizures.

The volunteers repeatedly read sentences aloud while the researchers fed the brain data to the AI to unpick patterns that could be associated with individual words. The average word error rate across a repeated set of sentences was as low as 3 per cent.
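For context, word error rate is the standard metric used to score speech and text decoders: the number of word-level substitutions, insertions, and deletions needed to turn the decoded sentence into the reference sentence, divided by the length of the reference. The snippet below is a minimal illustrative sketch of that calculation, not the researchers’ own evaluation code.

```python
# Minimal sketch of how a word error rate (WER) figure like "3 per cent"
# is typically computed: word-level edit distance between the decoded
# sentence and the reference, divided by the number of reference words.
# Illustrative only - not the authors' evaluation code.

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for word-level Levenshtein distance.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the quick brown fox", "the quick brown box"))  # 0.25
```

A 3 per cent error rate therefore works out to roughly one wrong, missing, or extra word in every 33 words decoded.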

 


 

“A decade after speech was first decoded from human brain signals, accuracy and speed remain far below that of natural speech,” states a paper detailing the research, published this week in the journal Nature Neuroscience.

“Taking a cue from recent advances in machine translation, we trained a recurrent neural network to encode each sentence-length sequence of neural activity into an abstract representation, and then to decode this representation, word by word, into an English sentence.”
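To make that description concrete, here is a rough sketch of the kind of encoder-decoder recurrent network the quote is describing: one RNN compresses the sentence-length sequence of neural activity into a single hidden representation, and a second RNN unrolls that representation one word at a time. It’s written in PyTorch with made-up layer sizes and names, so treat it as an illustration of the architecture rather than the published model.

```python
# Rough, illustrative sketch of the encoder-decoder idea described above:
# an RNN encodes a sentence-length sequence of neural activity into one
# abstract representation, and a second RNN decodes it word by word.
# All dimensions and names are hypothetical; this is not the published model.
import torch
import torch.nn as nn

class NeuralToTextSeq2Seq(nn.Module):
    def __init__(self, n_electrodes=256, hidden=512, vocab_size=250):
        super().__init__()
        # Encoder: reads the time series of neural features for one sentence.
        self.encoder = nn.GRU(n_electrodes, hidden, batch_first=True)
        # Decoder: emits one word at a time from the encoded representation.
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.embed = nn.Embedding(vocab_size, hidden)
        self.to_vocab = nn.Linear(hidden, vocab_size)

    def forward(self, neural_seq, target_words):
        # neural_seq: (batch, time, n_electrodes); target_words: (batch, words)
        _, state = self.encoder(neural_seq)       # abstract sentence representation
        word_inputs = self.embed(target_words)    # teacher forcing during training
        out, _ = self.decoder(word_inputs, state)
        return self.to_vocab(out)                 # (batch, words, vocab_size) logits

model = NeuralToTextSeq2Seq()
logits = model(torch.randn(8, 300, 256), torch.randint(0, 250, (8, 12)))
```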

The average active vocabulary of an English speaker is estimated to be around 20,000 words, meaning the system is still a long way from being able to understand regular speech. The researchers are also unsure about how well the technology will scale up, as the decoder relies on learning the structure of a sentence and using it to improve its predictions. Each new word added to the vocabulary increases the number of possible sentences the decoder has to choose between, which reduces the overall accuracy; for a sense of scale, even with the current 250-word vocabulary a ten-word sentence already has 250^10, or nearly a trillion trillion, possible word sequences.

 


 

“Although we should like the decoder to learn and to exploit the regularities of the language, it remains to show how much data would be required to expand from our tiny languages to a more general form of English,” the paper states.

They then go on to suggest that one possibility could be to combine it with other Brain Machine Interface (BMI) technologies, such as Elon Musk’s Neuralink or Facebook’s mind reading technology, which use different types of implants and algorithms to read people’s minds. They also suggest that the technology could one day be used to enable telepathic communication between people, something that’s already happened not once but twice, in a move that will bring about “the next great wave in human-oriented computing.”
