Scientists can now ‘read’ your thoughts using an AI-powered model that is specifically designed to decode brain scans. 

The non-invasive breakthrough, developed by researchers at the University of Texas, could help those who are unable to speak or type to communicate for the first time. The method does not, however, decode language in real time.

The method works by feeding functional magnetic resonance imaging (fMRI) scans to the algorithm, which then reconstructs the arbitrary stimuli the person is hearing or thinking about into natural language.

Until now, this process has only been accomplished by implanting electrodes in the brain.

By analyzing the scans, the system produces the gist of a thought; it cannot decode word for word what a person is thinking.

The thoughts can be as simple as a single word, such as dog, or as complex as ‘I must walk to the dog.’

Previous research has shown that with complex thoughts, our brains break them down into smaller pieces – and each piece corresponds to a different aspect of the thought, Popular Mechanics reports.

This is the first non-invasive technique used to read brain signals. Previously this was only possible by implanting electrodes in the brain

The brain also has its own alphabet composed of 42 different elements that refer to a specific concept like size, color or location, and combines all of this to form our complex thoughts.

Each ‘letter’ is handled by a different part of the brain, so by combining all the different parts it is possible to read a person’s mind.

While the system cannot decode the brain scans word for word into what the individual is thinking, it produces the gist of the thought.

The system can also describe pictures a person was viewing while inside the MRI scanner.

The team did this by recording fMRI data from three parts of the brain that are linked with natural language while a small group of people listened to 16 hours of podcasts.

The three brain regions analyzed were the prefrontal network, the classical language network and the parietal-temporal-occipital association network, New Scientist reports.

The algorithm was then given the scans and compared patterns in the audio to patterns in the recorded brain activity, according to The Scientist.

And the system showed it was capable of taking a scan recording and transforming it into a story which, the team found, matched the gist of the narrated stories.

Although the algorithm is not able to recover every 'word' of an individual's thoughts, it is able to decipher the story each person heard.

The study, posted as a preprint on bioRxiv, gives an example of an original story: 'Look for a message from my wife saying that she had changed her mind and that she was coming back.'

The algorithm decoded it as: ‘To see her for some reason I thought maybe she would come to me and say she misses me.’

The system is unable to reproduce word for word what a person is thinking, but it is capable of conveying the gist of their thoughts.

This post first appeared on Dailymail.co.uk