Dogs are able to pick up on individual words in sentences spoken to them using similar computations and brain regions as human babies, a study has found.

As infants, we first learn to spot new words in a stream of speech, before we learn what each individual word actually means. 

To tell where each word ends and another begins, babies use complex calculations that keep track of which syllables appear together — and thus likely form words. 

By using a combination of brain imaging techniques, researchers from Hungary’s Eötvös Loránd University have shown that dogs are capable of similar feats.

This is the first time that the capacity to apply this so-called statistical learning to speech has been demonstrated in a non-human mammal.

The findings come in the same week that a study revealed that dogs tilt their heads when listening because it helps them to hear and process information more easily.

HOW AN EEG WORKS 

An electroencephalogram (EEG) is a recording of brain activity which was originally developed for clinical use.

During the test, small sensors are attached to the scalp to pick up the electrical signals produced when brain cells send messages to each other.

These signals are recorded by a machine and analysed by a medical professional to determine whether they’re unusual.

In the medical field, EEGs are typically carried out by a highly trained specialist known as a clinical neurophysiologist.

An EEG can be used to help diagnose and monitor a number of conditions that affect the brain.

It may help identify the cause of certain symptoms, such as seizures or memory problems.

More recently, technology companies have used the technique to create brain-computer interfaces, sometimes referred to as ‘mind-reading’ devices.

This has led to the creation of a number of futuristic-sounding gadgets.

These have ranged from a machine that can decipher words from brainwaves without them being spoken, to a headband that would let computer users open apps using the power of thought. 

‘Keeping track of patterns is not unique to humans — many animals learn from such regularities in the surrounding world; this is called statistical learning,’ explains paper author and ethologist Marianna Boros of the Eötvös Loránd University.

‘What makes speech special is that its efficient processing requires complex computations. To learn new words from continuous speech, it is not enough to count how often certain syllables occur together.

‘It is much more efficient to calculate how probable it is that those syllables occur together.

‘This is exactly how humans, even 8-month-old infants, solve the seemingly difficult task of word segmentation — they calculate complex statistics about the probability of one syllable following the other.

‘Until now we did not know if any other mammal can also use such complex computations to extract words from speech. We decided to test family dogs’ brain capacities for statistical learning from speech.

‘Dogs are the earliest domesticated animal species and probably the one we speak most often to. Still, we know very little about the neural processes underlying their word learning capacities.’
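
To make the kind of computation Dr Boros describes more concrete, the minimal Python sketch below estimates forward transitional probabilities between adjacent syllables in a continuous stream and places a word boundary wherever that probability dips. The made-up syllable ‘words’, the example stream and the 0.75 threshold are illustrative assumptions only; they are not the stimuli or the analysis used in the study.

    from collections import Counter

    def transitional_probabilities(syllables):
        # P(next | current) for each adjacent syllable pair in the stream.
        pair_counts = Counter(zip(syllables, syllables[1:]))
        first_counts = Counter(syllables[:-1])
        return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

    def segment(syllables, tps, threshold=0.75):
        # Place a word boundary wherever the forward transitional probability dips.
        words, current = [], [syllables[0]]
        for a, b in zip(syllables, syllables[1:]):
            if tps[(a, b)] < threshold:
                words.append(''.join(current))
                current = []
            current.append(b)
        words.append(''.join(current))
        return words

    # An invented continuous stream built from four made-up three-syllable 'words'.
    stream = ('bi da ku pa do ti go la bu tu pi ro pa do ti bi da ku '
              'tu pi ro go la bu bi da ku pa do ti tu pi ro go la bu').split()
    tps = transitional_probabilities(stream)
    print(segment(stream, tps))
    # ['bidaku', 'padoti', 'golabu', 'tupiro', 'padoti', 'bidaku', ...]

The point of the conditional probability is that it separates syllable pairs belonging to the same word from pairs that merely happen to meet at a word boundary, even when their raw counts are similar.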

In the study, the researchers measured dogs’ electrical brain activity using electroencephalography (EEG). 

The scans revealed key differences in dogs’ brain waves for frequent and rare words. 

Lilla Magyari, an author of the study, explained: ‘We saw differences in dogs’ brain waves for frequent compared to rare words.

‘But even more surprisingly, we also saw brain wave differences for syllables that always occurred together compared to syllables that only occasionally did, even if total frequencies were the same. 

‘So it turns out that dogs keep track not only of simple statistics (the number of times a word occurs) but also of complex statistics (the probability that a word’s syllables occur together).

‘This has never been seen in other non-human mammals before. It is exactly the kind of complex statistics human infants use to extract words from continuous speech.’
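
A hypothetical pair of counts, not taken from the study, shows how the two statistics can come apart even when the raw frequencies match:

    # Suppose the syllable pairs 'bi-da' and 'go-la' each occur 30 times in a
    # stream, so their simple frequencies are identical. But 'bi' occurs only
    # 30 times overall, while 'go' occurs 90 times.
    tp_bi_da = 30 / 30   # 1.0  -> 'da' always follows 'bi': likely inside a word
    tp_go_la = 30 / 90   # 0.33 -> 'la' follows 'go' only occasionally: likely a boundary
    print(tp_bi_da, round(tp_go_la, 2))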

Next, the researchers used functional MRI scanning to explore how similar the brain regions responsible for this complex computational capacity in dogs are to those in the human brain. 

In the study, the researchers measured dogs’ electrical brain activity using electroencephalography (EEG)

As with the EEG scans, the tests were performed on awake, cooperating, unrestrained animals, although the dogs involved in the fMRI experiments were previously trained to lie motionless for the duration of the scans. 

‘We know that, in humans, both general learning-related and language-related brain regions participate in this process. And we found the same duality in dogs,’ explained Dr Boros.

‘Both a generalist and a specialist brain region [the basal ganglia and auditory cortex, respectively] seemed to be involved in statistical learning from speech, but the activation patterns were different in the two. 

The researchers used functional MRI scanning to explore how similar the brain regions responsible for this complex computational capacity in dogs are to those in the human brain

As with the EEG scans, the tests were performed on awake, cooperating, unrestrained animals, although the dogs involved in the fMRI experiments were previously trained to lie motionless for the duration of the scans

‘The generalist brain region responded more strongly to a random speech stream (where no words could be spotted using syllable statistics) than to a structured speech stream (where words were easy to spot just by computing syllable statistics).

‘The specialist brain region showed a different pattern: here we saw brain activity increase over time for the structured but not for the random speech stream.

‘We believe that this activity increase is the trace word learning leaves on the auditory cortex.’
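
To give a rough sense of what ‘structured’ and ‘random’ speech streams mean here, the sketch below builds both kinds from an invented syllable inventory. The word list, stream lengths and seeds are illustrative assumptions rather than the actual stimuli played to the dogs.

    import random

    # Invented inventory of made-up three-syllable 'words' (illustration only).
    WORDS = ['bidaku', 'padoti', 'golabu', 'tupiro']
    SYLLABLES = ['bi', 'da', 'ku', 'pa', 'do', 'ti', 'go', 'la', 'bu', 'tu', 'pi', 'ro']

    def structured_stream(n_words=100, seed=0):
        # Concatenate whole words: within-word syllable transitions are highly
        # predictable, so the words can be found from syllable statistics alone.
        rng = random.Random(seed)
        return ''.join(rng.choice(WORDS) for _ in range(n_words))

    def random_stream(n_syllables=300, seed=0):
        # Concatenate syllables at random: no transition is more predictable
        # than any other, so there are no words to extract.
        rng = random.Random(seed)
        return ''.join(rng.choice(SYLLABLES) for _ in range(n_syllables))

In a structured stream of this kind, syllable statistics are enough to recover the words; in the random stream there is nothing to recover, which is the contrast the two brain regions appear to track.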

When we are infants, we learn to spot new words in a stream of speech first, before we actually learn what each individual word means. To tell where each word ends and another begins, babies use complex calculations that keep track of which syllables appear together — and thus likely form words

Overall, the findings suggest that the neural processes known to be key for human language acquisition may not be unique to humans after all, according to the researchers. 

Attila Andics, an author of the study, added: ‘But we still don’t know how these human-analogue brain mechanisms for word learning emerged in dogs.

‘Do they reflect skills that developed by living in a language-rich environment, or during the thousands of years of domestication, or do they represent an ancient mammalian capacity?

‘By studying speech processing in dogs, or even better in dog breeds with different communication abilities, and in other species living close to humans, we can trace back the origins of human specializations for speech perception.’

The full findings of the study were published in the journal Current Biology.

DOGS FIRST BECAME DOMESTICATED ABOUT 20,000 TO 40,000 YEARS AGO

A genetic analysis of the world’s oldest known dog remains revealed that dogs were domesticated in a single event by humans living in Eurasia, around 20,000 to 40,000 years ago.

Dr Krishna Veeramah, an assistant professor in evolution at Stony Brook University, told MailOnline: ‘The process of dog domestication would have been a very complex process, involving a number of generations where signature dog traits evolved gradually.

‘The current hypothesis is that the domestication of dogs likely arose passively, with a population of wolves somewhere in the world living on the outskirts of hunter-gatherer camps feeding off refuse created by the humans.

‘Those wolves that were tamer and less aggressive would have been more successful at this, and while the humans did not initially gain any kind of benefit from this process, over time they would have developed some kind of symbiotic [mutually beneficial] relationship with these animals, eventually evolving into the dogs we see today.’

This post first appeared on Dailymail.co.uk
