RESEARCHERS have uncovered that they can create fake memories via artificial intelligence-generated deepfakes.

Deepfakes are synthetic media that have been digitally altered to replace one person’s face with that of another.


Now, in a new study, scientists said they have been able to successfully create “fake memories” using deepfake technology.

In the experiment, 436 participants were shown deepfake clips of fictitious movies.

These included remakes of popular movies like The Matrix; in the deepfake version, Will Smith starred as Neo.

Another clip showed fake videos of Brad Pitt starring in “The Shining” – in reality, the role belonged to Jack Nicholson.

After reviewing the clips, participants were asked questions about the movies and asked to rank them.

Shockingly, many subjects identified the deepfake movie remakes as the real versions.

In total, scientists observed a staggering average false memory rate of 49 percent.

What’s more, scientists found that even just simple text descriptions of fictitious events were enough to distort memories.


“Deepfakes were no more effective than simple text descriptions at distorting memory,” the paper reads.

In fact, researchers noted that many studies have shown “misinformation in non-technical forms like simple narratives are extremely effective at distorting memory.”

The study has prompted concern about deepfake technology among experts; however, the study authors believe more research needs to be done.

“We shouldn’t jump to predictions of dystopian futures based on our fears around emerging technologies,” lead study author Gillian Murphy, a researcher at University College Cork in Ireland, told The Daily Beast.

“Yes there are very real harms posed by deepfakes, but we should always gather evidence for those harms in the first instance, before rushing to solve problems we’ve just assumed might exist.”

In conclusion, the team said that their findings suggest we might be underestimating “how readily our memories can be distorted without any technological input.”

They added that they support “growing calls to understand deepfakes as a cultural technology, where social concerns and fears should be engaged with critically and any interventions or regulations should be evidence-based.”

