RESEARCHERS have discovered that they can create fake memories using artificial intelligence-generated deepfakes.

Deepfakes are synthetic media that have been digitally altered to replace one person’s face with that of another.

Researchers have uncovered that they can create fake memories. Credit: Getty

Now, in a new study, scientists said they have been able to successfully create “fake memories” using deepfake technology.

In the experiment, 436 participants were shown deepfake clips of fictitious movies.

These included remakes of popular movies such as The Matrix; in the deepfake version, Will Smith starred as Neo.

Another clip showed a fake video of Brad Pitt starring in The Shining – in reality, the role belonged to Jack Nicholson.

After reviewing the clips, participants were asked questions about the movies and asked to rank them.

Shockingly, many subjects identified the deepfake movie remakes as the real versions.

In total, scientists observed a staggering average false memory rate of 49 percent.

What’s more, scientists found that even just simple text descriptions of fictitious events were enough to distort memories.


“Deepfakes were no more effective than simple text descriptions at distorting memory,” the paper reads.

In fact, the researchers noted that many studies have shown that misinformation in non-technical forms, such as simple narratives, is "extremely effective at distorting memory."

The study has prompted concern about deepfake technology among experts; however, the study authors believe more research needs to be done.

“We shouldn’t jump to predictions of dystopian futures based on our fears around emerging technologies,” lead study author Gillian Murphy, a researcher at University College Cork in Ireland, told The Daily Beast.

“Yes there are very real harms posed by deepfakes, but we should always gather evidence for those harms in the first instance, before rushing to solve problems we’ve just assumed might exist.”

In conclusion, the team said that their findings suggest we might be underestimating “how readily our memories can be distorted without any technological input.”

They added that they support “growing calls to understand deepfakes as a cultural technology, where social concerns and fears should be engaged with critically and any interventions or regulations should be evidence-based.”

This post first appeared on Thesun.co.uk
