EXPOSURE to deepfakes will leave us with a warped sense of reality and the problem’s only getting worse, warns an AI expert.

AI and disinformation expert Wasim Khaled told The U.S. Sun that deepfakes are a so-called Pandora’s box, and the worst is yet to come.

Taylor Swift recently faced a deepfake nightmare when explicit AI-generated images of her went viral on Twitter. Credit: Getty

Khaled is the CEO and co-founder of Blackbird.AI, an AI-driven risk and narrative intelligence platform that’s fighting against disinformation.

“Unfortunately, it’s very much a Pandora’s box situation,” he told us.

“The ability for people to drive this kind of behaviour is going to get easier and cheaper and more realistic.

“Today, the videos are much harder to produce than the images.

“Just like the images used to be harder to do than the text.

“At the speed at which we’re moving, the models get infinitely better every month or every quarter.”

Khaled isn’t alone in these concerns, and other experts agree that the deepfake danger is rising.

“Every day, these tools are becoming cheaper and more accessible to the public.


“We’re seeing a 900% year-over-year increase in deepfake creation, and we estimate that January 2024 already has more deepfakes online than the whole of 2023.

“Eventually, it could be possible that every person may get deepfaked,” Michael Matias, CEO and co-founder of Clarity, told The U.S. Sun.

One of Khaled’s biggest concerns with deepfakes is the impact they have on the mental health of the individuals affected as well as the people viewing them.

The AI expert said his organisation has dealt with this for some time but used to be predominantly focused on text-based “fabricated realities.”

“That warped people’s sense of reality as well over the past four or five years, and that was just text and reach.

“What we’re adding into text and reach, like we saw with Taylor Swift’s images, is what happens when you insert that highly realistic media content into the reach.

“I mean, for the average person, they’d have a really hard time refuting it wasn’t them.

“Whereas someone who is of her stature, obviously people know it’s not her.”

Khaled is concerned that regular people will struggle to convince others that a deepfaked image wasn’t them, even if they have proof.

He thinks overexposure to deepfakes is something our brains just aren’t ready for, and we might find it hard to determine the truth.

“Could be a leaked video, could be leaked imagery, so if ten people see that they’re going to have an impression of you regardless if you say ‘that wasn’t really me’.

“That’s why the sextortion rings exist, because they know that’s going to have an impact of having believability around it.

“At scale, it’s absolutely going to have an effect on people. It’s just old marketing principles taken into, like, bizarre territory.

“Where it’s like, okay, you see things seven times and it starts to sink in.”

As AI advances, ultra-realistic imagery could make deepfakes much harder to disprove, and harder not to believe.


Khaled added: “Over time, many people are going to 100% believe that thing.

“Even if they’re shown irrefutable proof, they still will believe that thing.”

This post first appeared on Thesun.co.uk
