A DANGEROUS type of scam uses artificial intelligence to hoodwink you into handing over money.
Cyber-experts are warning about the sinister scam that can part you from significant sums of cash.
It works using deepfakes – digitally manipulated images, videos and audio clips created using artificial intelligence.
AI apps can create deepfakes that show people in fake situations – or saying things they never said.
Now a cyber-expert at McAfee has published a security memo warning people about two common types of deepfake scams.
“It challenges the long-held axiom that ‘seeing is believing’,” the McAfee cybersecurity expert warned.
“If you can’t quite believe what you see, then what’s real? What’s not?”
The first type of scam sees cybercriminals edit explicit footage and then blackmail innocent people.
They’ll typically ask you to send money or gift cards in exchange for the content staying private.
“No response is the best response to a ransom demand. You’re dealing with a criminal,” the McAfee memo warned.
“Who’s to say they won’t release their fake documents even if you give in to the ransom? Involve law enforcement as soon as a scammer approaches you, and they can help you resolve the issue.”
The other type of scam sees crooks create fake ads and promotional videos using AI deepfake technology.
For instance, McAfee warned that crooks have created fake images of financial experts to trick victims into signing up to dodgy investments.
McAfee is urging users to remember that an ad appearing on a popular app like Facebook or Instagram isn’t necessarily legitimate.
It’s important to treat these videos and photos with caution.
Thankfully, there are three things to look out for when analyzing images, videos and audio clips online.
“To identify a deepfake video or image, check for inconsistent shadows and lighting, face distortions, and people’s hands,” the McAfee expert explained.
“That’s where you’ll most likely spot small details that aren’t quite right.
“Like AI voices, deepfake technology is often accurate, but it’s not perfect.”
This post first appeared on Thesun.co.uk