A COUPLE who rushed to withdraw thousands in cash for their grandson were horrified when they realised they had been duped by a fake voice on the phone.

Ruth Card, from Canada, panicked when she got a call from someone she believed was her grandson Brandon, claiming he was in jail with no wallet and needed cash for bail.

The couple were duped into believing the voice at the end of the phone was their grandson. Stock pic. Credit: Getty

The 73-year-old and her husband Greg, 75, frantically dashed to their closest bank branch in Saskatchewan and took out 3,000 Canadian dollars (£1,831) – the daily maximum.

Needing more cash, they raced to the next branch.

Ruth told the Washington Post: “It was definitely this feeling of fear, that we’ve got to help him right now.”

But after they explained why they needed the money, the manager called them into his office.

He told them another customer had received a similar call, but the voice on the end of the phone had been faked – despite sounding exactly the same.

The stunned couple realised they had been the victims of a cruel hoax using artificial intelligence.

Ruth added: “We were sucked in.

“We were convinced that we were talking to Brandon.”

It comes amid warnings that AI-powered software can recreate voices.

Crooks can imitate the voices of loved ones in a bid to dupe people into sending cash.

They can do this by using so-called deep fake audio technology, which enables criminals to replicate and recreate voices to say anything they want.

If you or your loved ones have posted videos with sound or audio files on social media, the scammers can get hold of them that way.

Or they can go “old school” and simply call up someone to get voice samples from them.

Then the scammer can simply clone the voice using artificial intelligence technology and use it to target the victim's loved ones.

Depending on the quality of the original audio file, it might not sound exactly like your loved one, but it can still be convincing.

This post first appeared on Thesun.co.uk
