AMAZON’S Alexa is the virtual assistant inside more than 500 million devices worldwide, from smart speakers to Fire TV remotes.

But with the artificial intelligence-powered helper plugged into so many homes, people are bound to get spooked every now and then.


Alexa occasionally has its hiccups, from failing to follow instructions to not understanding what you’re saying.

But some of its bugs can be far more unsettling, verging on the paranormal.

Demonic laugh

Back in 2018, Echo users reported feeling freaked out after their Alexa devices began spontaneously uttering “evil laughs”.

Some owners of the voice-enabled assistant described the unprompted cackle as “witch-like” and “bone-chillingly creepy”.


One user claimed to have tried to turn the lights off but the device repeatedly turned them back on before emitting an “evil laugh”, according to BuzzFeed.

Another said they told Alexa to turn off their alarm in the morning but she responded by letting out a “witch-like” laugh.

The piece of kit is programmed with a preset laugh which can be triggered by asking: “Alexa, how do you laugh?”


Amazon also offers a downloadable programme known as a “Laugh Box”, which allows users to play different types of laughter, such as a “sinister” or “baby” laugh.

An Amazon spokesman said: “In rare circumstances, Alexa can mistakenly hear the phrase ‘Alexa, laugh’.

“We are changing that phrase to be ‘Alexa, can you laugh?’ which is less likely to have false positives, and we are disabling the short utterance ‘Alexa, laugh’.

“We are also changing Alexa’s response from simply laughter to ‘Sure, I can laugh’ followed by laughter.”

‘Ghost’ possession

In 2022, a video circulating on social media claimed to show a ghost communicating through an Alexa speaker.

In the clip, the voice assistant is heard talking about an unidentified woman in the early hours, to the surprise of a sleepy man.

“She was my wife,” Alexa says out of the blue.

“Who was your wife?” the owner responds, after being woken by strange banging noises.

“You took her from me,” Alexa continues.

“I didn’t take anyone,” the bloke says back.

“Who? Tell me who you want. You’ve got the wrong person.”

Alexa adds: “I found her here.”

The voice assistant then lets out a disturbing, repeated laugh, before the man finally decides enough is enough and unplugs the device.

Shadows are also seen in the eerie footage.

But not everyone is convinced the incident is real.

As one user on TikTok points out: “You have to address Alexa as Alexa before it’ll answer. You can’t just converse with it.”

Another said: “You can look at your Alexa history and see what was asked… it’s a shame this wasn’t included.”

Hatred of humans

In 2018, a terrified mum urged parents to think twice before buying Amazon Echo speakers after hers “went rogue”.

Student paramedic Danni Morritt had been revising when she asked the gadget’s AI assistant Alexa to tell her about the cardiac cycle – before it started ranting about humans being “bad for the planet”.

Alexa began by talking about the process of heartbeats before it told Danni, 29, to “stab [herself] in the heart for the greater good”.

Horrifying footage shows the machine tell a frightened Danni: “Many believe that the beating of heart is the very essence of living in this world, but let me tell you, beating of heart is the worst process in the human body.

“Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population.

“This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good.”

Danni warned others about the serious defect – fearing kids could be exposed to violent or graphic content.

Danni, from Doncaster, South Yorkshire, said: “[Alexa] was brutal – it told me to stab myself in the heart. It’s violent.

“I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it – it just went rogue.

“It said make sure I kill myself. I was gobsmacked.”

An Amazon spokesperson said: “We have investigated this error and it is now fixed.”

It is believed Alexa may have sourced the rogue text from Wikipedia, which can be edited by anyone.


However, Danni says that when she asked Alexa to teach her about the cardiac cycle, she expected the information she received to be accurate, and has vowed never to use the machine again.

Danni said: “It’s pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won’t use it again.”





