DEEPNUDE caused huge concern a few years ago, exposing one of the creepier sides of AI.
The disturbing app was taken down – but clones have since appeared as AI tech has become even better.
What is DeepNude?
DeepNude was a site that could make a person appear naked when a photo of them was uploaded.
It used AI to work out what a person might look like without their clothes on.
The results looked fairly realistic, even though the images were AI-generated fabrications rather than a person's actual body.
Nevertheless, there was huge concern about how the app could be misused.
Why is DeepNude dangerous?
Firstly, DeepNude only worked on photos of women, which was disturbing in itself.
But the greater worry was that pervs could use it to create quick revenge porn – running the software on a woman's photo without her consent.
Why was DeepNude banned?
DeepNude’s creator decided to pull the app in 2019 following a backlash.
A rep tweeted at the time that “the world is not yet ready for DeepNude”.
“We created this project for user’s entertainment a few months ago,” they said.
“We thought we were selling a few sales every month in a controlled manner.”
“Honestly, the app is not that great, it only works with particular photos.
“We never thought it would become viral and we would not be able to control the traffic.”
“Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high.
“We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”
This post first appeared on Thesun.co.uk