Meta rolled out a new tool on Thursday that stops revenge porn from spreading on Facebook and Instagram, but it requires people to upload their sexually explicit photos and videos to a website.
When someone is concerned their intimate images or videos have been posted or might be posted to either of the social media platforms, they can create a case through a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images.’
Each photo or video uploaded receives a digital fingerprint, or unique hash value, which is used to detect copies that are shared, or that someone attempts to post, on the platforms.
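The fingerprinting idea can be illustrated with a simplified sketch. Production systems use robust perceptual hashes (Meta has open-sourced one called PDQ); the toy 'average hash' below is an illustrative assumption, not the algorithm StopNCII.org actually runs, but it shows the general principle: similar images produce similar fingerprints, so copies can be detected without sharing the image itself.

```python
# Toy "average hash" sketch -- illustrates the fingerprinting idea only,
# NOT the perceptual-hash algorithm StopNCII.org uses in production.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) into 64 bits.

    Each bit records whether a pixel is brighter than the image's mean,
    so minor changes (re-encoding, slight brightness shifts) leave most
    bits intact, while different images produce very different hashes.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

original = [10] * 32 + [200] * 32    # toy "image": dark half, bright half
reuploaded = [12] * 32 + [198] * 32  # same image, slightly re-encoded
unrelated = [10, 200] * 32           # a completely different image

h_orig = average_hash(original)
print(hamming_distance(h_orig, average_hash(reuploaded)))  # 0 -> match
print(hamming_distance(h_orig, average_hash(unrelated)))   # 32 -> no match
```

Only the 64-bit number ever needs to leave the user's device; the platform compares fingerprints, not photos, which is the privacy property Meta describes below.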
The website was created with 50 global partners. Sharing intimate images and videos of yourself with a third-party website may not sit well with many users, but Meta says the partners ‘will not have access to or store copies of the original images.’
‘Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms,’ Antigone Davis, global head of safety for Meta, shared in a blog post.
‘This feature prevents further circulation of that NCII content and keeps those images securely in the possession of the owner.’
DailyMail.com has contacted Meta about the safety of the tool and has yet to receive a response.
StopNCII.org builds on a pilot program launched in 2017 in Australia, when it asked the public for photos of themselves to create hashes that could be used to detect similar images on Facebook and Instagram.
And this foundation is what is being used to stop revenge porn.
Meta originally planned to let people upload their intimate images or videos directly to Facebook to stop them from spreading, but the sensitive media would have been reviewed by human moderators before being converted into unique digital fingerprints, NBC News reports.
Knowing this, the social media firm opted to bring in a third party, StopNCII, which specializes in image-based abuse, online safety and women’s rights.
StopNCII.org is for adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent.
According to a 2019 report from NBC, Meta identifies nearly 500,000 cases of revenge porn every month.
To deal with the influx of revenge porn, Facebook employs a team of 25 people that, working in tandem with an algorithm developed to identify nude images, helps vet reports and take pictures down.
But with StopNCII.org, human review is replaced by hashes that can detect and identify the images – generated after potential victims submit the explicit content through the website.