Guardian exclusive: AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.

These AI tools, developed by large technology companies including Google and Microsoft, are meant to protect users by identifying violent or pornographic visuals so that social media companies can block them before anyone sees them. The companies claim that their AI tools can also detect “raciness”, or how sexually suggestive an image is. Using this classification, platforms – including Instagram and LinkedIn – may suppress contentious imagery.
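The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual API: the tiered likelihood scale and the `should_suppress` threshold function are assumptions modeled on how content-moderation classifiers are commonly reported to score and gate images.

```python
from enum import Enum

# Hypothetical likelihood tiers for a "raciness" classifier
# (an assumption for illustration, not a real vendor's scale).
class Likelihood(Enum):
    VERY_UNLIKELY = 1
    UNLIKELY = 2
    POSSIBLE = 3
    LIKELY = 4
    VERY_LIKELY = 5

def should_suppress(racy: Likelihood,
                    threshold: Likelihood = Likelihood.LIKELY) -> bool:
    """Suppress an image when its raciness score meets the platform's threshold."""
    return racy.value >= threshold.value

# A platform might demote borderline images but let them through:
print(should_suppress(Likelihood.POSSIBLE))     # below the default threshold
print(should_suppress(Likelihood.VERY_LIKELY))  # at or above it
```

The point of the investigation is that the score itself can be biased: if the classifier systematically assigns women's bodies a higher tier for equivalent images, the same fixed threshold suppresses their photos more often.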
