Facebook created an oversight board to tackle the company’s thorniest content-moderation issues. Photo: Josh Edelson/Agence France-Presse/Getty Images

Facebook Inc.’s independent content-oversight board issued its first five rulings Thursday, overturning the company’s decisions in four cases where it found Facebook had unfairly infringed upon users’ speech on the platform or misapplied “vague” rules on content that could cause imminent harm.

Among the board’s decisions were a determination that Facebook’s algorithms were wrong to remove a post about breast cancer identification that featured a woman’s nipple, and a finding that Facebook had been too strict in removing a French user’s post praising hydroxychloroquine, a once widely discussed treatment for Covid-19 that medical authorities have generally found not to be effective.

The board, funded by Facebook through an endowment, was created to tackle the company’s thorniest content-moderation issues, and Facebook has pledged to abide by the panel’s decisions. The group, whose 20 members include journalists, lawyers and former politicians from around the world, is also set to determine later this year whether Facebook erred in suspending former President Donald Trump from its platform.

So far, the panel has been given the ability only to determine whether content that has been taken down should be restored, not whether Facebook should remove posts or videos that remain live.

Board member Helle Thorning-Schmidt, shown in 2019, pointed to pitfalls in Facebook’s use of algorithms to police content. Photo: Yannis Kolesidis/Shutterstock

On Thursday, Helle Thorning-Schmidt, the former prime minister of Denmark and a member of the board, said the removal of the post featuring the nipple, which Facebook had reversed on its own after being notified that the board would review it, points to a possible overreliance by Facebook on algorithms to police content.

“It became very clear to us that that was part of the problem—had they had human moderators, I don’t think this would have been taken down from Instagram,” she said. Facebook has recently been moving toward more expansive use of algorithmic content moderation.


Write to Jeff Horwitz at [email protected]


This post first appeared on wsj.com
