Data released by Facebook last fall showed that during one week in October, seven of the 10 most-engaged pages were primarily political, including those of President Donald J. Trump, Fox News, Breitbart and Occupy Democrats.

Three years ago, Facebook said it would scale back how much content from news publishers and brands appeared in users’ feeds, an overhaul that it said put more focus on interaction among friends and family. At the time, Mr. Zuckerberg said he wanted to make sure Facebook’s products were “not just fun, but good for people.” He also said the company would take those actions even if it meant hurting the bottom line.

Still, Facebook users have had no problem finding political content. Nongovernmental organizations and political action committees paid to show millions of Americans highly targeted political advertising in the months before November’s presidential election. Users created vast numbers of private groups to discuss campaign issues, organize protests and support candidates. Until recently, Facebook’s own systems frequently suggested new, different political groups that users could join.

Facebook has backtracked on some of this in recent months. After the polls closed on Election Day, the company shut down the ability to buy new political advertising. And after the deadly Capitol riot on Jan. 6, Mr. Zuckerberg said the company would stop recommending political groups in order to “turn down the temperature” on global conversations.

Under the new test, a machine-learning model will predict the likelihood that a post — whether it’s posted by a major news organization, a political pundit, or your friend or relative — is political. Posts deemed political will appear less often in users’ feeds.
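Facebook has not published details of how this model works or how a prediction translates into ranking. As a rough illustration only, the sketch below assumes a generic classifier and a simple down-weighting rule; every name and number in it (political_probability, POLITICAL_THRESHOLD, DEMOTION_FACTOR, the keyword heuristic) is a hypothetical stand-in, not Facebook's code.

    # Illustrative sketch only -- all names, thresholds, and the toy
    # keyword heuristic are assumptions, not Facebook's actual system.
    from dataclasses import dataclass

    POLITICAL_THRESHOLD = 0.5   # assumed cutoff for treating a post as political
    DEMOTION_FACTOR = 0.3       # assumed down-weighting applied to political posts

    @dataclass
    class Post:
        text: str
        base_score: float  # score the feed-ranking system would otherwise assign

    def political_probability(text: str) -> float:
        """Stand-in for a trained classifier returning P(post is political)."""
        political_terms = {"election", "senate", "congress", "president", "ballot"}
        words = text.lower().split()
        hits = sum(1 for w in words if w.strip(".,!?") in political_terms)
        return min(1.0, hits / 3)  # toy heuristic in place of a real model

    def rank_feed(posts: list[Post]) -> list[Post]:
        """Demote posts the classifier deems political, then sort by score."""
        def adjusted(post: Post) -> float:
            p = political_probability(post.text)
            if p >= POLITICAL_THRESHOLD:
                return post.base_score * DEMOTION_FACTOR
            return post.base_score
        return sorted(posts, key=adjusted, reverse=True)

    feed = [
        Post("Puppy photos from the weekend hike", base_score=0.8),
        Post("The senate vote on the election bill is tomorrow", base_score=0.9),
    ]
    for post in rank_feed(feed):
        print(post.text)

In this toy example the political post starts with the higher base score but is demoted below the non-political one; a production system would use a trained model and far more nuanced ranking signals than a single multiplier.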

It’s unclear how Facebook’s algorithm will define political content, or how significantly the changes will affect people’s feeds. Lauren Svensson, a Facebook spokeswoman, said the company would keep “refining this model during the test period to better identify political content, and we may or may not end up using this method longer term.”

It is also unclear what will happen if Facebook’s tests determine that reducing the political content also reduces people’s use of the site. In the past, the company has shelved or modified algorithm changes that aimed to lower the amount of misleading and divisive content people saw, after determining that the changes caused them to open Facebook less frequently.

Source: This article originally appeared on Nytimes.com.
