Facebook confirmed that it will begin to reduce the political content displayed on the social network, following user complaints that "there is too much political content" in their News Feeds.
The experiment to show less political content on Facebook began in February, when the company announced its intention to scale back its role and explore new ways of ranking these posts in users' feeds using different signals, as well as deciding what strategy to take in the future.
In particular, the platform will no longer rely as heavily on signals that predict the probability that someone will share or comment on a given post based on their previous activity. Instead, it will rely on the interests users express through surveys and other feedback.
"Our ranking algorithm will continue to consider other signals, such as who posted it, when it was posted, and how you previously interacted with that person or Page," said a Facebook spokesperson.
The Facebook algorithm, at the center of the controversy. AP Photo
The change, which the company plans to begin testing in countries outside of the United States, could affect news publishers whose content focuses on politics. Initially, the tests were limited to a small number of Facebook users in Canada, Brazil, Indonesia and the US, but the company has now said it will extend them to new countries: Costa Rica, Sweden, Spain and Ireland, as Facebook reported through an updated statement.
Now, the social network will give less weight to the probability that a user will comment on or share a post, and will focus more on new signals, such as the probability that the user will give negative feedback on content about certain political topics or events appearing in their News Feed.
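The shift described above can be pictured as re-weighting a scoring function: engagement predictions count for less, while survey-expressed interest and predicted negative feedback count for more. The sketch below is purely illustrative; the signal names and weights are hypothetical assumptions, not Facebook's actual (unpublished) ranking model.

```python
# Illustrative sketch only: all signal names and weights are hypothetical,
# not Facebook's real ranking model.

def rank_score(post, old_weights=False):
    """Combine per-post prediction signals into a single ranking score."""
    if old_weights:
        # Earlier approach: share/comment predictions dominate the score.
        weights = {"p_share": 0.4, "p_comment": 0.4,
                   "survey_interest": 0.1, "p_negative_feedback": -0.1}
    else:
        # Adjusted approach: survey feedback and predicted negative
        # reactions weigh more; engagement predictions weigh less.
        weights = {"p_share": 0.1, "p_comment": 0.1,
                   "survey_interest": 0.5, "p_negative_feedback": -0.3}
    return sum(weights[signal] * post[signal] for signal in weights)

# A post predicted to be highly shareable but widely disliked:
post = {"p_share": 0.9, "p_comment": 0.8,
        "survey_interest": 0.2, "p_negative_feedback": 0.7}

# Under the new weighting, such a post ranks lower than it used to.
print(rank_score(post, old_weights=True) > rank_score(post, old_weights=False))
# → True
```

The point of the sketch is only the direction of the change: the same post, with the same underlying predictions, scores lower once negative feedback and survey signals are weighted more heavily than engagement.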
Facebook acknowledged that these changes "will affect content on public topics more widely and publishers may see an impact on their traffic," so the feature will roll out gradually, although the company expects to announce news in the coming months.
Disinformation and the 2020 campaign
Donald Trump, former US president AFP Photo
Facebook was heavily criticized in recent years, especially in 2020, for its role in online political disinformation. Historically, the company has taken a hands-off approach to moderating all kinds of content, in an attempt not to be the "arbiter of truth," as CEO Mark Zuckerberg has put it.
Critics have focused specifically on its algorithm's tendency to push more extreme and partisan content toward people it predicts are likely to interact with it, inducing them to spend more time on the platform.
The top 10 Facebook posts by engagement are often dominated by conservative content, according to data from Facebook-owned CrowdTangle.
Earlier this month, the platform released its first "Widely Viewed Content Report," which ranks popular posts on the site based on what people see in their feeds rather than on engagement. The New York Times later reported that Facebook, fearing criticism, had shelved an earlier version of the report showing that the most-viewed link contained misinformation about the coronavirus.
The measure detailed in the report is not Facebook's first effort to limit the amount of political and potentially divisive content on its platform.
In June 2020, Zuckerberg wrote in a USA Today op-ed that the company would allow users to disable political ads.
Political content on Facebook will be reduced. Reuters photo
“Everybody wants politicians to be held accountable for what they say, and I know a lot of people want us to moderate and remove more of their content,” Zuckerberg wrote.
In January, after the insurrection at the United States Capitol, whose participants had organized in advance on Facebook and other websites, the company said it would stop recommending political groups to users "for the long term."
And in February the company began testing a temporary reduction of political posts in some users' News Feeds in the US and Canada, among other countries. The move, according to Zuckerberg, came because many users of the platform did not want their feeds to consist largely of political content.
But the company did not first confront this problem in 2020. At the end of 2019, Facebook was facing heavy pushback over its policy of not fact-checking political advertising.
"We do not believe, however, that it is an appropriate role for us to arbitrate political debates and prevent a politician's speech from reaching its audience and being the subject of public debate and scrutiny," said Facebook's vice president of global affairs and communications, Nick Clegg, at the time.