Facebook will start reducing the amount of political content users see while scrolling their primary feeds.
The social media platform will “temporarily reduce the distribution of political content in News Feed for a small percentage of people” in Brazil, Indonesia and Canada this week, it said in a blog post on Wednesday. The changes will be applied to a limited number of US users in the coming weeks.
“During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward,” Aastha Gupta, product management director at Facebook, wrote in the blog post.
Facebook CEO Mark Zuckerberg hinted at the changes during the company’s earnings call last week. “One of the top pieces of feedback that we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.
The company, which has come under fire for its shortcomings in combating election misinformation and its political ad policies, claims that political content makes up only 6% of what people see on Facebook (FB) in the United States. When asked how it defines political content, Facebook said it will use machine learning models trained “to look for signals of political content and predict whether a post is related to politics.” The test will include news stories about politics as well as political posts by family and friends.
Facebook will exempt Covid-19 information from national and regional health authorities as well as posts by official government agencies from its political content experiment.
“It’s important to note that we’re not removing political content from Facebook altogether,” Gupta wrote. “Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their News Feed.”
– CNN Business’ Kaya Yurieff contributed to this report.