Facebook changes to News Feed in 2021

Press release by Anna Stepanov, Head of Facebook App Integrity.


  • Facebook is sharing a round-up of the biggest changes and tests we’ve launched this year to give people who use Facebook more control over their News Feed and reduce negative experiences.

  • Facebook continually evaluates the effectiveness of News Feed ranking signals and gives people insight into how content appears in their feeds.

In 2021, Facebook made significant progress in providing greater transparency into how the News Feed ranking process works, what gets distributed, and why.



New features were released to incorporate direct feedback from people who use Facebook by giving them more control over their feeds and reducing negative experiences.

Here are some of the biggest changes and tests Facebook launched this year:

  • February: new tests to reduce political content in News Feed in response to direct feedback

  • March: the launch of Choose Who Can Comment, Favorites, the Feed Filter Bar, and an expansion of Why Am I Seeing This to suggested posts to provide people more context and control over the content they see and share in News Feed

  • April: new tests to incorporate more direct feedback from people who use Facebook about the content they want to see more or less of in their News Feed

  • May: new options for where and how people can choose to see reaction counts on Facebook and Instagram

  • August: our first-ever Widely Viewed Content Report to share what content is seen by the most people in News Feed in the US

  • September: our Content Distribution Guidelines that list content and behaviors that receive reduced distribution in News Feed because they are problematic or otherwise low quality

  • November: our second Widely Viewed Content Report and new ways to make News Feed controls easier to find and use for people and advertisers

Facebook continually evaluates the effectiveness of News Feed ranking signals and updates or removes them when it makes sense.


News Feed uses personalized ranking, which takes into account thousands of unique signals to understand what’s most meaningful to you. The aim isn’t to keep you scrolling on Facebook for hours on end, but to give you an enjoyable experience that you want to return to. It’s not in Facebook's interest to show you hateful or inflammatory content — our advertisers don’t want their ads shown next to it and our users tell us they don’t want it.


Facebook is therefore incentivized to reduce such content. The prevalence of hate speech is now about 0.03% of content viewed, or roughly 3 views for every 10,000, and it is continuing to drop. In the US, nearly 90% of the content people see is from friends, Pages, and groups they follow or are connected to, because of the algorithmic ranking we use.
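The prevalence figure above can be sanity-checked with a small arithmetic sketch. The function name and the conversion are illustrative assumptions, not part of Facebook's published methodology; they simply restate that a percentage of views maps to a count per 10,000 views.

```python
def views_per_10k(prevalence_pct: float) -> float:
    """Convert a prevalence percentage into views per 10,000 content views.

    Illustrative helper (not a Facebook API): prevalence here means the
    share of all content views that are views of violating content.
    """
    return prevalence_pct / 100 * 10_000


# 0.03% of all content views corresponds to about 3 views per 10,000.
print(round(views_per_10k(0.03), 6))
```

Running this confirms that the two figures quoted in the release (0.03% and 3 per 10,000) are the same measurement expressed two ways.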
