Here’s how Facebook’s algorithm works

Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. It’s in the spotlight thanks to waves of revelations from the Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems.

But how exactly does it work, and what makes it so influential?

While the phrase “the algorithm” has taken on sinister, even mythical overtones, it is, at its most basic level, a system that decides a post’s position on the news feed based on predictions about each user’s preferences and tendencies. The details of its design determine what sorts of content thrive on the world’s largest social network, and what types languish — which in turn shapes the posts we all create, and the ways we interact on its platform.

Facebook doesn’t release comprehensive data on the actual proportions of posts in any given user’s feed, or on Facebook as a whole. And each user’s feed is highly personalized to their behaviors. But a combination of internal Facebook documents, publicly available information and conversations with Facebook insiders offers a glimpse into how different approaches to the algorithm can dramatically alter the categories of content that tend to flourish.

When Facebook launched the News Feed, in 2006, it was pretty simple. It showed a personalized list of activity updates from friends, like “Athalie updated her profile picture” and “James joined the San Francisco, CA network.” Most were automatically generated; there was no such thing as a “post,” just third-person status updates, like “Ezra is feeling fine.” Starting in 2009, a relatively straightforward ranking algorithm determined the order of stories for each user, making sure that the juicy stuff — like the news that a friend was “no longer in a relationship” — appeared near the top.

Over the past 12 years, almost everything about the news feed algorithm has changed. But the principle of putting the juicy stuff at the top — or at least the stuff most likely to interest a given user — has remained. The algorithm has simply grown ever more sophisticated; today it can take in more than 10,000 different signals to make its predictions about a user's likelihood of engaging with a single post, according to Jason Hirsch, the company's head of integrity policy.

Yet the news feed ranking system is not a total mystery. Two crucial elements are entirely within the control of Facebook’s human employees, and depend on their ingenuity, their intuition and ultimately their value judgments. Facebook employees decide what data sources the software can draw on in making its predictions. And they decide what its goals should be — that is, what measurable outcomes to maximize for, and the relative importance of each.
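Those two human-controlled levers, which signals feed the predictions and how much each predicted outcome counts, can be sketched in miniature. The sketch below is a toy illustration, not Facebook's actual code: the outcome names, probabilities and weights are invented for the example, and the real system uses machine-learned models over thousands of signals.

```python
# Illustrative sketch of a prediction-based ranking score.
# All outcome names, probabilities and weights here are invented;
# the real system's predictions come from ML models over ~10,000 signals.

# Hypothetical predicted probabilities that a given user takes each
# action on a particular post.
predictions = {
    "like": 0.30,
    "comment": 0.05,
    "share": 0.02,
    "angry_react": 0.01,
}

# Human-chosen weights: the "goals" employees decide to maximize for,
# and the relative importance of each.
weights = {
    "like": 1.0,
    "comment": 15.0,
    "share": 30.0,
    "angry_react": 5.0,
}

def rank_score(predictions, weights):
    """Combine predicted engagement probabilities into one ranking score."""
    return sum(weights[event] * p for event, p in predictions.items())

score = rank_score(predictions, weights)
```

The feed is then ordered by this score, highest first. The key point is that nudging a single weight (say, boosting comments) shifts which posts rise to the top across every user's feed, which is why these value judgments carry so much influence.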

Troves of internal documents have offered new insight into how Facebook makes those critical decisions, and how it thinks about and studies the trade-offs involved. The documents — disclosures made to the U.S. Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel — were obtained and reviewed by a consortium of news organizations, including The Washington Post. They have focused lawmakers’ attention on Facebook’s algorithm and whether it, and similar recommendation algorithms on other platforms, should be regulated.

Defending Facebook’s algorithm, the company’s global affairs chief, Nick Clegg, told ABC’s “This Week” earlier this month that it’s largely a force for good, and that removing algorithmic rankings would result in “more, not less” hate speech and misinformation in people’s feeds.

In its early years, Facebook’s algorithm prioritized signals such as likes, clicks and comments to decide which posts to amplify. Publishers, brands and individual users soon learned how to craft posts and headlines designed to induce likes and clicks, giving rise to what came to be known as “clickbait.” By 2013, upstart publishers such as Upworthy and ViralNova were amassing tens of millions of readers with articles designed specifically to game Facebook’s news feed algorithm.

Facebook realized that users were growing wary of misleading teaser headlines, and the company recalibrated its algorithm in 2014 and 2015 to downgrade clickbait and focus on new metrics, such as the amount of time a user spent reading a story or watching a video, and to incorporate surveys on what content users found most valuable. Around the same time, its executives identified video as a business priority, and used the algorithm to boost “native” videos shared directly to Facebook. By the mid-2010s, the news feed had tilted toward slick, professionally produced content, especially videos that would hold people’s attention.
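The effect of that recalibration can be illustrated with a toy before-and-after comparison. Everything below is invented for the example: the field names, the weights and the sample posts are assumptions, meant only to show how downgrading raw clicks while rewarding dwell time and survey-reported value changes which kind of post wins.

```python
# Toy before/after comparison of the 2014-2015 recalibration described
# above. All numbers, field names and weightings are invented.

def early_score(post):
    # The early ranking leaned on raw engagement counts.
    return post["likes"] + post["clicks"] + post["comments"]

def recalibrated_score(post):
    # The recalibrated ranking downgrades bare clicks and rewards time
    # spent and survey-reported value (a hypothetical 0..1 score).
    return (
        0.05 * post["clicks"]
        + post["likes"]
        + post["comments"]
        + 2.0 * post["seconds_spent"] / 60   # dwell time, in minutes
        + 10.0 * post["survey_value"]
    )

# A teaser headline that draws clicks but little reading time...
clickbait = {"likes": 5, "clicks": 900, "comments": 2,
             "seconds_spent": 8, "survey_value": 0.1}
# ...versus an article people actually spend time with and value.
longread = {"likes": 40, "clicks": 120, "comments": 30,
            "seconds_spent": 240, "survey_value": 0.8}
```

Under the early scoring the clickbait post wins on sheer click volume; under the recalibrated scoring the long read overtakes it, which is the incentive shift the change was designed to produce.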