How do you fight fake news, misinformation, and sensationalism on Facebook? One way might be to try to somehow distinguish between reputable and sketchy news sources within the news feed. On Friday, Facebook announced that it will attempt just that—fraught as the endeavor is certain to be.
It’s a risky stand for a company that has long resisted any action that could open it to claims of political bias—especially at a time when the credibility of mainstream news sources has become a polarizing issue in itself. And, at least philosophically, it seems like a step in the right direction. Now we have to hope that Facebook’s implementation turns out to be a lot more thoughtful and nuanced than CEO Mark Zuckerberg makes it sound.
Starting next week, the company will test a change to the news feed ranking system that aims to prioritize links to “broadly trusted” sources of news over those from, well, less-broadly trusted sources. And how will it determine that? Here’s how Zuckerberg explained it:
The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you—the community—and have your feedback determine the ranking.
We decided that having the community determine which sources are broadly trusted would be most objective.
Specifically, Facebook will begin asking respondents to its ongoing user surveys whether they are familiar with various news sources and whether they trust them. From that data, it will rank news outlets by the ratio of the number of people who trust a source to the number who are familiar with it.
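In rough terms, the signal Facebook describes is just "trusting respondents divided by familiar respondents." Facebook hasn't published the actual survey questions, sample sizes, or any weighting it applies, so the sketch below is only an illustration of that ratio; the field names and data shapes are my own assumptions.

```python
# Hypothetical sketch of the "broadly trusted" signal Facebook described:
# for each outlet, the share of respondents familiar with it who say they
# trust it. Field names and survey structure are assumptions, not Facebook's
# actual methodology.

def trust_score(responses):
    """responses: list of dicts like {"familiar": bool, "trusts": bool}."""
    familiar = [r for r in responses if r["familiar"]]
    if not familiar:
        return None  # nobody recognizes the outlet, so no meaningful score
    trusting = sum(1 for r in familiar if r["trusts"])
    return trusting / len(familiar)

# Toy example: 80 of 100 respondents know the outlet, 60 of those trust it.
survey = ([{"familiar": True, "trusts": True}] * 60
          + [{"familiar": True, "trusts": False}] * 20
          + [{"familiar": False, "trusts": False}] * 20)
print(trust_score(survey))  # 0.75
```

Even this toy version hints at the obvious weaknesses critics would soon point out: a little-known outlet trusted by a small, devoted audience can score as well as a widely vetted one, and the measure captures perceived trustworthiness, not accuracy.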
This is the first of a series of changes, Facebook said, intended to make sure that the news people see on the social network is “high quality.” This tweak is designed to address trustworthiness; subsequent changes will aim to prioritize news that’s “informative” and “local.”
At first blush, it looks like Facebook is doing exactly what I and other critics have long been calling for it to do: acknowledge that its algorithm plays a crucial role in determining what news people read, and take some responsibility for its profound effects on the media and the spread of information. It’s about time, right?
Except that, based on its announcement, Facebook’s approach to a notoriously difficult problem—figuring out which media to trust—appears to be painfully simplistic and naïve.