Fox News has not dramatically fired star anchor Megyn Kelly, yet Facebook’s trending-topics news widget (the company calls it Trending) displayed a story claiming otherwise Sunday night. The story was fake. This wasn’t a prank or a glitch. It was simply Facebook, failing.
Megyn Kelly still has a job, but Facebook fired its Trending editorial staff last week as it overhauled the widget. Trending has devolved from a crappy RSS feed to a word scramble that looks like it was written by Perez Hilton, and that’s when it’s working properly. When it isn’t, we get bogus Megyn Kelly news. How did it get this bad?
It’s a case of overcorrection. This spring, Gizmodo reported on former Facebook workers’ claims that the Trending editorial team could be biased against politically right-wing sources. The report made Facebook look simultaneously incompetent and sneaky. Facebook had presented itself as a neutral technology company that just happened to surface news stories its users were talking about. Yet its former workers claimed that it acted like a liberal newsroom, picking and choosing what to cover. Facebook still denies that it ever had systemic political bias, but it overhauled its approach to Trending last week in response to these allegations, replacing its old editorial team with a hands-off review staff. It also axed the short descriptions of the news that used to accompany the keywords it dredged up.
Facebook’s new review unit does not add context with captions or decide which news items should be amplified or ignored. Instead, it checks that each news item is tied to a “real-world event.” The items themselves are chosen by algorithm. “Making these changes to the product allows our team to make fewer individual decisions about topics,” Facebook said, as though allowing its employees to make decisions were obviously the big problem that needed solving.
Facebook published the fake Megyn Kelly trending topic because its review team, following its new guidelines, only checked that posts about Megyn Kelly existed, instead of checking that the posts were true. A source told The Ringer that the topic was marked for re-review Monday morning because its sources were likely inaccurate. (Facebook did not answer The Ringer’s question about exactly how an item gets flagged for re-review.) The details are unclear, but we know this: Facebook promoted fake news as real news after it dumped its human edit team.
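To make that failure mode concrete, here is a toy sketch of the logic the reporting describes. None of this is Facebook’s actual code; every name, number, and data point is invented for illustration.

```python
# A toy sketch of the review logic described above -- not Facebook's
# actual pipeline. Every name, number, and data point is invented.

POST_THRESHOLD = 1_000  # made-up cutoff for "enough people are posting"

def count_recent_posts(topic):
    """Stand-in for querying recent public posts about a topic."""
    return 250_000 if topic == "Megyn Kelly" else 0  # fake data

def tied_to_real_world_event(topic):
    # The new review reportedly asks only: are people posting about
    # this right now? A viral hoax passes as easily as real news.
    return count_recent_posts(topic) > POST_THRESHOLD

def editorially_verified(topic):
    # The question the old team could ask -- "is the story true?" --
    # no longer gates what gets promoted.
    raise NotImplementedError("the humans who answered this are gone")

print(tied_to_real_world_event("Megyn Kelly"))  # True, so the hoax trends
```

The check that would have caught the Kelly story is exactly the one that no longer runs.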
In the aftermath of a scandal, Facebook could’ve used its vast resources to improve its editorial team. It could’ve hired fact-checkers to do independent story verification! Facebook could’ve bought The New York Times! Facebook is rich as hell! But the company responded to allegations that its news team was not making the right editorial choices by stripping that news team of the ability to make editorial choices altogether.
The Los Angeles Times has an algorithm that writes breaking news about earthquakes. Maybe it’s not so crazy to imagine that Facebook could write an algorithm that picks out important journalism! Then again, a human at the Los Angeles Times had to decide that earthquakes were worth writing about. Figuring out what matters and figuring out what is true are not simple tasks, and both require human judgment.
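For a sense of how narrow that kind of algorithm is, here is a toy version of the fill-in-the-template trick a quake bot relies on. The wording and data below are invented, not the Times’ actual code.

```python
# A toy quake bot in the spirit of the LA Times' Quakebot. The real
# bot is more careful; this only shows the fill-in-the-template idea.

TEMPLATE = ("A magnitude {mag} earthquake struck {miles} miles from "
            "{place} at {time}, according to the USGS.")

def write_story(quake):
    # Structured data in, sentence out. Nothing in here decides that
    # earthquakes are newsworthy; a human made that call up front.
    return TEMPLATE.format(**quake)

print(write_story({"mag": 4.4, "miles": 2, "place": "Westwood",
                   "time": "6:25 a.m."}))
```

The algorithm works because a person already answered the hard editorial questions. Picking “important journalism” out of everything on Facebook offers no such shortcut.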
I don’t know if there is any good way for Facebook to run the trending-topics feature, but this is a uniquely bad way. This way assumes that an algorithm will be less biased than a person, even though people make algorithms. “Algorithm and data-driven products will always reflect the design choices of the humans who built them, and it’s irresponsible to assume otherwise,” data scientist Fred Benenson told Technical.ly during an interview about Trending.
Facebook’s algorithms are built to make people use Facebook more. They contain no secret wisdom about historical relevance; they have no formula for ferreting meaning out of our daily feeds. “On Facebook the goal is to maximize the amount of engagement you have with the site and keep the site ad-friendly. You can easily click on ‘like,’ for example, but there is not yet a ‘this was a challenging but important story’ button,” sociologist Zeynep Tufekci wrote in a New York Times piece on Facebook’s algorithmic bias.
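In that spirit, here is a toy ranking function that makes Tufekci’s point tangible. Every field, weight, and number below is invented for illustration; this is not Facebook’s formula.

```python
# A toy engagement ranker illustrating Tufekci's point. The signals,
# weights, and numbers are all invented; this is not Facebook's formula.

def engagement_score(post):
    # Only signals the designers chose to collect can be counted.
    # There is no "challenging but important" field, so that quality
    # contributes exactly zero to the ranking, by design.
    return (1.0 * post.get("likes", 0)
            + 2.0 * post.get("shares", 0)
            + 1.5 * post.get("comments", 0))

hoax = {"likes": 90_000, "shares": 40_000}
investigation = {"likes": 3_000, "shares": 800, "comments": 1_200}

# The hoax wins on every signal the system can see.
print(engagement_score(hoax) > engagement_score(investigation))  # True
```

The bias isn’t a bug in the math; it’s in the choice of what the math is allowed to measure.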
If Trending were a widget on Google+ or Ello or some other social network nobody really uses, its aggressive shittiness would be a curiosity worth nothing more than an eye-roll. To be fair, Trending is not an integral Facebook feature even now: when we talk about Facebook’s dominance as a news source, we’re talking about how people read stories that appear in their News Feeds. But Facebook’s remarkably goofy overcorrection here is noteworthy and troubling because the company responded to criticism in the least thoughtful way: The algorithms will fix this!
Looking at Facebook’s Trending today, you’ll see a list of random-seeming phrases chosen by this algorithm. These are the most noteworthy topics picked for me around noon Monday: Will Smith, Spice Girls, McChicken, Los Angeles International Airport, Avicii. Unfortunately, the bots thought that a McDonald’s novelty sandwich was one of the most vital stories for me to see today. And it wasn’t even the McRib.