Last week, Facebook users in 14 countries opened their News Feeds to find a cute illustration of a magnifying glass over a newspaper, along with a tagline advertising “Tips for spotting false news.” If they chose to click through — and that’s a big if — they landed at the platform’s historically unsexy Help Center, where the digital media literacy group First Draft had provided 10 bullet points of plain-text advice. Among them were suggestions to “be skeptical,” “investigate the source,” and “look at other reports” when reading the news on Facebook. All of it was helpful common sense, but also tragically boring. It was as if the same tech platform that, for years, has been inventing features like autoplay video, livecasting, and colorful blocky text to improve engagement was suddenly bitter about completing an obligatory homework assignment, and wanted to make its users suffer for it, too.
To give Facebook credit, this wasn’t the first or only thing it had done to show us its measured concern with the spread of fake news. Since CEO Mark Zuckerberg flippantly described the notion of Facebook influencing the outcome of the U.S. election as “a pretty crazy idea” last year, the company has come a long way in demonstrating a commitment to eliminating misleading content in its ecosystem. In January, the company brought on former TV journalist Campbell Brown to form partnerships with the media. It also introduced the Facebook Journalism Project, an initiative that promised to develop new news products, train the public to be more news literate, and create what turned out to be problematic educational tools for journalists. Most notably, in late March it created a partnership with independent fact-checking organizations to flag what it dubbed “disputed” stories. (Though the head of Facebook’s News Feed only recently hinted at making that model more sustainable, saying the company was “open to” paying those fact-checkers itself.) Earlier this month the company became a founding (and funding) father of a news-literacy initiative that will run independently out of City University of New York’s journalism graduate school. Practically any organization that was in a position to critique Facebook’s news-delivery model now seems to be partnering with the company in some capacity.
When it comes to making continual announcements about its progress in halting fake news, Facebook is killing it. And, in theory, having these experts aiding its efforts behind the scenes should be helpful. But amid all these partnerships and experiments, it has become hard to tell what effect, if any, Facebook’s efforts are having on its users. For every headline we’ve read about a new initiative to combat fake news, the measurement tools and statistics to follow up on its success have been absent. First Draft’s tips to avoid spreading false information were harmless, but how many people even clicked through that link atop their News Feed to reach them? If they clicked, how many seconds did they spend on that imageless, videoless, colorless Help Center page? Now that the campaign has ended, why is it so difficult to manually search for those tips on the platform? Facebook might have this information, but it hasn’t shared it with the public at large. So we have no way of knowing whether these campaigns are simply paying lip service to the problem or actively mitigating it.
Tailoring the presentation of a website’s content to maximize user engagement is practically baked into the structure of a company as large as Facebook. As Seth Ashley, an associate professor of communication at Boise State University, notes, it would only be natural to apply that same strategy to the platform’s news literacy campaigns.
“I’m all for them doing A/B product testing,” he said. “A company that big is definitely trying to make sure it knows its market and is reaching it correctly. Even for literacy stuff, what works in one market might not work in a different one. What works for my students might be very different from what works for my mom on Facebook.”
The public may not necessarily be privy to the impact of each campaign to combat fake news, but there is hope that more engaging educational content is on the horizon. The News Literacy Project, which will release its own campaign on the platform in a couple of weeks, has promised its presentation will include “videos and other multimedia elements familiar to Facebook users.” Though the organization has declined to comment on whether they will be specifically tailored to certain demographics of Facebook’s 1.8-billion-user community, its PSA will at least be more eye-catching than what we’ve seen so far.
In order for Facebook to really follow through on its promise to educate users, however, its partners need to leverage the platform’s most popular tools. Sabrina Kizzie, an adjunct lecturer in social media at Baruch College, suggests using Facebook Live to broadcast Q&As with experts or — in the spirit of transparency — introduce the team of experts and engineers Facebook has tasked with addressing the issue.
“You want to provide this information, but you have to consider every chance to make it easy for consumers,” she said. “If you really want to get a message across to consumers, the sense of urgency of starting a Live video in the News Feed will get to them right away. Use the platform.”
Why Facebook hasn’t used the very tools it promotes to the masses for its news-literacy efforts is unclear. Zuckerberg has been historically cagey about his company taking a stand for or against specific news sources to avoid seeming partisan. Passing on this responsibility to a collection of nonprofits provides a layer of protection to the company. Maybe it’s too risky to assign an individual expert to a Live video Q&A, on the off chance that they might — oh, I dunno — criticize a site like Breitbart or discuss the inherent sociological effects that Facebook’s interface has on its users. As Ashley points out, some of the problems in news literacy that he teaches his students about are problems created by Facebook itself.
“Facebook is a business and they want to keep people on their website,” he said. “They’re not likely to do much that runs counter to those goals.”