The company has vowed to address the issues of abuse and foreign interference that affected the 2016 election. It is still figuring out how to do so.

It’s been a weird month—a weird year, technically—for Facebook. A few weeks ago, as the anniversary of President Trump’s inauguration approached, CEO Mark Zuckerberg declared that his annual New Year’s resolution was, to quote Quartz, to “do his job.” Zuckerberg vowed to “focus on fixing” a handful of issues that influenced the 2016 election, and Facebook has since been grinding out a series of announcements that share a tone of mild regret and cautious optimism about solving issues of abuse, belief bubbles, and foreign interference. A company that has historically hidden behind an insistently positive force field of public-relations spin is now in the throes of figuring out a not-horrible way to admit its many, extremely consequential errors.

The most recent update came in the form of a Monday-morning blog post entitled: “Hard Questions: What Effect Does Social Media Have on Democracy?” The answer to that question, as Facebook’s product manager of civic engagement, Samidh Chakrabarti, explains, is that it can enable the spread of false information, be susceptible to foreign interference, and encourage harassment. “If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent—both good and bad,” he writes. “At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy. I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t.” To paraphrase: Facebook is a powerful force capable of perverting the structures that protect our basic human rights, but it can’t guarantee that it won’t destroy us all. And by the way, hope you catch this blog post on our corporate communications page!

Like many self-preserving Silicon Valley behemoths, Facebook has spent years perfecting a disjointed disclosure strategy that parcels out every update as a vaguely worded blog post that the average user probably misses. Even if a user happens to read one, its significance remains intentionally unclear. The effect of these updates is something like assembling the most frustrating 10,000-piece puzzle that has ever been made. If you squint very hard, you can see something like a bigger picture. But it’s impossible to connect the individual parts in any solid, conclusive way.

Which brings us to Facebook’s current plan to fix Facebook. To get any idea of what’s going on, one must collect and arrange a handful of hazy company announcements and news stories. Following Zuckerberg’s vague New Year’s resolution to “fix our issues together,” he told The New York Times that Facebook’s News Feed would be updated to surface content from family and friends and to downplay posts from news sources and businesses. (Most publishers were not thrilled to learn of this change, and one even argued that it would “[hurt] the nation.”) Last Friday afternoon, around the time most people were packing up their things to start their weekend, both Zuckerberg and the head of Facebook’s News Feed, Adam Mosseri, shared a second major update: The News Feed will soon prioritize news from publications based on community ratings of their trustworthiness. We can’t know how users will wield this responsibility, but let’s just remember that time when a group of scientists asked the internet to name their new research vessel, and they wound up with Boaty McBoatface.

Chakrabarti’s most recent assessment is startlingly bleak, but he nevertheless offers a handful of solutions that he and his team are exploring. To address foreign interference, they will confirm the identities of those who paid for election-related ads and place the ads in an archive that will allow people to research where this content is coming from … on Facebook. To fight fake news, they’ve made it “easier” to report offending stories … on Facebook. To prevent echo chambers, they’re testing a feature called “Related Articles” that offers people the opportunity to click through stories with varying perspectives on the news they’re reading about … on Facebook. To address unequal participation, they’ve pioneered new privacy models that they say have encouraged women to speak up … on Facebook. To address almost every topic of concern, Chakrabarti has a consistent prescription: more time engaging with Facebook.

Both in the solutions that Chakrabarti offers and in the announcements that preceded them, Facebook relies on the assumption that user participation is inevitable. In the company’s perfect world, all well-meaning human beings with Facebook profiles—people of different genders, beliefs, religions—have already read these corporate communications and taken note of their new responsibilities as online citizens. In reality, these messages may be fully taken to heart only by journalists, viral content manufacturers whose existence depends on the whims of the Facebook algorithm, and fringe, highly active communities. For the average person swiping through their phone, consumption of media on a social network like Facebook is largely passive—a blur of DIY videos, Trump parodies, Fiona the hippo, and stray hate speech. Facebook fails to recognize just how little attention most normal users pay to its suggestions to investigate the sources of their Facebook content, let alone rank those sources’ trustworthiness.

In many ways, the expectation of equal, well-distributed participation is also the most glaring flaw in our modern democracy. We continue to struggle with skewed congressional district maps, voter suppression, and an election system that favors the Electoral College over the popular vote. But there’s a big difference between the theoretical system of checks and balances that the United States has in place to address these issues and Facebook’s self-improvement process. The former relies on a government funded by taxpayer money and run by representatives who are voted in and out of office every few years. The latter starts with the New Year’s resolution of a billionaire in charge of the world’s most influential social media company, and it trickles down to the public via a handful of blog posts and features that conveniently lead us all back to a Facebook URL. In this online ecosystem, users are expected to self-regulate everything from their behavior to their content consumption—a task as unrealistic as expecting a powerful, for-profit corporation like Facebook to regulate itself.

Facebook may have just begrudgingly admitted that social media has the potential to ruin our democracy, but it still insists it can fix it with more social media. It’s not entirely clear how the company will bring it all together, or whether Facebook can offer assurances stronger than a phrase like “we’ll see if it helps.” But one thing we can all be sure of: More blog posts on the topic are coming soon.
