A company with a nebulous purpose is a dangerous one. Facebook was once easy to understand—it was an online directory for college classmates. Thirteen years later, the same company under the same leadership is embarking on the incomprehensibly complex mission to “bring the world closer together” via a one-size-fits-all-seven-billion-of-us platform mediating commerce, community discussion, entertainment, and politics. There is no precedent in U.S. history for so rapid a leap in corporate ambition, and scant evidence that a 19-year-old Mark Zuckerberg planned for his invention to change the nature of online discourse.
“A lot of people are focused on taking over the world,” Zuckerberg said in a 2005 interview at Facebook’s early, keg-accented office. “I really just want to see everyone focus on college and create a really cool college directory product.” Talk to early Facebook employees now, and they’ll admit that the social network’s master plan for global domination didn’t exist in its early years. “When I first got there, the mission of the company, I don’t think, was very clear,” Leah Pearlman, who joined the company in 2006 and cocreated the Like button, told me in February. “It wasn’t just a college site anymore so we were thinking about that. But the mission didn’t come until a few years after I was there.” Even when Pearlman left the company in 2011, Facebook’s stated purpose on its login page was to “[help] you connect and share with the people in your life.” It wasn’t until early 2013, just months after becoming a publicly traded tech giant, that the company changed its login page text to “Connect with friends and the world around you on Facebook.”
The thread that ties together every iteration of Facebook is growth. Unlike its social-networking precursors like Myspace and AOL Instant Messenger, Facebook has ably adapted to changing times by aggressively redesigning its products, copying competitors, and acquiring emerging threats. Facebook’s greatest skill has always been protecting Facebook.
But the social network’s 2 billion monthly users, $530 billion valuation, and stranglehold on users’ free time (an average of 50 minutes a day) don’t make it automatically well-suited to be the connective tissue of the entire digital landscape. No one, from Zuckerberg on down, could have imagined that a website that got its start as a Hot or Not clone would one day be testifying before Congress over its role in spreading misinformation during a presidential election. We weren’t shepherded into this Facebook-controlled universe by responsible stewards of free speech and the democratic process; we crash-landed here by accident.
Now, faced with the reality that as many as 126 million of its users were exposed to propaganda from Russian operatives between January 2015 and August 2017, Facebook is adapting once again, in order to ensure its survival and avoid onerous federal regulation. On Tuesday and Wednesday, Colin Stretch, Facebook’s general counsel, testified before three different congressional subcommittees on the company’s role in hosting hundreds of accounts and tens of thousands of posts created by the Internet Research Agency, a Russian “troll factory,” with the express intent of sowing discord in the United States before and after the presidential election. Twitter and Google also had lawyers on hand for the congressional grilling, but it was Stretch who took the most heat from lawmakers. “60 percent of the U.S. population uses Facebook,” Senator Richard Burr, chairman of the U.S. Senate Intelligence Committee, said in his opening remarks. “A foreign power using that platform to influence how Americans see and think about one another is as much a public policy issue as it is a national security concern.”
Though the Russian investigation is being viewed through a political prism, Stretch noted that about 90 percent of the ads that Russian operatives bought on Facebook were meant to increase divisiveness over wedge issues in the United States, such as gun control or police brutality, rather than to promote specific political candidates. Burr explained how Facebook’s tools had been used by Russian trolls to engineer a real-world conflict in Houston last year. A fake Facebook account called Heart of Texas organized an anti-Muslim rally outside an Islamic center in the city in May 2016. Another fake account, United Muslims of America, organized a counterprotest. In total, about 10 protesters and 60 counterprotesters showed up for the dueling events, and the conflict was covered by local news outlets. It was a perverse inversion of the rhetoric previously pushed by Facebook that social media has the power to galvanize real-life protest movements such as Occupy Wall Street and the Arab Spring. “Mr. Stretch, you commented yesterday that your company’s goal is bringing people together,” Burr said. “In this case, people were brought together to foment conflict, and Facebook enabled that event to happen. I would say Facebook has failed their goal.”
Other lawmakers also chastised Facebook, as well as Twitter and Google, for allowing Russian posts to slip by undetected. Senator Al Franken wondered why advertisers buying American political messages in rubles (the Russian currency) didn’t raise any red flags. Senator John Kennedy questioned how Facebook could be sure none of its 5 million advertisers were foreign foes. And Senator Mark Warner, a former tech investor himself, fumed over the fact that it took Silicon Valley months to offer the government any data on Russian interference in the election, despite being awash in near-limitless financial resources and unparalleled technological prowess. “Your companies know more about Americans than the United States government does,” Warner said, “and the idea that you had no idea that any of this was happening strains my credibility.”
Stretch was contrite throughout the proceedings (as were the other corporate lawyers), but his remorsefulness came across as a rhetorical tactic rather than a signal that Facebook will be fundamentally changed by its trip to Washington. Again and again, he emphasized Facebook’s important role in driving discussion around politics and divisive social issues. “We create innovative technology that gives people the power to build community and bring the world closer together,” he said in his opening statement. “We have an important role to play in the democratic process.”
Instead of shrinking back from its calamitous role in influencing global politics, Facebook is taking the mistakes it made in 2016 as an opportunity to assert more power. When negative attention surrounding Facebook’s role in the election was first gaining steam at the beginning of the year, Mark Zuckerberg issued a globalist manifesto asserting that Facebook was the next evolutionary organizational structure for human society, following tribes, cities, and nation-states. “There are questions about whether we can make a global community that works for everyone, and whether the path ahead is to connect more or reverse course,” he wrote. “In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.”
The fact that Facebook accepted advertising money from Russian operatives, created ad-targeting systems that helped them rile up specific demographic groups, and designed an overall engagement framework that helps incendiary posts go viral should be a damning sign that Zuckerberg is not the globalist czar to lead us into the utopian digital future. Instead, Facebook will play the Russian controversy off as a bug that can be fixed by more content reviewers or ad disclosures or all-powerful “artificial intelligence.” Throughout his testimony, Stretch reiterated that the primary issue with the Russian posts was that they were “inauthentic,” bought through duplicitous means, rather than racist, misleading, or vitriolic.
The problem with Facebook is bigger than Russia. Look through the ads the Internet Research Agency placed on Facebook and you’ll find a trove of political lies, blood-pumping bromides, and poorly Photoshopped image macros. That’s content that plays well on Facebook whether it was made by a secret army of trolls in St. Petersburg, teenagers in Macedonia, or a professional fake-news writer in Phoenix. The social network is designed to elevate extreme content, the stuff that keeps people glued to their screens.
The key to keeping Facebook safe from foreign propaganda isn’t hard-to-see ad disclaimers or more tools that task users with self-policing a billion-dollar company’s app. It’s creating a platform that encourages thoughtful discussion, deep reading, and less reliance on shallow engagement metrics. It’s creating a platform that does not flatten various types of content for the express purpose of making advertisements appear as native as possible. It’s creating a platform with the goal of maturation rather than relentless, startup-like growth.
But that version of Facebook would be at odds with the one that just posted world-beating earnings and saw its share price hit a new all-time high. Zuckerberg chose to attend his company’s earnings call rather than the congressional hearing, a decision that lawmakers didn’t appreciate. After rattling off the company’s sterling topline results during the call, Zuckerberg noted, “None of that matters if our services are used in a way that doesn’t bring people closer together—or if the foundation of our society is undermined by foreign interference ... I want to be clear about what our priority is: protecting our community is more important than maximizing our profits.”
Zuckerberg believes he can have the world—that he can create a platform that is ubiquitous, powerful, safe, wildly profitable, and a force for good. Because humans are greedy, deceptive, and fallible, that’s probably impossible. And there’s little reason to think that Facebook, in the business of making itself the center of the digital universe at all costs, is the company that will engineer a solution to humanity’s foibles. The only way the social network would ever get close is through radical reinvention or government regulation. After this week’s testimony, reinvention seems unlikely, but regulation feels less impossible than it did when the week began.
“We are not going to go away, gentlemen,” Senator Dianne Feinstein said on Wednesday. “You bear this responsibility. You’ve created these platforms. And now they are being misused. And you have to be the ones to do something about it, or we will.”