
Here are the basic facts that anyone with an internet connection has gathered over the past week: Facebook did a Bad Thing with its user data (again), a political data mining company called Cambridge Analytica was involved, and its vaguely sinister machinations may have somehow led to Donald Trump being elected president.
Like you, dear reader, I have been on Facebook for a long time, and I recall when Facebook redesigns garnered more backlash than privacy transgressions. But the Cambridge Analytica story has riled journalists, users, and (most worryingly for Facebook) lawmakers in a way previous controversies did not. The story goes back to 2014, when a psychology professor built a Facebook quiz app that allowed him to surreptitiously harvest the data of 50 million users without their permission, then sold the data to Cambridge Analytica. Or you could argue that the story begins in 2007, when Facebook opened its platform to developers and allowed them to gather data about both the people who used the apps and those users’ networks of friends. Or maybe it goes back to 2004, when Mark Zuckerberg was first building Facebook in his dorm room and allegedly said his classmates were “dumb fucks” for trusting him with their data. The reason this story feels impossible to fully comprehend is that it represents all the anxieties about privacy that Facebook has built up in its 14-year history.
But we’re going to try to understand what’s going on. We can make it there together. Here’s what you really need to read to get up to speed:
If you need a big-picture primer on what went down: In clear terms, Wired walks through how a researcher named Aleksandr Kogan developed a personality quiz app used by 270,000 Americans on Facebook. Because of Facebook’s privacy permissions at the time, Kogan was also able to access the data of all these users’ friends, totaling an estimated 50 million people. He then sold the data to Cambridge Analytica, an analytics firm that worked on Donald Trump’s presidential campaign. The sale of the data violated Facebook’s policies, but the harvesting of it did not.
If you just want to know if this means Facebook got Trump elected: The answer is probably not—at least, not because Cambridge Analytica made off with this data stash. Read The Verge’s critique of “psychographic profiling,” the data-mining buzzword that Cambridge Analytica claims allows it to target voters based on their personality traits (as opposed to traditional demographic targeting). After you understand the theory, read this 2017 investigation by The New York Times claiming that psychographic profiling wasn’t used in the Trump campaign and that it was ineffective when used by Ted Cruz’s team in his failed presidential bid.
If you really have to be the “well, actually” guy, even in 2018: Read this Associated Press report on how Barack Obama’s campaign used a Facebook app to mine user data in the 2012 presidential race. Obama staffers say their actions were transparent because users knew they were using a political app, and because the campaign didn’t directly target the friends of the people using the app.
If you want to know how this fits into your overarching Russian conspiracy theory: Vox has a handy explainer about Cambridge Analytica’s ties to the Kremlin (they’re ambiguous, but various U.S. government investigations into Russia’s role in the 2016 election are ongoing). And in an in-depth interview with Wired, Zuckerberg couldn’t rule out the chance that the Facebook data bought by Cambridge Analytica ultimately wound up in the hands of Russian operatives.
If you want to know about the pink-haired dude at the center of this mess: Read this Washington Post profile of Christopher Wylie, the Cambridge Analytica insider who revealed the organization’s misdeeds to a variety of news outlets. Wylie talks like a stoned college sophomore who just deleted his Facebook account (beware “Steve Bannon’s psychological warfare mindfuck tool”), which explains why his colorful quotes helped this story go viral. But a BuzzFeed investigation shows that Wylie was eager to develop the targeting tools at the center of the current controversy and sell them to the highest bidder. “We will cleanse our souls with other projects, like using the data for good rather than evil,” he wrote in 2013. “But evil pays more.”
If you want to know how Facebook plans to respond: Read my colleague Alyssa Bereznak’s explanation of Facebook’s three-pronged plan to ensure that this kind of data misuse doesn’t happen in the future.
If you want the greatest hits from Zuck’s apology tour: On Wednesday Zuckerberg did lengthy interviews with Wired, CNN, Recode, and The New York Times (and of course, wrote a Facebook post). Read the Wired one for the best big-picture sense of how this incident is changing Zuckerberg’s thinking, and the Recode one to enjoy watching Zuck squirm around tough questions. Overall the interviews paint Zuckerberg as someone blindsided by the fact that his idealistic vision of connecting the world (via the largest data-mining platform ever conceived by man) has led to negative outcomes. “If you had asked me, when I got started with Facebook, if one of the central things I’d need to work on now is preventing governments from interfering in each other’s elections, there’s no way I thought that’s what I’d be doing, if we talked in 2004 in my dorm room,” he told the Times.
If you want to know how the federal government is responding: Keep an eye on the Twitter feeds of Mark Warner and Amy Klobuchar, two U.S. senators who are backing a bill to require more transparency in online political ads. (Arizona Senator John McCain is also a sponsor.) This week both officials called for Zuckerberg to testify before Congress (Zuck said he’s open to the idea). The FTC has also opened an investigation into the Cambridge Analytica scandal, but honestly its Twitter feed is super boring.
If you’ve absorbed this huge amount of knowledge, conducted a sober assessment of your digital life, and decided that, yes, it really is time to delete Facebook: The Verge has a guide to getting off the social network (surprise—it’s super annoying). But also read my colleague Kate Knibbs’s story about the limited efficacy of consumer boycotts. And if you’re not ready to push the big red button, at least follow BuzzFeed’s instructions for limiting app developers’ access to your data.
If you want to remember how naive we all were about what Facebook would become: Check out this 2007 Fortune story about the company’s plan to become a platform that would share user data with third-party developers—the exact functionality that Cambridge Analytica abused to ignite this controversy. Zuckerberg, then 23, said that Facebook’s new open-platform initiative would become “the most powerful distribution mechanism that’s been created in a generation.” The Fortune writer’s take? “He may be crazy, but then again, he is something of a boy wonder.” Only Zuckerberg really saw the potential of what he was building—and even he couldn’t predict the destructive chain reaction.