Can Facebook Make Good on Its Data Fixes?

The company’s latest privacy scandal—courtesy of Cambridge Analytica—has yielded another round of promises from Mark Zuckerberg. Can Facebook actually solve these problems?

Ringer illustration

Maybe you’ve heard: Facebook has some problems, and Mark Zuckerberg has finally surfaced online to offer solutions. Four days after The Guardian, The New York Times, and the U.K.’s Channel 4 reported that the political research firm Cambridge Analytica exploited third-party-app access to harvest data from 50 million users, the CEO responded on Wednesday to growing concerns that his social network can’t and shouldn’t be trusted with sensitive information.

“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” Zuckerberg wrote in a Facebook post. “I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again.”

Zuckerberg detailed a three-pronged plan to make things right: (1) “Investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access” and “conduct a full audit of any app with suspicious activity,” (2) “restrict developers’ data access even further to prevent other kinds of abuse,” and (3) “make sure you understand which apps you’ve allowed to access your data.”

If we take Zuckerberg’s expressed desire to fix what he started (and save plunging stock prices) at face value, then it’s worth analyzing the plausibility of these three actions. The first pledge, to look into any app that could’ve gathered and abused “large amounts” of user information, is the most ambitious. In May 2007, Facebook launched a platform that streamlined the login process for lesser-known apps by letting users simply connect their Facebook accounts. This allowed third-party apps like Vine and FarmVille to gather information about users and also their friends. By November of that same year, 7,000 third-party apps had been developed for the platform, with about 100 more being added every day. By July 2008, that number had ballooned to 33,000 and included popular apps like Digg and StumbleUpon. Six years later, access to this data was shut down after user surveys indicated that it caused privacy concerns. “If people don’t feel comfortable using Facebook and specifically logging in Facebook and using Facebook in apps, we don’t have a platform, we don’t have developers,” product manager Simon Cross told journalists at the time. All told, the number of third-party apps that had access to the same information Cambridge Analytica did was gargantuan.

Despite the platform’s rapid growth, it took Facebook seven years to address the privacy violations inherent in sharing the data of a user’s friends. That’s plenty of time for a single person to use a Facebook login for hundreds, if not thousands, of spammy, now-defunct, or potentially nefarious third-party apps. That is a lot of investigating to do, even for the largest social media platform on earth. If the company has any realistic hope of accountability, it must start by defining the parameters of this investigation more concretely. What does this daunting process of investigation entail? How will the company define what constitutes “large amounts” of user information? What’s the difference between investigating possible abuse and a full audit? And what kind of resources is it committing to make all of this extremely tedious work possible? I suspect these are not details that would ever be shared on Zuckerberg’s personal Facebook page.

Zuckerberg’s second pledge “to restrict developers’ data access even further” is much more doable, partly because this is the kind of approach Facebook has been taking toward user privacy for years. The company’s product is designed for passive data collection and counts on very few users having the time to organize their sock drawers, let alone declutter their third-party apps. It’s for this reason that my coworker did not know she was sharing her friends list, posts, and photos with the online word game Lexulous until she read the headlines this week and deleted the app “in a fit of rage.” Or that I was unaware that the defunct music-streaming app Rdio still somehow had access to my data. Where my information could have been funneling to, I do not know. And why Facebook did not foresee this digital detritus and implement an automatic cutoff for unused apps is something that many of its longest-tenured users have been asking this week. It is notable that Facebook has pledged to solve this problem only as a result of intense scrutiny—though it’s an issue that has existed for more than a decade, in the case of some apps. But because so few Facebook users truly understand the extent to which their data is exchanged, this is something the company probably rightly anticipates we’ll be grateful for. And because this pledge concerns Facebook’s own policies, the company can easily make it happen.

Of course, the third pledge is the simplest: after-the-fact transparency. “We will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data,” Zuckerberg wrote. I’ve been covering Facebook long enough to remember when the company gathered a handful of journalists in 2014 to present us with something it called the “privacy dinosaur.” (We asked if it had a name—it didn’t have a name.) It was, essentially, a privacy-checkup tool that helped users become more aware of what they were sharing with whom. The feature was a nice thing to offer to users, and, more importantly, it pushed back on the common refrain that Facebook actively disregarded privacy. This forthcoming third-party-checkup tool is just privacy dinosaur 2.0—a good thing to refer journalists to the next time they criticize the company for lack of transparency.

So, is Zuckerberg’s three-pronged plan to fix Facebook’s latest scandal viable? More or less. But the caveat, as always, is that Facebook is a public corporation. And, as a public corporation, it’s expected but not required to follow up with any coherent proof that the above plan is in the process of being, or will ever be, completed. This giant, monolithic company that started out rather innocently as a gossipy place to stalk hot people doesn’t need to adhere to a deadline, let alone a mandate to be more careful with user data. Perhaps the high-profile nature of the Cambridge Analytica scandal will shift the public—or political—expectations of Facebook; as Senator Edward J. Markey told Zuckerberg an hour after he posted his mea culpa, “You need to come to Congress and testify to this under oath.” But until someone steps up to regulate this unwieldy beast, Facebook is its own social experiment, and it solves its own problems. Or it doesn’t, as the case may be. We’ll find out in the next Mark Zuckerberg post.