

YouTube’s Alex Jones Problem

The platform has warned the controversial ‘Infowars’ host time and again. So why won’t it ban him?

YouTube/Ringer illustration

The Alex Jones Channel, which hosts Infowars, is ten years old. The YouTube channel is only three years younger than YouTube itself. And in the past year, the channel’s survival has embarrassed the platform, undermining its content restrictions and underscoring a hesitation among social media websites to moderate hate speech and the alt-right.

In just the past year, Infowars — a raucous, right-wing pulpit for Alex Jones and his supporting cast of “news” anchors — has peddled conspiracy theories about school shootings, child sex trafficking, and a slain Democratic National Committee staffer. On Thursday, YouTube issued a “first strike” against The Alex Jones Channel, removing four unspecified videos and suspending the account’s live-streaming privileges. To be clear, that “first strike” isn’t the first strike YouTube has issued against Jones and his Infowars team. In February, YouTube issued two strikes against The Alex Jones Channel for antagonizing survivors of the Parkland massacre, bringing the channel to the brink of termination.

On Friday, Facebook, too, censured The Alex Jones Channel, removing four videos from the channel’s feed — including footage of an adult male choke-slamming a young boy, billed as a “public service announcement” about “how to prevent liberalism.” Additionally, Facebook announced its decision to ban Alex Jones for 30 days. “Our Community Standards make it clear that we prohibit content that encourages physical harm [bullying], or attacks someone based on their religious affiliation or gender identity [hate speech],” Facebook said. According to a CNET report, Facebook’s most recent strikes against The Alex Jones Channel restrict only Jones, personally, from streaming and posting new content for the next 30 days; his colleagues, who cohost The Alex Jones Show as well as other Infowars.com programming, such as War Room, are able to continue updating the channel. Despite serial infractions, neither Facebook nor YouTube has resolved to ban Infowars and The Alex Jones Channel outright.

Facebook is the largest social media platform in the world, and YouTube is the capital of Infowars viewership. Without Facebook and YouTube, Infowars doesn’t exist.

But YouTube won’t just ban Infowars. In general, social media companies have proved reluctant to punish prominent right-wing figures for hate speech and other offensive content, fearful of criticism and reprisals from right-wing trolls, including the president himself.

Ostensibly, there’s recourse. In YouTube parlance, a “strike” is an official censure against a user, such as Jones, who publishes “hateful content,” “harmful or dangerous content,” or “violent or graphic content,” all of which are prohibited by the website’s “Community Guidelines.” There are two kinds of strikes: a “copyright strike,” which signals copyright infringement, and a “community guidelines strike,” which signals extremely obscene or unlawful depictions. In either case, three strikes get a user banned, and their account deleted, from YouTube—but only if YouTube issues each strike within three months of a previous strike. For community guidelines violations, each strike is expunged after three months. The latest strike against The Alex Jones Channel comes five months after the February 2018 strikes, and so, technically, the account’s third strike this year carries the punitive weight of a “first” strike. YouTube removed four offending videos and limited the channel’s ability to livestream.
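The arithmetic of that policy can be sketched in a few lines of Python. This is a minimal illustration of the expiry logic as the article describes it, not YouTube’s actual implementation: the 90-day window stands in for “three months,” and the specific strike dates below are assumptions, since the piece gives only the months involved.

```python
from datetime import date, timedelta

STRIKE_WINDOW = timedelta(days=90)  # "three months," approximated as 90 days
BAN_THRESHOLD = 3

def active_strikes(strike_dates, today):
    """Count community-guidelines strikes that have not yet aged out."""
    return sum(1 for d in strike_dates if today - d < STRIKE_WINDOW)

def is_banned(strike_dates, today):
    """Three unexpired strikes trigger account termination."""
    return active_strikes(strike_dates, today) >= BAN_THRESHOLD

# Two strikes in February 2018, a third in late July (dates illustrative):
strikes = [date(2018, 2, 20), date(2018, 2, 27), date(2018, 7, 26)]

# By late July the February strikes have expired, so the channel sits at
# one active strike and survives — the "first strike" the article describes.
print(active_strikes(strikes, date(2018, 7, 27)))
print(is_banned(strikes, date(2018, 7, 27)))
```

Under this accounting, a channel can absorb a violation every few months indefinitely, which is precisely the loophole The Alex Jones Channel has occupied.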

The livestream ban is a substantial, if temporary, penalty since The Alex Jones Channel typically airs more than 20 hours of live programming, including Infowars, The Alex Jones Show, and a Nightly News show, seven days a week. But YouTube has yet to explain why it doesn’t just ban The Alex Jones Channel, which programmatically releases content at odds with the website’s community guidelines. In response to a query about YouTube’s discretion in moderating offensive accounts, such as The Alex Jones Channel, a spokesperson for Google—which owns YouTube—referred The Ringer to the standard community guidelines language that outlines the three-month, three-strikes system.

The major social media platforms—including YouTube, Facebook, Twitter, and Reddit—have shown overwhelming hesitation to moderate right-wing hate speech and conspiracy-mongering. In fact, the alt-right has largely preempted moderation efforts by dramatically exaggerating the degree to which prominent right-wing figures are moderated already, thus displacing fears about real neo-Nazis with fears about illusory censorship. On Wednesday, right-wing agitators seized on a Vice News report about “shadow banning,” a term meant to describe the process by which a web moderator might quietly restrict a troublesome account’s exposure to other users on a platform. In this case, right-wing figures—including Donald Trump—insist that Twitter has “shadow banned” some exceptionally popular right-wing Twitter users, including the Infowars.com editor Paul Joseph Watson. “Twitter ‘SHADOW BANNING’ prominent Republicans. Not good,” Trump tweeted on Thursday. “We will look into this discriminatory and illegal practice at once! Many complaints.” (On Thursday, a Twitter product lead denied any “shadow banning,” attributing the limited visibility of certain users in native search results to a glitch.)

The “shadow ban” backlash is a classic example of right-wing activists claiming marginalization in media, including social media, in order to discourage moderation of even their most extreme speech. On YouTube, Twitter, Facebook, and Reddit, this speech often includes conspiracy theories, harassment campaigns, and neo-Nazi mobilization. It’s not only repugnant speech—though the platforms and the offenders, together, make vague gestures toward the First Amendment. In many cases, The Alex Jones Channel broadcasts political rhetoric so wild and spiteful, to an audience so demonstrably rabid, that it becomes tough to conceive of YouTube as a website with any sort of “hateful content” prohibitions. Indeed, YouTube is a crucial hub for alt-right thinkers to broadcast, interact, and cultivate large followings. The fact that conspiracy theorists and white nationalists have made YouTube their home suggests some fundamental oversight, endemic to the platform, which Jones has eagerly exploited for a decade and counting.

Infowars is, indisputably, beyond rehabilitation. On Tuesday, the media watchdog website Right Wing Watch, which focuses on conservative organizations, published a report about the new Infowars cohost, Jake Lloyd. For the past couple of months, Lloyd has hosted several white nationalists as “experts” on The Alex Jones Channel. On Twitter, Lloyd plays dumb about his association with neo-Nazis, but on YouTube—through The Alex Jones Channel as well as his personal account—he’s made his sympathies clear: Lloyd means to use YouTube to promote white nationalism. It’s YouTube’s content, whether YouTube likes it or not. It’s YouTube’s problem, whether YouTube likes it or not.