Deplatforming isn’t about free speech; it’s about stopping the spread of hate

The implications for free speech have worried conservatives and liberals alike. Many have expressed wariness about the power social media companies have to simply oust whoever they deem dangerous. Critics have also pointed out the hypocrisy of platforms that spent years bending over backward to justify not banning Trump despite his posts violating their content guidelines, only to make an about-face during his final weeks in office. Some critics, including Trump himself, have even floated the misleading idea that social media companies might be brought to heel if lawmakers were to alter a fundamental internet law called Section 230, a move that would instead curtail everyone’s internet free speech.

All of these complicated, chaotic arguments have clouded a relatively simple fact: Deplatforming is effective at rousting extremists from mainstream internet spaces. It’s not a violation of the First Amendment. But thanks to Trump and many of his supporters, it has inevitably become a permanent part of the discourse around free speech, social media moderation, and the responsibilities platforms can and should have to control what people do on their sites.

Radical extremists across the political spectrum use social media to spread their messaging, so deplatforming those extremists makes it harder for them to recruit. Deplatforming also decreases their influence; a 2016 study of ISIS deplatforming found, for example, that ISIS “influencers” lost followers and clout as they were forced to bounce from platform to platform. And when was the last time you heard the name Milo Yiannopoulos? After the infamous right-wing instigator was banned from Twitter in 2016 and subsequently from his other social media homes, his influence and notoriety plummeted. Right-wing conspiracy theorist Alex Jones met a similar fate when he and his media network Infowars were deplatformed across social media in 2018.

Remember this troublemaker? Maybe not; deplatforming shut him down right quick.

The quiet of January 20, Inauguration Day, shows what happens when these platforms take even minimal steps to block violent extremists from their services. After the Capitol siege, Facebook finally decided to ban QAnon accounts. It didn’t even ban that many accounts: Reports suggest it restricted around 2,000 Facebook groups and around 10,000 Instagram accounts, barely a dent in its overall user base. Yet scarcely two weeks after removing some of the most obvious bad actors, the “violent insurrectionists” are already reduced to a couple of randos milling around state capitols with arts-and-crafts projects, wondering where the party went.

Facebook and Twitter also removed the most notorious bad actor of all: Trump. The last 10 days have been blissfully devoid of his inane complaints about the election and terrifying love notes to white supremacists. And look what’s happened. The New York Times reports that the Proud Boys, who pledged themselves to “Emperor Trump” not three months ago, are now calling him a “total failure.” Banning Trump from Twitter for only two weeks has already helped cause a rift between the militant forces of white supremacy and the head of the Republican Party.

I will always believe that if Twitter had banned Trump’s account the moment he started lying about the results of the election, five people would not have died in a riot at the Capitol. I will also always believe that if Twitter had banned Trump’s account the moment he started lying about the coronavirus, hundreds of thousands of people might have been saved from this disease. And I will always believe that if Twitter had banned Trump’s account the moment he started lying about Barack Obama’s country of birth, he would never have been president in the first place.
