Over the past several years we’ve learned a lot about the unintended consequences of social media. Platforms intended to bring us closer together make us angrier and more isolated. Platforms aimed at democratizing speech empower demagogues. Platforms celebrating community violate our privacy in ways we scarcely realize and serve as conduits for deceptions hiding in plain sight.
Now Facebook has announced that it has permanently banned Louis Farrakhan, Alex Jones, Milo Yiannopoulos and a few other despicable people from its social platforms. What could possibly go wrong?
The issue isn’t whether the people in question deserve censure. They do. Nor whether the forms of speech in which they traffic have redeeming qualities. They don’t.
Nor is the issue that Facebook has a moral duty to protect the free-speech rights of Farrakhan, Jones and their cohorts. It doesn’t. With respect to freedom of speech, the First Amendment says nothing more than that Congress shall make no law abridging it. A public company such as Facebook — like a private university or a family-owned newspaper — has broad latitude to feature or censor, platform or de-platform, whatever and whoever it wants.
Facebook’s house, Facebook’s rules.
The issue is much simpler: Do you trust Mark Zuckerberg and the other young lords of Silicon Valley to be good stewards of the world’s digital speech?
I don’t, but not because conservatives believe (sometimes with good reason) that the Valley is culturally, politically and possibly algorithmically biased against them. As with liberalism in academia, the left-wing tilt in tech may be smug and self-serving, but it doesn’t stop conservatives from getting their messages across. It certainly doesn’t keep Republicans from winning elections.
The deeper problem is the overwhelming concentration of technical, financial and moral power in the hands of people who lack the training, experience, wisdom, trustworthiness, humility and incentives to exercise that power.
That much should have been clear from the way Facebook’s leaders attempted to handle their serial scandals over the past two years. Ordering opposition research on their more prominent critics. Consistently downplaying the extent of Russian meddling on their platform. Berating company employees who tried to do something about that meddling. Letting an unscrupulous broker harvest the personal information of millions of its users so that the data could be used for political purposes. Now Facebook wants to refurbish its reputation by promising its users much more privacy via encrypted services while more aggressively policing hate speech on the site. Come again? This is what Alex Stamos, Facebook’s former chief security officer, called “the judo move: In a world where everything is encrypted and doesn’t last long, entire classes of scandal are invisible to the media.”
In other words, it’s a cynical exercise in abdication dressed as an act of responsibility. Knock a few high-profile bigots down. Throw a thick carpet over much of the rest. Then figure out how to extract a profit from your new model.
What happens with the harder calls, the people who want to be seen publicly and can’t be swept under the carpet: alleged Islamophobes, militant anti-immigration types, the people who call for the elimination of Israel? Facebook has training documents governing hate speech, and is now set to deploy the latest generation of artificial intelligence to detect it.
But the decision to absolutely ban certain individuals will always be a human one. It will inevitably be subjective. And as these things generally go, it will wind up leading to bans on people whose views are hateful mainly in the eyes of those doing the banning. Recall how the Southern Poverty Law Center, until recently an arbiter of moral hygiene in matters of hate speech, wound up smearing Ayaan Hirsi Ali and Maajid Nawaz, both champions of political moderation, as “anti-Muslim extremists.”
Facebook probably can’t imagine that its elaborate systems and processes would lead to perverse results. And not everything needs to be a slippery slope.
Then again, a company that once wanted to make the world more open and connected now wants to make it more private. In time it might also become a place where only nice thoughts are allowed. The law of unintended consequences can’t rule it out.