One reason I mostly avoid Facebook is that it’s dull; there's something about that landing page that's empty or generic. I understand what Twitter is and how one might use it; it has a definite form. But Facebook is everything and nothing, an amorphous blob of software somewhere. It's got no angle, no point of view, no personality. Mark Zuckerberg seems the same, and if he made an unlikely hero (way too blank), he makes an even less likely agent of the apocalypse. But he's definitely taking on that role. Facebook, with all its blandness and banality, is liable to take the blame for anything and everything that anyone doesn't prefer: the election of Trump, for example, or the disintegration of our very shared reality itself.
"Facebook Is a Doomsday Machine," reads the headline of a story by the Atlantic's executive editor Adrienne LaFrance. "The architecture of the modern web poses grave threats to humanity. It's not too late to save ourselves," she adds in bold disagreement with the consensus of thinkers who’ve concluded that it’s far too late to save ourselves from Facebook. From Facebook, I want to emphasize. "Megascale is nearly the existential threat that megadeath is," LaFrance declares, transcending in her prose thebounds of meaning. "No single machine should be able to control the fate of the world’s population," she urges, reasonably enough as far as that goes. "Facebook’s megascale gives Zuckerberg an unprecedented degree of influence over the global population."
People have panicked about social media for a long time, but Zuckerberg keeps ticking right along, not getting any poorer. LaFrance and her ilk evidently conclude that their hyperbole has been insufficiently hyperbolic. It's time to take to the street corner with a signboard and wild beard (it might look cool, really, Ms. LaFrance) to declare that the end is nigh. But it's not too late to repent, maybe? Close, though. Repentance in this case will take the form of much, much more government regulation, just as it did for the apostles of our lord and savior Jesus Christ. Wait, is that right?
Being coaxed in this direction, believe it or not, by Adrienne LaFrance, Hillary Clinton has described Facebook as a "Trumpian nation-state," an “authoritarian” and even fascist government. It's one of the things she blames for her loss in 2016, along with "the Russians," who were of course working partly through Facebook.
The analogy of Facebook to a nation-state, I remark, is completely inept and puzzling, except as a little chunk of anti-Zuckerberg verbiage. When Facebook has coercive taxation, weapon systems, and internment facilities, it will be a nation state. Until then, no. LaFrance insists repeatedly that Facebook, for legal purposes, should be treated as a publisher, not a utility (like phone companies, etc.). Neither of those is a nation-state, alright?
But what makes Facebook a fascist nation-state, according to Clinton and LaFrance, is precisely its openness, the lack of constraint and regulation that it imposes on the content that appears on its pages. In evidence of authoritarianism, LaFrance quotes Zuckerberg: "It’s not about saying, Here’s one view; here’s the other side. You should decide where you want to be.” If that's authoritarianism, one might wonder what LaFrance's anti-authoritarianism is going to amount to. I think she means much more government regulation of the Internet. So who's engaging in disinformation?
Trying to figure out what causal role something like Facebook plays in any particular political disaster (or for that matter any inspiring political event, such as the Arab Spring) is not nearly so straightforward as people like LaFrance seem to think. Stipulate that disinformation had a bad effect on event X (the 2016 election, let's say), and that people were exposed to disinformation on Facebook. That’s not at all sufficient to establish Facebook's causal role in X. We'd need to think about how else disinformation has been or might be distributed, before and aside from Facebook. We'd need to think about why people are seeking disinformation, and how they might have gotten it on other platforms or in other mediums. The sorts of propaganda, rumors, conspiracy theories, and unfortunate election outcomes that LaFrance decries long predate Facebook. Maybe Facebook makes disinformation more pervasive. It was always pretty damn pervasive. Maybe Facebook makes manipulation easier. It was always pretty damn easy.
If certain dangers lurk in an unregulated Internet, greater dangers lurk in a regulated one, and the envisioned cure for the nation-state of Facebook is annexation by actual nation-states, that is, by the people who do have the weapons systems and internment facilities. We can see the overwhelming problems this presents live: China fully embodies the alternative to the Internet as a chaotic, commercialized free-for-all.
As China dealt with the early days of COVID, Li Wenliang, a doctor who had tried unsuccessfully to warn authorities of the new illness, died of the disease. There was an outcry of sympathy and outrage on social media that the Chinese government quickly decided must be contained. The Chinese state controls all the social media in the country and maintains that control with tens of thousands of monitors and extremely sophisticated surveillance software.
"They ordered news websites not to issue push notifications alerting readers to his death," reports The New York Times, quoting leaked internal documents. "They told social platforms to gradually remove his name from trending topics pages. And they activated legions of fake online commenters to flood social sites with distracting chatter, stressing the need for discretion: 'As commenters fight to guide public opinion, they must conceal their identity, avoid crude patriotism and sarcastic praise, and be sleek and silent in achieving results.'"
This was all done for admirable purposes, of course: to prevent panic and maintain social cohesion in the face of the emerging health emergency. If you think that someone like LaFrance or Clinton couldn't contemplate such steps, think about the drumbeat of calls to repress criticisms of the American electoral system, which did result in many Tweets, videos, and Facebook posts being removed or given warning labels. There was an immediate widespread call for suppression of certain sorts of criticisms of the government, and it had some effect in controlling such criticisms. Now picture that the people directing the social media platforms are themselves government officials. And suppose that there really are problems with the elections...
The ways that China controls the presentation of the situation in Xinjiang, where they are operating mass re-education camps, or in Hong Kong, where they’ve brutally repressed a rebellion, suggest the ways governments, including our own, can be expected to use the ability to regulate social media fully. Creating ever-more state-regulated social media is unlikely to cure us of disinformation; it's just likely to make the disinformation more univocal, but no less dangerous, putting it mildly. Or as dangerous as Zuckerberg would be if you gave him an army and put Facebook in charge of the prison system.
Facebook isn't really the problem; I think the problem is our permanent desire to manipulate one another and to be manipulated by one another. Politics are no worse now overall than they were before social media, and not nearly as different as people think: everyone was always throwing bullshit around. And I'm less scared of social media platforms in which all sorts of wild things are going on and jive is coming at everyone from every point of view, or even over and over again from their own, than one in which the only view available is the one approved by regulators.
That's the only thing that could make Facebook blander than it is already.
—Follow Crispin Sartwell on Twitter: @CrispinSartwell