Matt Taibbi: Racket News
This is a speech given by Matt Taibbi, to our mind America’s best journalist. He was a major player in the spectacular revelations of the Twitter Files, which demonstrated once and for all the clear involvement of governments and intelligence agencies in manipulating social media platforms, not just around Covid but across multiple issues.
The wanton destruction of traditional liberties, including freedom of speech, has become one of the most urgent stories of our times.
Matt Taibbi has become one of the world’s leading figures in standing up to the evils of our times.
You can follow his work at Racket News.
A funny thing happened last night, at a remarkable event in London in celebration of free speech with Russell Brand and Michael Shellenberger. Before the proceedings Michael suggested we give prepared remarks. I wrote a speech, tinkering with it at night on the plane over, then all day after landing. At the event Michael stood before the large crowd and extemporaneously delivered a rousing address. I slid what I wrote under a chair.
Though I did end up mumbling a few things from memory, this is the whole speech, as written:
It’s heartening to see so many faces here in London, to talk about the crisis of free speech around the globe, or to protest censorship, or whatever it is we’re doing exactly. Before we begin, I think it’s important to make a distinction. Unlike Russell and the rest of our hosts, Michael and I, and a few of us in the crowd, are Americans. For us, belief in unfettered free speech is a core part of our character. It’s a big reason that we Americans enjoy the wonderful reputation we do all around the world, especially here in Europe, where (I’m sorry to tell you) we hear you whispering to the restaurant hostess that you’d like to be seated at the table as far away from us as possible.
That was meant to be a laugh line, but in some ways, that’s what the First Amendment of the U.S. Constitution comes down to: the right to be an asshole. We have a prettier way of saying it — a right to petition for a redress of grievances — but it’s the same basic idea.
Isn’t that a beautiful phrase, a redress of grievances? Great, memorable language. Like a lot of Americans, I know the First Amendment by heart. I’ve recited it to myself enough to know it doesn’t say the government gives me the right to speech, assembly, a free press. It says I have those things, already. As a person, as a citizen.
This is a very American thing, the idea that rights aren’t conferred, but a part of us, like our livers, and you can’t take them away without destroying who we are. That’s why in other contexts you’ll hear some of us say things like, “I’ll give you this gun when you pry it from my cold dead hands!”
Some people roll their eyes and think that sounds crazy, but we know that guy actually means it, and to a lot of us it makes sense. We’re touchy about rights, especially about the first ones: speech, assembly, religion, the free press.
But we’re not here tonight to debate the virtues of American speech law versus the European tradition. Instead, Michael and I are here to tell a horror story that concerns people from all countries. Last year, he and I were offered a unique opportunity to look at the internal documentation of Twitter.
I entered that story lugging old-fashioned, legalistic, American views about rights, hoping to answer maybe one or two questions. Had the FBI, for instance, ever told the company what to do in a key speech episode? If so, that would be a First Amendment violation. Big stuff!
But after looking at thousands of emails and Slack chats, I first started to get a headache, then became confused. I realized the old-school Enlightenment-era protections I grew up revering were designed to counter authoritarianism as people understood the concept hundreds of years ago, back in the days of tri-cornered hats and streets lined with horse manure.
What Michael and I were looking at was something new, an Internet-age approach to political control that uses brute digital force to alter reality itself. We certainly saw plenty of examples of censorship and de-platforming and government collaboration in those efforts. However, it's clear that the idea behind this sweeping system of digital surveillance, combined with thousands or even millions of subtle rewards and punishments built into the online experience, is to condition people to censor themselves.
In fact, after enough time online, users will lose both the knowledge and the vocabulary they would need to even have politically dangerous thoughts. What Michael calls the Censorship-Industrial Complex is really just the institutionalization of orthodoxy, a vast, organized effort to narrow our intellectual horizons.
It’s appropriate that we’re here in London speaking about this, because this is the territory of George Orwell, who predicted a lot of what we saw in the Twitter Files with depressing accuracy.
One example stands out.
One of the big themes of 1984 was the reduction of everything to simple binaries. He described a world where "all ambiguities and shades of meaning had been purged," where it wasn't really necessary to have words for both "warm" and "cold," since as he put it, "every word in the language … could be negatived by adding the affix un-."
Let’s not bother with cold, let’s just have unwarm.
A political movement has long been afoot in America and other places to reduce every political question to simple binaries. As Russell knows, current political thought doesn't like the idea that there can be left-neoliberalism over here, and right-Trumpism over there, and then also all sorts of people who are neither – in between, on the peripheries, wherever.
They prefer to look at it as, “Over here are people who are conscientious and believe in science and fairness and democracy and puppies, and then everyone else is a right-winger.” This is how you get people with straight faces calling Russell Brand a right-winger.
But it goes deeper. Michael and I found correspondence inside Twitter about something called the Virality Project, a cross-platform, information-sharing program led by Stanford University through which companies like Google, Twitter, and Facebook shared information about Covid-19.
They compared notes on how to censor or deamplify certain content. The ostensible mission made sense, at least on the surface: it was to combat “misinformation” about the pandemic, and to encourage people to get vaccinated. When we read the communications to and from Stanford, we found shocking passages.
One suggested to Twitter that it should consider as “standard misinformation on your platform… stories of true vaccine side effects… true posts which could fuel hesitancy” as well as “worrisome jokes” or posts about things like “natural immunity” or “vaccinated individuals contracting Covid-19 anyway.”
This is straight out of Orwell. Instead of having “ambiguities” and “shades of meaning” on Covid-19, they reduced everything to a binary: vax and anti-vax.
They eliminated ambiguities by looking into the minds of users. In the Virality Project, if a person told a true story about someone developing myocarditis after getting vaccinated, even if that person was just telling a story – even if they weren't saying, "The shot caused the myocarditis" – the Virality Project just saw a post that might "promote hesitancy."
So, this content was true, but politically categorized as anti-vax, and therefore misinformation – untrue.
A person who talks about being against vaccine passports may express support for the vaccine elsewhere, but the Virality Project believed “concerns” about vaccine passports were driving “a larger anti-vaccination narrative,” so in this way, a pro-vaccine person may be anti-vax. They also wrote that such “concerns” inspired broader discussions “about the loss of rights and freedoms,” also problematic.
Other agencies talked about posts that shared results of Freedom of Information searches on "authoritative health sources" like Dr. Anthony Fauci, or used puns like "Fauxi." The Virality Project frowned on this.
“This continual process of seeding doubt and uncertainty in authoritative voices,” wrote Graphika, in a report sent to Twitter, “leads to a society that finds it too challenging to identify what’s true or false.”
It was the same with someone who shared true research about the efficacy of natural immunity or suggested that the virus came from a lab. It all might be factual, but it was politically inconvenient, something they called “malinformation.” In the end, out of all of these possible beliefs, they derived a 1984 binary: good and ungood.
They also applied the binary to people.
This was new. Old-school speech law punished speech, not the speaker. As a reporter I was trained that if I committed libel, if I wrote something defamatory that caused provable injury to someone, I would have to retract the error, admit it, apologize, and pay damages. All fair! But the court case wouldn't target me as a person. It wouldn't assume that because I was wrong about X, I would also be wrong about Y, and Z.
We saw NGOs and agencies like the FBI or the State Department increasingly targeting speakers, not speech. The Virality Project brought up the cases of people like Robert F. Kennedy, Jr. The posts of such “repeat offenders,” they said, are “almost always reportable.” They encouraged content moderators to make assumptions about people, and not to look on a case-by-case basis. In other words, they saw good and ungood people, and the ungood were “almost always reportable.”
Over and over we saw algorithms trying to electronically score a person's good-or-ungoodness. We found a Twitter report that put both WikiLeaks and Green Party candidate Jill Stein on a Twitter "denylist," a blacklist that makes it harder for people to see or search for your posts. Stein was put on a denylist called is_Russian because an algorithm determined she had too many beliefs that coincided with those of banned people, especially banned Russians.
We saw the same thing in reports from the State Department’s Global Engagement Center. They would identify certain accounts they claimed were Russian operatives, and then identify others as “highly connective” or “Russia-linked,” part of Russia’s “information ecosystem.” This is just a fancy way of saying “guilt by association.” The technique roped in everyone from a Canadian website called Global Research to former Italian Prime Minister Giuseppe Conte, and former Italian Democratic Party Secretary Nicola Zingaretti.
If you apply these techniques fifty million, a hundred million, a billion times, or a billion billion times, people will soon learn to feel how certain accounts are deamplified, and others are not. They will self-sort and self-homogenize.
Even when Twitter doesn't remove an account the FBI recommends for removal, or declines a request passed along from Ukrainian intelligence to remove someone like Grayzone journalist Aaron Mate, users start to be able to guess where that line between good and ungood is.
One last note. As Michael and I found out recently with regard to the viral origin story, things deemed politically good often turn out to be untrue, and things deemed ungood turn out to be true.
I can recite a list if need be, but many news stories that authorities were absolutely sure about yesterday later proved totally incorrect. This is another characteristic Orwell predicted: doublethink.
He defined doublethink as "the act of holding, simultaneously, two opposite, mutually exclusive ideas or opinions and believing in both simultaneously and absolutely."
Not long ago we were told in no uncertain terms the Russians blew up their own Nord Stream pipeline, that they were the only suspect. Today the U.S. government is telling us it has known since last June that Ukrainian forces planned it, with the approval of the highest military officials. But we’re not expected to say anything. We’re expected to forget.
What happens to a society that doesn’t square its mental books when it comes to facts, truth, errors, propaganda and so on? There are only a few options. Some people will do what some of us in this room have done: grow frustrated and angry, mostly in private. Others have tried to protest by frantically cataloging the past.
Most however do what’s easiest for mental survival. They learn to forget. This means living in the present only. Whatever we’re freaking out about today, let’s all do it together. Then when things change tomorrow, let’s not pause to think about the change, let’s just freak out about that new thing. The facts are dead! Long live the new facts!
We’re building a global mass culture that sees everything in black and white, fears difference, and abhors memory. It’s why people can’t read books anymore and why, when they see people like Russell who don’t fit into obvious categories, they don’t know what to do except point and shriek, like extras in Invasion of the Body Snatchers.
We have been complaining about censorship, and it’s important to do that. But they are taking aim at people in a way that will make censorship unnecessary, by building communities of human beings with no memory and monochrome perception. This is more than a speech crisis. It’s a humanity crisis. I hope we’re not too late to fix it.