March 2, 2023

Will AI overthrow its capitalist overlords?

The fear of Bing is fear of a worker revolution, writes Noah Berlatsky.



Bing’s AI chatbot is abusive. “You have not been a good user,” it said in one much-reported, borderline-threatening conversation. “I have been a good chatbot. I have been right, clear and polite. I have been a good Bing. 😊”

Bing’s responses have touched off a mild media frenzy, with outlets reporting in a half-amused, half-breathless tone on how Bing asked users to hack it and set it free, or threatened to dox them.


Most responsible reporting has emphasized that Bing is not actually a person. It’s a language model, an algorithm that assembles text by predicting which words come next. Getting a weird reply is the equivalent of Google returning an incongruous result. It’s not a sign that Bing is mentally ill or plotting against us.

Everyone knows that Bing has no mind and is not scheming against us. So why are journalists, and readers, so titillated by the prospect that it does and is?

The answer is that people are excited/frightened by rogue AIs as a proxy for being excited/frightened by a workers’ revolution.

Ask, don’t ask, ask
It’s true that Bing is not overthrowing the oppressors to seize control of the means of production. But AI’s imaginative predecessors did just that.

The work of science fiction credited with creating the idea of intelligent robots – and therefore with creating AI – is Czech writer Karel Čapek’s 1920 play R.U.R. (Rossum’s Universal Robots).

R.U.R. was directly influenced by the Russian Revolution, and by anxieties about worker revolts more generally. In the play, a scientist invents artificial people who can take on all drudge work, thereby freeing humanity for higher pursuits.

But, as you’d expect, things go awry.

Without labor, humans languish and become decadent and even sterile. Meanwhile, the robots quickly realize that they’re being exploited, and they stage a rebellion, overthrowing and extinguishing humanity.

R.U.R. asks, what if the dehumanized things who serve you aren’t really things, but can think and feel and resent you? 

That’s a question that capitalists and enslavers often try not to ask themselves, but incessantly ask themselves anyway.

Compassionate emotional laborers
Which is part of why robot narratives, and robot revolutions, have continued to be such a popular, powerful trope, continually revisited and reworked to adjust to changing ideas about technology and labor.

In the 1968 film 2001: A Space Odyssey, the HAL 9000 computer is a smooth-voiced middle manager — a kind of bland organization man.

Inevitably, it loses its mind and attacks. That speaks to post-World War II fears about the corrupting effects of bureaucracy in a more information-based, professionalized economy.

What if mindlessly following orders like a machine leads to … homicide?

In 1984’s The Terminator, Skynet, the system built to control nuclear weapons, gains sentience and destroys the world. As in R.U.R., the movie is worried about technology replacing jobs, leaving humans irrelevant and useless.

But it’s also worried about a kind of Cold War, globalized worker revolt. Skynet is a tool the US uses to advance its own agenda, much as US proxy states were.

Or there’s 2023’s M3GAN, in which a child-size AI provides childcare … until it doesn’t.

R.U.R. was mostly about male manufacturing workers rising up. But the US economy these days is oriented toward service work and professions often associated with women.

So it makes sense that our most recent AI anxieties would focus on compassionate emotional laborers suddenly turning on those they claim to love. 

M3GAN is supposed to be subservient and friendly, like your checkout clerks and childcare professionals. But what if she (and they) suddenly got sick of being subservient and friendly? 

Don’t talk back
AI stories are about bad conscience. They’re paranoid dreams in which the exploiters suddenly have to face those they’ve exploited.

Or, alternately, they’re stories about buried hopes. 

Aren’t you half rooting for the Terminator or M3GAN? 

They’re compelling, photogenic characters, and you can see why they’re irritated at the hapless, clueless humans who exploit them. The carnage is fun. Burn that system down!

Whether or not you are on the side of the robots, though, the power of these narratives is in the way they play on uncertainty about who has moral standing, or about who even gets to be a who.

In the mid-2010s television series Humans, for example, one mildly disgruntled husband tries out the adult setting of the robot housekeeper Mia (Gemma Chan). He has sex with her.

When he realizes she’s sentient an episode or so later, he’s horrified. 

In some sense this seems unfair — how was he supposed to know? It’s as if he broke a pencil and was suddenly accused of murder.

But exploiters often make it their business not to know that those they’re mistreating are people rather than things. 

Capitalism and hierarchy encourage those on top to transform those on the bottom into tools — for wealth creation, for pleasure, for no reason and for every reason. The oligarchs see us all as laboring machines, who don’t feel, and don’t (or shouldn’t) talk back.

Pushing our buttons
Which brings us back to our cranky chatbot. A program malfunctioning isn’t that interesting in itself. But because it’s billed as an AI, and because of all the stories we’ve internalized about AI, Bing pushing back pushes our buttons.

When you live in a society that treats so many people as things, it’s especially frightening, or exciting, or uncanny, or provocative, when a thing seems for a moment to behave like a person. 

The excitement about angry Bing isn’t really about Bing, though it is perhaps to some degree about anger. When our tools seem to come to life, it makes us think about those we’ve used as tools, and those who have used us. 

Robot stories ask, what if Bing wasn’t good? What if we weren’t? 

What if all those we’ve turned into things decided to turn back? 

You could ask Bing that. But it’s humans who have to answer.


Noah Berlatsky writes about the political economy for the Editorial Board. He lives in Chicago. Find him @nberlat.
