
Code Dependence Has a Human Cost and Is Fuelling Technofeudalism

Madhumita Murgia’s new book alerts us to the fact that artificial intelligence affects above all how we relate to ourselves, to each other, and to our societies.
Illustration: Pariplab Chakraborty

In 2023, the former Greek finance minister and economist Yanis Varoufakis put forth a controversial thesis: capitalism, as we knew it, had died. In its place, he argued, we could see the rise of a new, perhaps even more dangerous economic form, which he called technofeudalism. The argument he made in his book was simple – cloud capitalists (a category that includes Big Tech companies like Google, Amazon, Apple and Meta) were no longer capitalists in the strict sense, oriented towards generating profit through commodity production. Rather, they were technofeudalists, charging their vassals, who remain engaged in commodity production, cloud rent for the use of their services.

One way to understand this is the case of food delivery apps like Zomato, which take a percentage cut from the restaurants on their platform. They charge them 'cloud rent' just to be on the app, and it is their algorithm that decides which restaurants a customer sees at the top of the list and which appear at the bottom, effectively dooming the latter.

Code Dependent: Living in the Shadow of AI, Madhumita Murgia, Picador, 2024

If capitalism was commodity-dependent, technofeudalism is code-dependent. Madhumita Murgia's new book, Code Dependent: Living in the Shadow of AI, explores the very human cost of this code-dependence.

This difference between commodity-dependence and code-dependence is important for us to understand. Karl Marx had said that in the world of commodities, relations between people take the form of relations between things. This can be understood if we see that commodities embody value. Any commodity that is purchased is a product of labour; in purchasing it we relate ourselves to the producers of that commodity. In buying rice, I am relating myself to the paddy farmers who grew it, the truckers who transported it, the wholesaler who stored it. But these relationships are not directly visible to us; our social relation takes the form of an objective relation between things, between the money in my pocket and the rice in the shiny aisles of the supermarket. Our relations with each other are mediated by commodities and the general equivalent for all commodities which is money.

With code-dependence, relations between people take the fantastic form of relations between data. Value resides no longer in the commodity but in data. Unlike the commodity, which is a product of labour, data is produced by giving cloud capital access to our thoughts, conversations, preferences, locations, ideas and moods: in essence, our entire life. We produce data actively, by posting on social media, creating content, clicking on ads, or at work. We also produce it passively, when we are listening to music, browsing the web, visiting our doctor, or simply walking around with our phones in our pockets. It is in this data that value inheres, rather than in the lives which produce it. The process of producing value now expands from labour to our lives as such. Just as labour becomes invisible, embodied in the commodity, life becomes invisible, embedded as data. If money was the general equivalent for all commodities, then algorithmic code plays an analogous role for the data we produce. In this way the transformation from commodity-dependence to code-dependence is brought about.

Earlier, labour was exploited and alienated from what it produced: the commodity. That still continues, but it is now supplemented, and governed, by the exploitation of life as such, alienated from what it produces: data. What happens when our lives become defined by the data they produce? What happens when our relations with each other are mediated by data and by the code that functions as its general equivalent? In what ways do our relations with ourselves and our societies start to be transformed?

Through various interviews and encounters with gig workers, data entry operators, bureaucrats enamoured of AI, social workers contemptuous of it, lawyers fighting back against it, and even a consortium of multi-faith priests, Murgia's book seeks to lay out what this code-dependence means for us and how it affects us at every scale of our lives. She comprehensively covers the way it transforms our relationships with our jobs, our bodies, our identities, our governments, our laws and our societies. Murgia's strength is as a journalist (she is the artificial intelligence editor at the Financial Times), and she quite deftly weaves together stories from the people she meets to paint a grim picture – the woman in England subjected to deepfake porn, underpaid Facebook content moderators living with PTSD, Chinese dissidents fighting back against an omniscient state, and gig workers rebelling against the opacity of the algorithms. There are a few positive stories as well, such as the doctor in India's rural hinterlands using AI to detect tuberculosis in her tribal patients.

Divided into 10 chapters, the book makes an inescapable point – the smooth functioning of code depends heavily on a precarious army of poorly paid and invisible workers toiling away in secrecy and on the margins. When we think of AI as learning how to see, speak and recognise things, we are encouraged to overlook the very real human labour behind it. Most of these workers are from the Global South or from immigrant communities, and they 'train' the AI by tagging and labelling images. They include people like Ian from the shanties of Nairobi, who tags images for driverless cars so that the AI running them can see better, and Hiba, an Iraqi refugee in Bulgaria, who labels images of simple objects like roads, pedestrians, kitchens and living rooms. There is exploitation written into the code as well. Murgia narrates the story of Armin Samii, who found that UberEats was paying him, and many others, less than it should for the distance they travelled to deliver food. Samii, a computer scientist himself, built a freely available app called UberCheats so that drivers could check whether they were being underpaid for their rides. Despite efforts like Samii's, Big Tech's algorithms remain resolutely outside public and governmental scrutiny. Governments, in fact, are playing catch-up, buying into the ideology of code-dependence across departments, from welfare policies to digital policing.

This means that code-dependence transforms not just our relations with our employers and with cloud capitalists, but our relations with our governments as well.

From predictive policing to facial recognition and ubiquitous surveillance, Murgia narrates how state power starts to malfunction in terrifying ways when it begins to treat citizens as nothing more than data points. Predictive policing creates a nightmare for 14-year-old Damien, the son of immigrants in Amsterdam, when his name is included in the 'Top400', an algorithmically generated list of teenagers deemed at risk of becoming criminals in the future. Facial recognition and surveillance turn the entire Uyghur population of Xinjiang into lab rats, monitored for even the slightest change in their facial expressions, their lives literally resembling a video game, but one with real-life consequences. In Argentina, zealous government officials try and fail to 'solve' teenage pregnancy among indigenous populations, creating a digital welfare state that views social problems through the lens of 'objective' data, without noticing that the production of data is never really objective but colours and reinforces pre-existing biases and prejudices.

More than a technological revolution, Murgia's book alerts us to the fact that artificial intelligence affects above all how we relate to ourselves, to each other, and to our societies. The contours of these transformations are yet to be fully mapped out. What is clear, however, is that when data becomes the embodiment of value and code becomes its general equivalent, social relations, among which one can count class relations, are not just inverted but effaced. Questions about AI ethics, as raised by Murgia in her conclusion, are all well and good. But both the ethical and the technological perspectives on AI effectively erase how code-dependence is essentially a political problem. To recognise this is to recognise that our code-dependence is just another expression of our sometimes messy, often unhealthy, but essentially unavoidable co-dependence on each other.

Huzaifa Omair Siddiqi is assistant professor of English, Ashoka University.
