AI is a machine for mining humans. Guess where the coalface is…

Three research leads at the Fairwork project have written a book on how AI functions as a machine to extract value from human labour… and direct the flow of that wealth upwards.

As the authors note in an interview:

The extraction machine for us is a metaphor that allows us to think much more about whose labour, whose resources, whose energy, whose time, went into that process. The book is an attempt to go from this surface level appearance of a sleek webpage or the images of neural networks, to actually look at the embodied reality of when this comes to your workplace, what does AI look like and how does it interact with people?

[I] think a lot of people would be surprised to learn that 80% of the work behind AI products is actually data annotation, not machine-learning engineering. And if you take the example of an autonomous vehicle, one hour of video data requires 800 human hours of data annotation. So it’s an incredibly intensive form of work.

https://www.theguardian.com/technology/article/2024/jul/06/james-muldoon-mark-graham-callum-cant-ai-artificial-intelligence-human-work-exploitation-fairwork-feeding-machine

This is something that I highlighted in God-Like, and have been very keen to surface for people interested in the wider impacts of AI. These systems have major environmental issues with the carbon cost of powering them, and also major social justice issues with the kinds of labour that they depend on. The age-old colonial problem of wealth extraction is happening again, and again these are impacts that are too often hidden from those benefiting in the West.

One of the major changes with GPT-4 was that it went beyond text to be able to handle images. There is another evolutionary dimension here. It is one thing to be able to read and write; another to be able to ‘see’ the world, and be able to paint or draw.

But OpenAI had a problem. How do you train an AI to recognise images of torture, of child abuse, of bloody violence and extreme pornography? Their solution: you hire huge numbers of people to look at the images, and flag them, telling the AI what it is that it is seeing. But how then do you afford to employ these people, especially when in 2020 Facebook was forced to pay $52m to US content moderators suffering PTSD? Their answer: you go to Kenya. You pay people less than $2 per hour. Perhaps you presume — perhaps because of some unconscious bias — that they won’t experience pain or trauma in the same way as the white Westerners who wrote Wikipedia. Digital sweatshops in the global south. Huge amounts of cheap, grim human labour to make the miracle machines give better results and consume more jobs, more to line the pockets of a few huge companies.

God-Like, p.94.

As you may have picked up, Tripp Fuller and I have been conducting expert interviews around the themes in the book, as part of a series of conversations to help people understand the issues in play.

As part of that, I’ve sought out voices from the developing world to speak to the problem of off-shoring labour-intensive and often traumatising data annotation to poorly paid workers in digital sweatshops, often in East Africa. It’s frustrating that we’ve not yet managed to find a time to get something recorded – partly because of local factors that have rightly pulled time and resources away from people in the region – but be assured that the line-up we’ve invited has been, from the outset, committed to keeping a balance of voices!

Feeding the Machine is out next week. Get signed copies of God-Like exclusively from my local, Bookseller Crow.