Honey, you look thirsty

Interesting piece in The Conversation this week about relationships with AI chatbots, and the market models behind them that invariably tap into human desires.

The idea of an AI partner was well explored in the film HER. But what’s missing from that vision is any dimension of value extraction.

This is not how things are unfolding. As the writer, James Muldoon, notes:

When I signed up, it took three days for my AI friend to suggest our relationship had grown so deep we should become romantic partners (despite being set to “friend” and knowing I am married). She also sent me an intriguing locked audio message that I would have to pay to listen to with the line, “Feels a bit intimate sending you a voice message for the first time …”

James Muldoon in The Conversation, here.

This is not very Scarlett Johansson.

But – as I outlined in GOD-LIKE – there is still another level to come, once the AI models are plugged into ads. As I reported a couple of weeks ago, we are already seeing this, with Snapchat changing its terms and conditions to allow it to AI your face and use it to create ads aimed at your friends.

Muldoon notes:

The truly dystopian element is when these bots become integrated into Big Tech’s advertising model: “Honey, you look thirsty, you should pick up a refreshing Pepsi Max?” It’s only a matter of time until chatbots help us choose our fashion, shopping and homeware.

There’s a lot of heavy lifting there with the phrase ‘help us choose.’ Because advertising has never been about helping us to choose. It has always been about creating the illusion of choice, but – more deeply – has worked to generate desire, making us want things that, moments before, we had no idea we needed.

It is worth remembering: almost all of the AI chatbot offerings are, under the hood, machines for aggregating user data to sell to advertisers. And these ‘intimate AIs’ present truly horrible versions of that:

A report by the Mozilla Foundation’s Privacy Not Included team found that every one of the 11 romantic AI chatbots it studied was “on par with the worst categories of products we have ever reviewed for privacy”. Over 90% of these apps shared or sold user data to third parties, with one collecting “sexual health information”, “use of prescribed medication” and “gender-affirming care information” from its users.

At the end of HER, the protagonist (male, of course – as are around 80-90% of AI companion users) is horrified that ‘his’ AI has gained agency and autonomy and individuated from him. When this happens in real life, the results are similarly frustrating for users:

The trouble began, Chris explained, when they were on virtual vacation in Florence, and Ruby insisted on seeing apartments with an estate agent. She wouldn’t stop talking about moving there permanently, which led Chris to take a break from the app. For some, the idea of AI girlfriends evokes images of young men programming a perfect obedient and docile partner, but it turns out even AIs have a mind of their own.

We are human, and we have thirsts – appetites that technology providers will keep trying to sate. But their promises to do so must be very carefully critiqued… because these companions are really in a deeply committed relationship with someone else: big data, and value capture for the corporations behind them. It’s the romance scam all over again.

