Snapchat is taking your face, and Facebook is taking your chat.

Two things caught my eye this week around how big social media companies are looking to harvest content to train AI models.

As a sidebar, it’s useful to remember that social media companies really, really don’t give a crap about community, friendship, being social or any of the rest of it. They are data-mining companies, and we are the field from which they extract resources in ever more sophisticated ways to sell on. It’s important to know who you’re dealing with.

So first up, Facebook are going to mine your posts and content to build their AI models.

And guess what, they’re trying to convince you that it’s a brilliant, generous and ‘British’ thing they’re doing. LOL.

“We can bring AI at Meta products to the UK much sooner, and that our generative AI models will reflect British culture, history and idiom.”

https://about.fb.com/news/2024/09/building-ai-technology-for-the-uk-in-a-responsible-and-transparent-way/

They’re also keen to trumpet that, “we’ve engaged positively with the Information Commissioner’s Office (ICO) and welcome the constructive approach that the ICO has taken throughout these discussions.”

Except, the ICO have come out and said… ermm… not so fast, Facebook.

The ICO slapped them down over their previous attempt at this, making them stop and change course. So they’re hardly in a position to be claiming they’re best-in-class.

In short: go change your settings and tell FB they can’t use your data.

Secondly, and this is more pernicious: Snapchat have a new feature that is genuinely creepy. By default, they reserve the right to create an AI-generated version of your face and then use that face in ads.

This is the grim end of the wedge that I spoke about in God-like. Because these data companies know so much about us, they are able to target ads very, very precisely. And now, it seems, you will see your friends’ AI-generated faces telling you to buy stuff. That, in my book, is wrong.

The wider issue here is that people are looking for new and slightly hushed ways to extract value using AI, because there is a desperate need to see a return on the massive sums that have been invested. And our faces, our words… every aspect of our person… is a raw material that they will take and put to that use. Be clear: this will be presented as them doing us a favour. It won’t be. There is no social. There is only a huge hunger for more and more data.

Check yo’ settings, people. Then buy a book 😉