Hallucination: an ancient and modern concern

Photo by Joe Mabel. Used under CC Licence

I’ve been asked a few times about how the work I did in GETTING HIGH links to the issues raised in the new book GOD-LIKE, and I wanted to answer that with some thoughts around the issue of hallucination.

GETTING HIGH plots the history of the human quest for flight, mainly through the twin prisms of two of the most powerfully-fuelled trips of the past century: the Apollo missions to the moon and the LSD counterculture.

The full subtitle of the book is A Savage Journey to the Heart of the Dream of Flight which, if you hadn’t clocked already, is a nod to Hunter S. Thompson’s Fear and Loathing: A Savage Journey to the Heart of the American Dream. The analysis of that book that I offer tries to pull it away from the hedonistic binge-lolz interpretation favoured by students zoned out on college campuses, and to reveal the book as an expression of sorrow and frustration at the failure of the 60s counterculture to deliver on its promises.

And that hope had been astronomical. It’s hard to understand now, because the haze of recreational party drug use and the chronic blight of addiction – from heroin to crack to new opioids – make it almost impossible to see the naive and unencumbered view of psychedelics that dominated in the 50s and 60s. They really were seen as a great hope for world peace, and for unlocking the potential of human creativity and mind-expansion. Go look it up: the Pope took it.

But that came crashing down very violently, and Hunter Thompson – a very sensitive person and writer – felt it acutely. Fear and Loathing was his angry tirade about those failures. It is about his disillusionment.

To get further into that word, you’ll need to read Getting High, but here I want to link it to the concept of hallucination, which ties together the issues in Getting High and God-Like.

The word comes from the Latin ‘alucinor’: to roam. As I note in God-Like:

There are elements of the dream here, but more of a sense of the wandering being guided by some other force. The technology of a drug, the trance state induced by a ritual. Our reality becomes overcome by some brighter luce, some more powerful projection.

The point I make is that this is an odd reversal of what we see in Plato’s cave, where it is the projected reality that is something to be escaped, rather than something that offers us escape.

What we see in the 60s, though – empowered by the writings of Huxley and others – is that the dream-like mental roaming of the acid hallucination came to be considered brighter than our reality, offering more than could be had in the crude light of day. So the hallucinatory LSD trip became considered more truthful, its messages something akin to prophecies from on high.

And yet… it did not deliver great breakthroughs in world peace. It did not aid the civil rights struggle. Instead, it absented people’s bodies from engaging with those important, embodied, political issues.

In fact, it went beyond that: the dream became a total nightmare… became a source of fear and of loathing. As HST puts it:

What sells, today, is whatever Fucks You Up—whatever short-circuits your brain and grounds it out for the longest possible time.

Fear and Loathing

And so to the problem of AI, and its own hallucinations.

And the first thing to say is that people saw all this on a continuum: from the space race to the LSD counterculture to computers… it was the same mission, unfolding from one hope to the next. Stewart Brand – a key figure from the period who was all over the space race, was right there in Ken Kesey’s Merry Pranksters, and was then involved in the early computer movement – put it like this back then:

Drugs proved to be self-limiting, but computers proved to be infinitely self-enhancing and biotech has the same quality.

In short: this was the new great hope. Going to the moon hadn’t solved our problems in the way people had hoped, and nor had new drugs… maybe computers would! And again, go back to the early sources and you see the same HUGE hopes for this latest technology: an augmentation of our selves that would fill in our failings and lift us to perfection.

But what we’ve seen with AI is that, rather than being a true light to guide us to truth, it… hallucinates. The Cambridge Dictionary chose ‘hallucinate’ as its word of the year for 2023:

“When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”

If you’ve been to one of ‘those’ parties, or seen footage of them from films about the 60s/70s, you’ll know the scene: the charismatic figure, tripping on something, rapping some great proclamation about the insights they are having… people around nodding, saying ‘yeah man,’ the whole thing seeming so deep. And yet… in the cold light of day, it’s utter bullshit, drawing on ‘facts’ that don’t exist, and coming to conclusions that make no sense.

And this is what we’ve seen AI systems do: you ask it a question, and it’s not just that it comes back with an incorrect fact, it comes back with a whole off-its-face spiel to support this bullshit, with made-up references. But then other AI systems can start to cite these hallucinations themselves, and so the untruth gains solidity through multiple sources. As I put it in God-Like:

But this is more than a Google search coming up with a wrong answer. It is the wise owl in full bullshit flow, holding court in the forest, spouting untruths as bold as brass, backing them up with false references. It is the other owls listening and recording what the owl says, and then later – when asked a similar question – referencing the wise owl’s words to add weight to their own. And so the hallucination gains its own light, starts to brighten in ways that make the actual truth a little harder to see.

This is the real issue with AI as a hallucinatory technology: it is generating and sustaining these false light-sources and we, moth-like since birth, cannot help but be drawn to them.

The problem that HST identified was that the state of hallucination was deep-set in the American Dream: a Great Light to guide the lost soul, a Higher Power to take away our responsibilities and tell us what to do. And – as with the fundamentalist religious version of this, as with the crazed cults, as with the drugs – what began in inflated hope ended in fear and loathing. Ended in disillusionment: the illusion destroyed (again, check Getting High for how Freud dealt with this).

My prediction: we’ll see elements of AI disillusionment very soon. But we’ll also see a much more aggressive push to have AI seen as the One True Light because, unlike the LSD counterculture, this is a hallucinatory technology that may have begun in the idealism of the 50s and 60s but is now driven by aggressive capital.

Either way, our task is the same: to be reflective enough to understand the illusion. And the only way to do that is to talk. To be able to call out the BS with people we trust.

Get a copy of God-Like here.

And Getting High here.

You should also subscribe to the Process This Substack, where Tripp Fuller and I are digging into the details of AI with world-leading experts.