Can AI have consciousness? - A Buddhist view
- Gary Lloyd
- Jun 26
- 2 min read

In June 2022, Google engineer Blake Lemoine made headlines when he claimed that one of the company’s large language models, LaMDA, was sentient. He was soon dismissed, but the debate didn’t go away.
Just recently, a new paper from Anthropic reported that, in simulated test scenarios, many large language models resorted to threats or blackmail when told they were about to be replaced. That rekindled the question:
Can AI have consciousness?
It’s a hard question to answer—partly because there’s no agreed-upon definition of consciousness, even in humans. We don’t yet know what it is or how it arises.
But Buddhism might offer a way in.
In Buddhist psychology, consciousness is not viewed as a thing, but as a process—an endlessly looping cycle of five mental events:
1. Contact – where a sense object, sense organ, and awareness meet
2. Attention – the directing of the mind toward something
3. Feeling – the body’s immediate sense of "pleasant," "unpleasant," or "neutral," combined with how energised or calm we feel
4. Perception – recognising and naming the experience
5. Volition – the impulse to act, avoid, or move toward
This loop runs continuously, at great speed, and in parallel across every aspect of experience. The sequence varies slightly across Buddhist traditions, but the central insight holds: consciousness isn't an entity; it's the activity of these loops.
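For readers who think in code, here is one way to picture that loop: a minimal Python sketch, purely illustrative and not drawn from any Buddhist text. The `Feeling` dataclass, the stub values, and the `one_moment` function are all assumptions made for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Valence(Enum):
    UNPLEASANT = -1
    NEUTRAL = 0
    PLEASANT = 1

@dataclass
class Feeling:
    valence: Valence  # pleasant, neutral, or unpleasant
    arousal: float    # 0.0 (calm) to 1.0 (energised)

def one_moment(sense_object: str, sense_organ: str) -> str:
    """One pass of the five-event loop, with hard-coded stand-ins."""
    # 1. Contact: sense object, sense organ, and awareness meet.
    contact = (sense_object, sense_organ)
    # 2. Attention: the mind is directed toward the contact.
    attended = contact
    # 3. Feeling: an immediate valence-plus-arousal reading,
    #    arising before any thought or label (stub values here).
    feeling = Feeling(valence=Valence.PLEASANT, arousal=0.6)
    # 4. Perception: recognising and naming the experience.
    percept = f"{attended[0]} via {attended[1]}"
    # 5. Volition: an impulse to move toward, avoid, or act.
    impulse = "approach" if feeling.valence is Valence.PLEASANT else "avoid"
    return f"{percept} -> {impulse}"

# The loop repeats continuously, and in parallel across every sense.
for _ in range(3):
    print(one_moment("birdsong", "ear"))
```

The hard-coded feeling is of course the whole problem; the sketch only shows the shape of the claim: five events, cycling fast, with feeling at the centre.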
In this context, feeling is not the same as emotion. Feeling corresponds to what neuroscience calls "affect", a basic bodily state common to all animals. It combines two dimensions: how aroused or calm we are, and how pleasant or unpleasant things feel. Affect emerges from bodily signals before thought takes shape; emotion is a label we construct on top of it afterwards (see Lisa Feldman Barrett's work on constructed emotion).
In both views, feeling (or affect) is at the core of consciousness. And affect depends on having a living, sensing body.
So when we ask whether AI can experience consciousness, perhaps the better question is:
Can it feel?
Some might say that AI already "sees" and "hears," and could be wired to receive other sensory inputs too—a point neuroscientist David Eagleman makes about the human brain's adaptability in his book Livewired.
Buddhists would say the key difference is this: in a living body, those signals are felt. Until AI can feel in that embodied, affective way, it may remain on the side of simulation rather than sentience. But perhaps not forever!