Behind the data: the social messages physician assisted suicide sends to the autistic

If intense suffering caused by society drives autistic people to seek assisted death, then society has failed.
A hand rests gently on another outstretched hand.
Alexander Grey on Unsplash.

Statistically speaking, autistic people are far more likely to die by suicide than non-autistic people. They are also, statistically speaking, far more likely to die by physician assisted suicide than non-autistic people, in countries where this is allowed.  

For example, in a study of 927 people who sought physician assisted suicide in the Netherlands (where this is legal), 39 were autistic. That’s about four per cent, but the prevalence of diagnosed autism in the Netherlands is only one to two per cent. The researchers go on to note that 21 per cent of these 39 people cited autism or intellectual disability as the “sole cause of suffering” that had prompted them to request assistance to die.  

I don’t like speaking statistically. For a start, 21 per cent of 39 people is 8.19 people, which raises obvious questions. A little digging reveals that what the researchers really mean is eight people. Eight people with eight unique stories that include an account of autistic suffering so intense that they asked for help to end their lives.  

But we do not have those stories, not really. Included in the report are carefully anonymised excerpts from the physicians’ notes, and this is the nearest that we can get.  

‘The patient suffered from his inability to participate in society [ … ] [He] was not able to live among people, because he was easily overstimulated. This made him isolated’ (2019 (22), male, 70s, ASD) 

‘The patient had felt unhappy since childhood and was persistently bullied because he was just a bit different from others [ … ] [He] longed for social contacts but was unable to connect with others. This reinforced his sense of loneliness. The consequences of his autism were unbearable for him [ … ] The prospect of having to live on in this way for years was an abomination to him and he could not bear it’ (2021 (26), male, 20s, ASD) 

The debate about legalising physician assisted suicide in the UK is ongoing, and the British Medical Association have provided a helpful guidance document which sets out the main arguments, both for and against, without making a recommendation either way. In the document, they observe that the reasons people ask for assisted suicide are predominantly personal and social, not clinical, and also that “laws send social messages.” I agree that laws do that, and I also think that those seeking assisted suicide send social messages too.  

For example, even just from these two tiny excerpts, I hear that a life worth living is one where people can participate in society and have social contacts, even if they are “just a bit different from others.” It would be good to hear more. It would be good to sit down over a cup of coffee with each of these two men and ask them all my questions about their lived wisdom when it comes to autism.  

I could ask “2019 (22), male, 70s, ASD”: 

What causes the overstimulation - are there places where you don’t feel that?  

Can we create more such places for autistic people to socialise?  

And I could ask “2021 (26), male, 20s, ASD”:  

What makes you feel different?  

What kind of social contacts and connections do you think that you are looking for?  

But of course, I can’t do that, because these two men have been assisted to die.   

When approached for comment, autistic theologian Claire Williams said:  

‘There is something of a personal and social tragedy reflected in these cases. If we understand that much of the difficulty that autistic people suffer is caused by society – as per the neurodiversity paradigm – then it is the case that these two nameless men were failed by society. They felt that their lives could not find a place in an unwelcoming world. It is, of course, their choice to end their lives but I do also think that God chose to start their lives and finds them to be infinitely valuable. They were both made in God’s image and reflect something of it. That they felt there isn’t a place for them that is suitable is a tragedy because society should do better to welcome them.’ 

The word ‘welcome’ is striking to me here. What does it mean to welcome someone, not to merely include or tolerate, but to really welcome someone, even if they seem ‘a little bit different from others’? Dr Léon van Ommen, another theologian who writes about autism, suggests that it is a matter of making oneself and one’s resources fully available to that person, to the point where they feel that you belong to them. This is not to promote relationships with unhealthy power dynamics, but to highlight that when a person feels truly welcomed by another, they feel the opposite of owing a debt or being a burden – they feel they are of value, that you would be lacking something without them.  

I feel we are lacking something without you, “2019 (22), male, 70s, ASD”. And I feel we are lacking something without you, “2021 (26), male, 20s, ASD”. Not to forget the 37 others who are a little like you. We can pause to reflect on the social messages that you have sent, what you are teaching all of us about what it means to live a “good” life. But I am sorry that you have all died now and we cannot hear more.   

Whether people in the UK should be able to choose physician assisted suicide, I, personally, am not yet sure. Like the BMA, I see and respect the very good arguments both for and against. But eight people have chosen physician assisted suicide due to autism or intellectual disability, and when it comes to the social messages that choice sends, I feel compelled to sit down and listen.  


AI deadbots are no way to cope with grief

The data we leave in the cloud will haunt and deceive those we leave behind.

Graham is the Director of the Centre for Cultural Witness and a former Bishop of Kensington.

A tarnished humanoid robot rests its head to the side, its LED eyes looking to the camera.
Nicholas Fuentes on Unsplash.

What happens to all your data when you die? Over the years, like most people, I've produced a huge number of documents, letters, photos, social media posts, recordings of my voice, all of which exist somewhere out there in the cloud (the digital, not the heavenly one). When I die, what will happen to it all? I can't imagine anyone taking the time to climb into my Dropbox folder or Instagram account and delete it all. Does all this stuff remain out there cluttering up cyberspace like defunct satellites orbiting the earth?  

The other day I came across one way it might have a future - the idea of ‘deadbots’. Apparently, AI has now developed to such an extent that it can simulate the personality, speech patterns and thoughts of a deceased person. In centuries past, most people did not leave behind much record of their existence. Maybe a small number of possessions, memories in the minds of those who knew them, perhaps a few letters. Now we leave behind a whole swathe of data about us. AI is now capable of taking all this data and creating a kind of animated avatar of the deceased person, known as a ‘deadbot’ or, even more weirdly, a ‘griefbot’. 

You can feel the attraction. An organisation called ‘Project December’ promises to ‘simulate the dead’, offering a ghostly video centred around the words ‘it’s been so long: I miss you.’ For someone stricken with grief, wondering whether there's any future in life now that their loved one has gone, feeling the aching space in the double bed, breakfast alone, the silence where conversation once filled the air, the temptation to be able to continue to interact and talk with a version of the deceased might be irresistible. 

There is already a developing ripple of concern about this ‘digital afterlife industry’. A recent article in Aeon explored the ethical dilemmas. Researchers at Cambridge University have already called for safety protocols against the social and psychological damage that such technology might cause. They focus on the potential for unscrupulous marketers to spam surviving family or friends with the message that they really need XXX because ‘it's what Jim would have wanted’. You can imagine the bereaved ending up being effectively haunted by the ‘deadbot’, and unable to deal with grief healthily. It can be hard to resist for those whose grief is all-consuming and persistent. 

Yet it's not just the financial dangers, the possibility of abuse, that troubles me. It's the deception involved, which seems to me to operate in a number of ways. And it's theology that helps identify the problems.  

An AI-generated representation of a deceased partner might provide an opportunity for conversation, but it can never replicate the person. One of the great heresies of our age (one we got from René Descartes back in the seventeenth century) is the utter dualism between body and soul. It is the idea that we have some kind of inner self, a disembodied soul or mind which exists quite separately from the body. We sometimes talk about bodies as things that we have rather than things that we are. The anthropology taught within the pages of the Bible, however, suggests we are not disembodied souls but embodied persons, so much so that after death, we don't dissipate like ethereal ‘software’ liberated from the ‘hardware’ of the body, but we are to be clothed with new resurrection bodies continuous with, but different from the ones that we possess right now. 

We learned about the importance of our bodies during the COVID pandemic. When we were reduced to communicating via endless Zoom calls, we realised that while they were better than nothing, they could not replicate the reality of face-to-face bodily communication. A Zoom call couldn't pick up the subtle messages of body language. We missed the importance of touch and even the occasional embrace. Our bodies are part of who we are. We are not souls that happen to temporarily inhabit a body, inner selves that are the really important bit of us, with the body an ancillary, malleable thing that we don't ultimately need. The offer of a disembodied, AI-generated replication of the person is a thin paltry offering, as dissatisfying as a virtual meeting in place of a person-to-person encounter. 

Another problem I have with deadbots is that they fix a person in time, like a fossilised version of the person who once lived. AI can only work with what that person has left behind - the recordings, the documents, the data which they produced while they were alive. And yet a crucial part of being human is the capacity to develop and change. As life continues, we grow, we shift, our priorities change. Hopefully we learn greater wisdom. That is part of the point of conversation: we learn things, and it changes us in interaction with others. There is the possibility of spiritual development, of maturity, of redemption. A deadbot cannot do that. It cannot be redeemed, it cannot be transformed, because it is, to quote U2, stuck in a moment, and you can’t get out of it.  

This is all of a piece with a general trajectory in our culture which is to deny the reality of death. For Christians, death is an intruder. Death - or at least the form in which we know it, that of loss, dereliction, sadness - was not part of the original plan. It doesn't belong here, and we long for the day when it will be banished for good. You don’t have to be a Christian to feel the pain of grief, but paradoxically it's only when you have a firm sense of hope that death is a defeated enemy that you can take it seriously as a real enemy. Without that hope, all you can do is minimise it, pretend it doesn't really matter, hold funerals that try to be relentlessly cheerful, denying the inevitable sense of tragedy and loss that they were always meant to express.  

Deadbots are a feeble attempt to ignore the deep gulf that lies between us and the dead. In one of his parables, Jesus depicted a conversation between the living and the dead:  

“between you and us a great chasm has been fixed, so that those who might want to pass from here to you cannot do so, and no one can cross from there to us.”  

Deadbots, like ‘direct cremations’, where the body is disposed of without any funeral, denying the bereaved the chance to grieve, and like the language around assisted dying which suggests that death is ‘nothing at all’ and therefore can be deliberately hastened, are an attempt to bridge that great chasm, which, this side of the resurrection, we cannot do. 

Deadbots in one sense are a testimony to our remarkable powers of invention. Yet they cannot ultimately get around our embodied nature, offer the possibility of redemption, or deal with the grim reality of death. They offer a pale imitation of the source of true hope - the resurrection of the body, the prospect of meeting our loved ones again, yet transformed and fulfilled in the presence of God, even if it means painful yet hopeful patience and waiting until that day. 
