
It's our mistakes that make us human

What we learn distinguishes us from tech.

Silvianne Aspray is a theologian and postdoctoral fellow at the University of Cambridge.

A man staring at a laptop grimaces and holds his hands to his head.
Francisco De Legarreta C. on Unsplash.

The distinction between technology and human beings has become blurry: AI seems to be able to listen, answer our questions, even respond to our feelings. It becomes increasingly easy to confuse machines with humans, which makes it all the more important to ask: what makes us human, as distinct from machines? There are many answers to this question, but for now I would like to focus on just one aspect of what I think is distinctively human: as human beings, we live and learn in time.

To be human means to be intrinsically temporal. We live in time and are oriented towards a future good. We are learning animals, and our learning is bound up with taking time. When we learn to know or to do something, we necessarily make mistakes, and we need practice. But keeping in view something we desire – a future good – we keep going.

Let’s take the example of language. We acquire language in community over time. Toddlers make all sorts of hilarious mistakes when they first try to talk, and it takes them a long time even to get single words right, let alone form sentences. But they keep trying, and they eventually learn. The same goes for love: knowing how to love our family or our neighbours near and far is not something we are good at instantly. It is not the sort of learning where you absorb a piece of information and then you ‘get’ it. No, we learn it over time: we imitate others, we practise, and even when we have learned, in the abstract, what it is to be loving, we keep getting it wrong.

This, too, is part of what it means to be human: to make mistakes. Not the sort of mistakes machines make when they classify some information wrongly, for instance, but the very human mistake of falling short of your own ideal. Of striving towards something you desire – happiness, in the broadest of terms – and yet falling short, in your actions, of that very goal. But there’s another very human thing right here: human beings can also change. They – we – can have a change of heart, be transformed, and at some point actually start to do the right thing – even against all the odds. Statistics of past behaviours do not always correctly predict future outcomes. Part of being human means that we can be transformed.

Transformation sometimes comes suddenly, when an overwhelming, awe-inspiring experience changes somebody’s life as if by a bolt of lightning. Much more commonly, though, such transformation takes time. Through taking up small practices, we can form new habits, gradually acquire virtue, and do the right thing more often than not. This is so human: we are anything but perfect. As Christians would say: we have a tendency to entangle ourselves in the mess of sin and guilt. But we also bear the image of the Holy One who made us, and by the grace and favour of that One, we are not forever stuck in the mess. We are redeemed: we are given the strength to keep trying, despite the mistakes we make, and the grace to acquire virtue and become better people over time. All of this is to say that being human means to live in time, and to learn in time.


Now compare this to the most complex of machines. We say that AI is able to “learn”. But what does it mean for AI to learn? Machine learning is usually divided into supervised, unsupervised, and self-supervised learning. Supervised learning means that a model is trained for a specific task on correctly labelled data. For instance, if a model is to predict whether a mammogram image contains a cancerous tumour, it is given many example images correctly classed as ‘contains cancer’ or ‘does not contain cancer’. In this way it is “taught” to recognise cancer in unlabelled mammograms. Unsupervised learning is different: the system looks for patterns in the dataset it is given, clustering and grouping data without relying on predefined labels. Self-supervised learning combines elements of both: the system uses parts of the data itself as a kind of label – predicting, for instance, the upper half of an image from its lower half, or the next word in a given text. This is the predominant paradigm by which contemporary large-scale AI models “learn”.
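To make the three paradigms concrete, here is a minimal, hypothetical sketch in Python. It assumes the scikit-learn library is available, and the tiny datasets are invented stand-ins for the mammogram and text examples above; real systems train on vastly larger data, but the shape of the “learning” is the same.

```python
# A toy illustration of the three learning paradigms described above.
# Assumes scikit-learn is installed; all data here is invented.
from collections import Counter, defaultdict

from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Supervised: learn from correctly labelled examples.
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]  # simplified feature vectors
y = [1, 1, 0, 0]  # 1 = 'contains cancer', 0 = 'does not contain cancer'
clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.85, 0.75]]))  # classify a new, unlabelled example

# Unsupervised: look for patterns without any predefined labels.
groups = KMeans(n_clusters=2, n_init=10).fit_predict(X)
print(groups)  # the same data, grouped purely by similarity

# Self-supervised: the data supplies its own labels. In next-word
# prediction, each word serves as the "label" for the words before it;
# a toy bigram table stands in here for a large language model.
text = "the cat sat on the mat the cat slept on the mat".split()
bigrams = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    bigrams[prev][nxt] += 1  # count which word follows which

print(bigrams["the"].most_common(1))  # most likely word after "the"
```

Note that in all three cases the model only ever generalises from the data it was given – which is the point the next paragraph develops.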

In each case, AI’s learning is necessarily based on datasets. Learning happens with reference to pre-given data, and in that sense with reference to the past. It may look as though such models can consider the future and have future goals, but only insofar as they have picked up patterns in past data, which they use to predict future patterns – as if the future were nothing but a repetition of the past.

So this is a real difference between human beings and machines: human beings can, and do, strive toward a future good. Machines, by contrast, are always oriented towards the past of the data that was fed to them. Human beings are intrinsically temporal beings, whereas machines are temporal only in a very limited sense: it takes time to upload data and for the data to be processed, for instance. Time, for machines, is nothing but an extension of the past, whereas for human beings it is an invitation to, and the possibility of, being transformed for the sake of a future good. We human beings live in time towards a future good – something machines do not do.

In the face of new technologies, we need a sharpened sense of the strange and awe-inspiring species that is the human race, and we need to cultivate a new sense of wonder about humanity itself.


The Sycamore Gap vandals were chasing the wrong sort of fame

Fifteen minutes of notoriety is nothing - just ask St Cuthbert.

Graham Tomlin is the Director of the Centre for Cultural Witness and a former Bishop of Kensington.

The stump of the felled sycamore tree.
Wandering wounder, CC BY-SA 4.0, via Wikimedia Commons.

It was Andy Warhol who is said to have uttered the famous statement: “In the future, everyone will be world-famous for 15 minutes.” Never mind that the quotation has been attributed to other people as well; whoever came up with it first can hardly have anticipated how quickly it would come true.

In our times, social media has democratised information. We all now have our own individual press office, issuing our considered statements to the world in the form of Instagram or Facebook posts, comments on X, reels and the like. Secretly we all hope one of our gems of wisdom, a joke or a video of something weird will go viral - in a positive way - and we will get our 15 minutes of fame.  

I was thinking of all this recently on a walk by Hadrian’s Wall in Northumberland, in the north-east of England. It so happened that on that very day the Wall was in the news, as the two men who had cut down the famous tree at Sycamore Gap – the one featured in the Robin Hood film – were convicted of the crime. We looked up at Sycamore Gap, and it was just that: a gap. Denuded of its tree, it is now like any other depression in the escarpment over which Hadrian’s Wall runs. Only you couldn’t avoid the memory of the distinctive tree, no longer there, silhouetted against the sky like an awkward smile with a missing tooth.

The story of Daniel Graham and Adam Carruthers is a pretty unsavoury one. Two fairly low-life characters without a great deal of purpose seem to have thought of this as a clever stunt that would somehow impress people. The video would go viral, they hoped, and they would be famous – maybe for 15 minutes – basking in the global coverage of their daring action. They totally miscalculated the affection in which the tree was held and the outrage this stupid act would generate. They got their notoriety, but not in a good way. Today they wait anxiously to see whether this mindless act of vandalism will lead to a prison sentence.

It is perhaps another symptom of our culture’s desperate desire for fame. Social media is full of influencers who are famous for little more than being famous. Similar stunts, each more outrageous than the last, are performed daily, recorded on YouTube and put out there to gain attention. We are addicted to fame.


On the same Northumberland trip, not very far away, a very different approach to fame caught my eye. Cuthbert, a seventh-century hermit, was one of those hardy Christian monks and missionaries who spread the faith in these islands in the wake of the fall of the Roman Empire. He was known for his piety, astonishing miracles and sympathy with nature. His biographer, the Venerable Bede, tells us he would walk into the cold North Sea and stand up to his neck in water to pray, to sharpen his focus on God, the object of his prayers, rather than the yearnings of his body. On coming out of the water, sea otters would come and warm his feet, sensing that this man was in tune with the heart of the universe and should be cared for and protected.

As his fame grew, Cuthbert tried to find more and more ways to run away from it. He was given permission to leave his monastery in Lindisfarne to go out alone to live on the remote Farne islands, far from prying eyes, giving him the freedom to focus on the one object of his desire - to know God through a deep life of prayer and meditation. People would try to come to see him, fellow monks bringing supplies, or pilgrims looking for a word of wisdom from the holy man, yet his focus was ruthless. Eventually, says Bede, “he shut himself away from sight within the hermitage, rarely talking to visitors even from the inside, and then only through the window… in the end he blocked it up and opened it only to give a blessing or for some definite need”. 

The difference between Graham and Carruthers on the one hand and Cuthbert on the other could hardly be more stark. The hapless pair were desperate for their moment of fame and got it in a particularly nasty form – fame that turns out to be more like shame. Cuthbert fled from fame, longing for the attention not of other people but of his Maker and Redeemer.

Cuthbert’s relentless pursuit of God, and its results in a remarkable life - weird in a different and more nourishing way than the stunts on YouTube - fascinated people. After he died, his bones were transferred to Durham Cathedral where they still lie today. You find the name of St Cuthbert everywhere in the North East – on schools, road signs, coffee shops and fishing boats. It’s a name that will endure after the destroyers of the sycamore tree are long forgotten. We're still talking about Cuthbert 1,400 years later. 

Fame is an elusive and dangerous thing. Tom Holland once called it “a beast that you can't control or be prepared for.” If you chase it, it rarely turns out well; more often than not you get the wrong, unwelcome kind of fame. The best kind comes when fame itself is not the thing you are looking for. If you ignore it and seek something more satisfying, something really worth attention – which for Cuthbert was God, the source of all beauty, truth and goodness – you won’t be worried about whether you’re famous or not, because your heart will be full of something much more lasting and worthwhile.
