
There’s more than one way to lose our humanity

How we treat immigrants and how AI might treat humans weighs on the mind of George Pitcher.

George is a visiting fellow at the London School of Economics and an Anglican priest.

A grey multi-story accommodation barge floats beside a dock.
The Bibby Stockholm accommodation barge in Portland Harbour.
Ashley Smith, CC BY-SA 4.0, via Wikimedia Commons.

“The greatness of humanity,” said Mahatma Gandhi, “is not in being human, but in being humane.” At first glance, this is something of a truism. But actually Gandhi neatly elides the two meanings of humanity in this tight little phrase. 

Humanity means both the created order that we know as the human race and its capacity for self-sacrificial love and compassion. In the Christian tradition, we celebrate at Christmas what we call the incarnation – the divine sharing of the human experience in the birth of the Christ child.  

Our God shares our humanity and in doing so, shows his humanity in the form of a universal and unconditional love for his people. So, it’s an act both for humanity and of humanity. 

This Christmas, there are two very public issues in which humanity has gone missing in both senses. And it’s as well to acknowledge them as we approach the feast. That’s in part a confessional act; where we identify a loss of humanity, in both its definitions, we can resolve to do something about it. Christmas is a good time to do that. 

The first is our loss of humanity in the framing of legislation to end illegal immigration to the UK. The second is the absence of humanity in the development of artificial intelligence. The former is about political acts that are inhumane and the latter goes to the nature of what it is to be human. 


There is a cynical political line that the principal intention of the government’s Safety of Rwanda (Asylum and Immigration) Bill, voted through the House of Commons this week, is humane, in that it’s aimed at stopping the loss of life among migrants exploited by criminal gangs. But it commodifies human beings, turning them into cargo to be exported elsewhere. That may not be a crime – the law has yet to be tested – but it is at least an offence against humanity. 

Where humanity, meaning what it is to be human, is sapped, hope withers into despair. When a human being is treated as so much freight, not only does their value diminish objectively, but so does their sense of self-worth. The suicide of an asylum seeker on the detention barge Bibby Stockholm in Portland Harbour is a consequence of depreciated humanity. Not that we can expect to hear any official contrition for that. 

To paraphrase Gandhi, when we cease to be humane we lose our humanity. And we have literally lost a human to our inhumanity, hanged in a floating communal bathroom. It’s enough to make us look away from the crib, shamed rather than affirmed in our humanity. 

That’s inhumanity in the sense of being inhumane. Turning now to humanity in the sense of what it means to be human, we’re faced with the prospect of artificial intelligence which not only replicates but replaces human thought and function.  


The rumoured cause of the ousting of CEO Sam Altman last month from OpenAI (before his hasty reinstatement just five days later) was his involvement in a shadowy project called Q-star, GPT-5 technology that is said to push dangerously into the territory of human intelligence. 

But AI’s central liability is that it lacks humanity. It is literally inhuman, rather than inhumane. We should take no comfort in that because that’s exactly where its peril lies. Consciousness is a defining factor of humanity. AI doesn’t have it and that’s what makes it so dangerous. 

 To “think” infinitely quicker across unlimited data and imitate the best of human creativity, all without knowing that it’s doing so, is a daunting technology. It begins to look like a future in which humanity becomes subservient to its technology – and that’s indeed dystopian. 

But we risk missing a point when our technology meets our theology. It’s often said that AI has the potential to take on God-like qualities. This relates to the prospect of its supposed omniscience: the potential to be all-knowing and, by extension, all-powerful. 

The trouble with that argument is that it takes no account of the divine quality of being all-loving too, which in its inhumanity AI cannot hope to replicate. In the Christmastide incarnation, God (as Emmanuel, or “God with us”) comes to serve, not to be served. If you’ll excuse the pun, you won’t find that mission on a computer server. 

Furthermore, to be truly God-like, AI would need to allow itself to suffer and to die on humanity’s part, albeit to defeat its death in a salvific way. Sorry, but that isn’t going to happen. We must be careful with AI precisely because it’s inhuman, not because it’s too human. 

Part of what we celebrate at Christmas is our humanity and, in doing so, we may re-locate it. We need to do that if we are to treat refugees with humanity and to re-affirm that humanity’s intelligence is anything but artificial. Merry Christmas. 


Seen in Beijing: what’s it like in a surveillance society?

Cameras and controls remind a visitor to value freedom.
A guard stands behind a barrier across an entrance to a station escalator.
A Beijing station gate and guard.

A recent storyline in The Archers wouldn’t have worked in Beijing. Here, great gantries of traffic cameras see into cars and record who is driving, so a court case that hinged on who was behind the wheel would not play out in months of suspense. The British press periodically runs stories on how much tracking and surveilling we are subject to, while the success of the TV series Hunted showed just how hard it is to evade detection, and how interested we are in the possibility—but how often do we stop to think about the tensions inherent in the freedoms we enjoy? 

It is difficult to explain to someone in China just how free life in Britain is, and equally difficult to convey to people at home how precious, and conducive to social good, that freedom is. Take my recent experiences. Prior to entry at Beijing airport, I was randomly chosen for a health check and required to give a mouth swab. This may have been a benign Covid testing programme, but it was impossible to tell from the questions on screen we had to answer—and a mouth swab certainly hands DNA to the authorities. At the university where I was studying, face scans are required for entry at every gate, and visitors must be registered with state ID in advance. Despite not having been in China since prior to Covid restrictions, my face had been pre-programmed into the system and an old photograph flashed up on screen as the barriers opened.  

The first time I used a rental bike to cycle back to campus (the local Boris-bikes come on a monthly scheme, linked to a registered phone number), a message flashed up on my phone telling me that I had gone the wrong way down a one-way bike lane. The banner appeared twice, and the system would not let me lock the bike until I had acknowledged my error. The fact that the GPS system tracks the bikes so closely that it knew I had gone against the traffic flow for a couple of hundred metres to avoid cycling across a four-lane street was a surprise. Since that phone is registered to a Chinese friend, such infractions are also potentially a problem for him. What was less surprising is the systemic nature of China’s ability to track its people at all times. 


No one uses cash in cities in China; in many outlets and places cash isn’t even accepted. Everyone uses apps like WeChat or Alipay to pay for goods—even at food trucks and casual stalls the vendor has a machine to scan a phone QR code. WeChat is WhatsApp, Facebook, a bank debit card, a travel service and a news outlet rolled into one; Alipay, its only effective rival, offers similar. To obtain either account, a phone number is needed—and all numbers have to be registered. And to pay for anything, a bank account in the name of the individual must be linked to the account. In other words, the government can choose to know every purchase I make, and its exact time and place. A friend who works in a bank says he uses cash where possible because he doesn’t want his colleagues in the bank to see what he’s been buying. 

Transport is also heavily regulated. To enter a train station, a national ID card is needed, which is scanned after bags are x-rayed. To purchase a high-speed train ticket, a national ID card—or passport for foreigners—is required. It might be possible to purchase a ticket anonymously in cash from a ticket window outside the station for an old-fashioned slow train, but one would still need an ID card corresponding to the face being scanned to make it to the platform—and the train station has, of course, cameras at every entrance and exit. 

Cameras are pervasive. Walking through my university campus, where every junction has three or four cameras covering all directions, I occasionally wonder where students find space to have a quick snog. The only place I have not yet noticed cameras is the swimming pool changing rooms, which are communal, and in which I am the only person not to shower naked. There are cameras in the church sanctuary, and cameras on street crossings.  


Even when not being watched, out in the countryside, the state makes its presence felt. On a recent hike in the hills, our passage triggered a recording every few hundred metres: “Preventing forest fires is everyone’s responsibility.” Once or twice is common sense; ten or twenty times a stroll is social intrusion. One can, of course, learn to ignore the posters, the announcements, the security guards on trains playing their pre-recorded notices as they wander up the aisles, and the loudspeaker reminders that smoking in the toilets or boarding without a ticket would affect one’s social credit score and imperil future train travel—but white noise shapes perception.  

As a (mostly) upright citizen, there are many upsides to constant surveillance. People leave their laptops unattended on trains, since they will not be stolen. Delivery packages are left strewn by the roadside or by a doorway: anyone stealing them will be quickly found. There is almost no graffiti. I can walk around at night safe in the knowledge that I am exceedingly unlikely to be a victim of petty theft, let alone knife or gun crime. Many Chinese have horrified tales of pickpockets in European cities or crime rates in the UK, while young friends are so used to the state having access to phone data and camera logs that they barely notice. Most Chinese I know are very happy with the trade-off of surveillance for safety—and the longer I spend in Beijing, the more appealing that normality seems. 

To those who have lived outside, however, the restrictions make for a more Orwellian existence. Any church group wanting to hold an online service must apply for a permit. A friend was recently blocked from his WeChat account for a period after using a politically sensitive term in a family group-chat. Not being able to access certain foreign websites, search engines or media (no Google, no WhatsApp and no Guardian without an illegal virtual private network) might be an irritation for a foreign resident but means a lifetime of knowingly limited information for a citizen. Not being able to access information freely means, ultimately, not being able to think freely, a loss that cannot be quantified. The elite can skip over the firewall, but many cannot.  

We have seen the dangers recently in the UK of limited information flow, and of social media interference by hostile players. Imagine never being able to know whether the information you are receiving is trustworthy—or being constantly reminded by human overseers that your activity in person and online is both seen and heard. Christians may believe in the benevolent and watchful gaze of God—but are rightly wary of devolving that omniscience to fellow humans.