
Tech has changed: it’s no longer natural or neutral

The first in a three-part series exploring the implications of technology.

James is Canon Missioner at Blackburn Cathedral. He researches technology and theology at Oxford University.

A caveman holding a hammer looks at a bench on which are a broken bicycle and a laptop.
Nick Jones/Midjourney.ai.

My son was born in February last year and it seems that every day he is developing new skills, facial expressions and adorable quirks. Just the other day he was playing with some wooden blocks and, when they inevitably fell over, he let out the most delightful giggle. As you can guess, I immediately reached for my phone so that I could capture the moment. Moments like this happen all the time in the life of a modern parent – we want to share them with our spouse, family and friends, or simply keep them for ourselves because they are something we treasure. And yet, in this series of articles, I would like to consider this moment, and the thousands like it that take place in a technological society, and ask: is everything as benign as it seems?

There are two ideas that often come up whenever people talk about technology. The first is that technology is basically ‘neutral’: it only becomes good or bad depending on what you do with it. “Look at a hammer,” someone might say, “there is nothing intrinsically good or bad about this hammer; only the end result is good or bad, depending on whether I’m using it to hit nails or people!” On this reading of technology, the only important questions relate to the consequences of use.

If technology is neutral, then the primary concern for users, legislators and technologists is the consequences of technology, and not the technology itself. The only way to ensure that the technology is used for good is to ensure, somehow, that more good people will use it for good things than bad people will use it for bad things. Often this idea presents itself as a conversation about competing freedoms: very few people (with some important exceptions – see this article from Ezra Klein) are debating whether there is something intrinsically problematic about the app formerly known as Twitter; most discussion revolves around how to maintain the freedom of good users while curtailing the freedom of bad ones.

We assume that these tools of social interaction like Facebook and Instagram are, in and of themselves, perfectly benign. We are encouraged to think this by massive corporations who have a vested interest in maintaining our use of their platforms, and at first glance, they seem completely harmless: what could possibly be the problem with a website in which grandma can share photos of her cat? And while the dark underbelly of these platforms has violent real-world consequences – like the rise of antisemitism and anti-Muslim hatred – the solution is primarily imagined as a matter of dealing with ‘bad actors’ rather than anything intrinsically problematic with the platforms themselves. 

The second idea is related but somewhat different: advocates of modern technology will suggest that humanity has been using technology for as long as there have been humans, and that modern technology is therefore nothing to worry about. “Yes, modern technology looks scary,” someone might say, “but it’s really nothing to worry about – humans have been using tools since the Stone Age, don’t you know!” This view proposes that because hammers are technology, and all technology is the same, there is no difference between a hammer and the internet, or between the internet and a cyborg.

This second idea tends to be accompanied by an emphasis on the slow and steady evolution of technology, and by highlighting the fact that every major technological advancement has had its naysayers decrying the latest innovation. (Even Plato was suspicious of writing when it was invented.) Taken as part of a very long view of human history, the technological innovations of the last 100 years seem a normal and natural part of the evolution of our species, which has always set itself apart from the rest of the animal kingdom by its use of technology.

Steve Jobs gives a good example of this in an interview he gave about the development of the PC:

“I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condors used the least energy to move a kilometer. And humans came in with a rather unimpressive showing about a third of the way down the list… not too proud of a showing for the crown of creation… But then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a human on a bicycle blew the condor away – completely off the top of the charts. 

And that’s what a computer is to me… It’s the most remarkable tool we’ve ever come up with… It’s the equivalent of a bicycle for our minds”  

Notice that Jobs here draws a straight-line comparison between the bicycle and the PC. As far as Jobs is concerned, there is no qualitative difference between the two tools: one is more complex than the other, but otherwise they are just technologies that expand human capacity. “A bicycle for our minds” is a fascinating way to describe a computer because it implies that nothing about our minds will be changed; they’ll just be a little bit faster.

And yet, despite the attempts of thought leaders like Jobs to convince us that modern technology is entirely benign, many of us are left with a nagging suspicion that there is more going on. As a priest in the Church of England, I often have conversations with parishioners and members of the public who are looking for language, or a framework, to describe an instinctive recognition: that something has changed, fairly recently, about the nature of the technology we use and the way it influences our lives; that modern technology is not simply the natural extension of the sorts of tools humans have been using since the Stone Age; and that it is not neutral but has already had an effect on us in significant ways, regardless of how we might use it. How do we respond to articulate and thoughtful people such as Steve Jobs, who make a compelling case that modern technology is neutral and natural?

Thinking back to that moment when my son giggles and I take a photo of him: at first glance it seems completely innocuous. But what resources are available if I did want to think more carefully about that moment, and the many like it that suffuse my daily life? Thankfully there is a growing body of literature from philosophers and theologians who are thinking about the impact of modern technology on the human condition. In the next two articles I would like to introduce the work of Martin Heidegger, outline his criticism of modern technology, and show how he challenges the idea that technology is simply a natural extension of human capacity or a neutral tool.

Heidegger is a complex figure in philosophy and in Western history. There is no getting around the fact that he was a supporter of the Nazi Party, and his politics have been widely condemned, rightly so. Nevertheless, his account of the nature of modern technology continues to provide useful insights. His claim is that modern technology essentially and inevitably changes our relationship with the world in which we live, and even with ourselves. It is this claim, and Heidegger’s suggested solution, that I will unpack in the next two articles.


Theology isn’t just for believers – and that’s the problem

As spiritual curiosity among the young rises, let’s change how they explore it

George is a visiting fellow at the London School of Economics and an Anglican priest.

Quizzical-looking students look across a tutorial to others.
Nick Jones/Midjourney.ai.

The Cambridge don told us calmly but firmly, in answer to a question, probably mine: “You don’t need to have a commitment to study theology here, but it helps.” It was 1972 and I looked around me. I was surrounded by young blokes, dare I say it, of a certain type – tall, pale-though-uninteresting, spotty and a bit chinless. Very much like me, in fact, in those respects, but unlike me, I thought, in one key respect: Blimey, they’re all going to be vicars!

I ran a mile – well, about 100 – to study something more fun at a redbrick, something that was also being studied by young women, which was important for me at that moment. 

Little did I know that I’d take a theology degree some 30 years later, when I trained for priesthood. And, as it happens, very much alongside women, though this column really isn’t about that. 

What it is about is the lingering academic assumption that theology is for the committed, the faithful; that it’s vocational and for people who are called to make a career of it. I wanted to study it as an adolescent only because I was academically interested (yes, how we laugh now). 

I’m with C.S. Lewis when he says faith is either a fraudulent trick or an absolute truth, but can’t be anything in between (“He has not left that open to us”). Either way, I thought, that’s a great story – and a curiosity for stories is what led me to journalism, since the latter choice, absolute truth, seemed to be the matriculation requirement for theology at university. And the idea that theology is for committed Christians still prevails.

It’s been on the news agenda again lately that religious studies at A-level and theology in further education have been collapsing as course choices, even as Gen Z (18-28) has shown an increasing propensity for a return to faith and church-going – the “quiet revival”. 

That apparent paradox may be explained in a number of ways. The current college generation may not equate religious interest with academic study (as I didn’t, in a way). It may be that young men, in particular, are drawn to church by a resurgent conservative Christian nationalism. Or it may simply be that a spiritual consciousness is seen as a self-improvement technique that gets dropped by their thirties. 

But there’s another possibility. Maybe we’re just not teaching theology very well. Maybe, perish the thought, we’re making it boring. Perhaps it’s like wanting to make music as a child and being sat down in front of a blackboard to be taught theory, bars and crotchets and whatnot. 

Maybe the young are interested in the subject but not in the Church Fathers, scriptural hermeneutics and ancient Greek. They may be intensely interested in whence our western ethics, morality and culture derive, but then they have history and philosophy for that. Theology is dying on its feet even as the young are wishing to make it live again. 

I have a proposed solution and it’s this: Contextual Theology. This is a school that examines the meaning of religious faith in its contemporary cultural and social contexts. It values human experience – in other words, revelation – as a valid theological source, and recognises that scripture emerges from its own cultural circumstances and must be viewed with reference to our own.

What Contextual Theology emphatically isn’t is an attempt to make theology “fit” post-modern mores and fads. The fount of divinity is unchangeable – impassible, as theologians say – but we’re invited to interpret it through the prism of the world in which we live. It’s not so much about how theology works in the world as how theology makes the world work.

Contextual Theology is as demanding as it is illuminating. It’s the degree I took, as it happens, when I trained for priesthood in the early Noughties and it never did me any harm (Discuss). 

This isn’t a replacement theory for classical theology. We need to understand it in its ancient context to re-interpret it in our technocratic political climate. The curiosity of our young generation seems to suggest that’s an appealing prospect. 

Yet search for a Contextual Theology degree course and you search in vain. They’re only available on courses for ministerial training (like mine), validated by a university rather than taught by it. That means you can only really study Contextual Theology if you’re training for ministry. Which takes me back to that summer of 1972. 

As Graham Tomlin, of this parish, asserts, we need theologians as well as ministers if the quiet revival among the young is to be properly supported. But we need theologians of all sorts, classical and contextual. 

I like to think of the theologian who, struck by lightning, arrives at the Pearly Gates. “But I taught that God sends famine and floods on all those who sin,” complains the theologian. “Ah,” replies St Peter, “but I think you took him out of context.” 
