
Tech has changed: it’s no longer natural or neutral

The first in a three-part series exploring the implications of technology.

James is Canon Missioner at Blackburn Cathedral. He researches technology and theology at Oxford University.

A caveman holding a hammer looks at a bench on which sit a broken bicycle and a laptop. Nick Jones/Midjourney.ai.

My son was born in February last year, and it seems that every day he develops new skills, facial expressions, and adorable quirks. Just the other day he was playing with some wooden blocks, and when they inevitably fell over, he let out the most adorable giggle. As you can guess, I immediately reached for my phone so that I could capture the moment. Moments like this happen all the time in the life of a modern parent: we want to share them with a spouse, family, and friends, or simply keep them for ourselves because they are something we treasure. And yet, in this series of articles, I would like to consider this moment, and the thousands like it that take place in a technological society, and ask: is everything as benign as it seems?

There are two ideas that often come up whenever people talk about technology. The first is that technology is basically ‘neutral’: it only becomes good or bad depending on what you do with it. “Look at a hammer,” someone might say. “There is nothing intrinsically good or bad about it; the end result is good or bad depending on whether I use it to hit nails or people!” On this reading of technology, the only important questions relate to the consequences of use.

If technology is neutral, then the primary concern for users, legislators and technologists is the consequences of technology, not the technology itself. The only way to ensure that a technology is used for good is to ensure, somehow, that more good people use it for good things than bad people use it for bad things. Often this idea presents itself as a conversation about competing freedoms: very few people (with some important exceptions, see this article from Ezra Klein) are debating whether there is something intrinsically problematic about the app formerly known as Twitter; most discussion revolves around how to maintain the freedom of good users while curtailing the freedom of bad users.

We assume that tools of social interaction like Facebook and Instagram are, in and of themselves, perfectly benign. We are encouraged to think this by massive corporations with a vested interest in maintaining our use of their platforms, and at first glance they seem completely harmless: what could possibly be the problem with a website on which grandma can share photos of her cat? And while the dark underbelly of these platforms has violent real-world consequences – like the rise of antisemitism and anti-Muslim hatred – the solution is primarily imagined as a matter of dealing with ‘bad actors’ rather than anything intrinsically problematic with the platforms themselves.


The second idea is related but somewhat different: advocates of modern technology suggest that humanity has been using technology for as long as there have been humans, and that modern technology is therefore nothing to worry about. “Yes, modern technology looks scary,” someone might say, “but it’s really nothing to worry about; humans have been using tools since the Stone Age, don’t you know!” This view proposes that because hammers are technology, and all technology is the same, there is no difference between a hammer and the internet, or between the internet and a cyborg.

This second idea tends to be accompanied by an emphasis on the slow and steady evolution of technology, and by highlighting the fact that every major technological advance has had naysayers decrying the latest innovation. (Even Plato was suspicious of writing.) Taken as part of a very long view of human history, the technological innovations of the last 100 years seem a normal and natural part of the evolution of a species which has always set itself apart from the rest of the animal kingdom by its use of technology.

Steve Jobs gave a good example of this in an interview about the development of the PC:

“I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condor used the least energy to move a kilometer. And humans came in with a rather unimpressive showing, about a third of the way down the list… not too proud of a showing for the crown of creation… But then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a human on a bicycle blew the condor away – completely off the top of the charts.

“And that’s what a computer is to me… It’s the most remarkable tool we’ve ever come up with… It’s the equivalent of a bicycle for our minds.”

Notice that Jobs here draws a straight-line comparison between the bicycle and the PC. As far as Jobs is concerned, there is no difference in kind between the two tools: one is more complex than the other, but otherwise they are just technologies that expand human capacity. “A bicycle for our minds” is a fascinating way to describe a computer, because it implies that nothing about our minds will be changed; they’ll just be a little bit faster.

And yet, despite the attempts of thought leaders like Jobs to convince us that modern technology is entirely benign, many of us are left with a nagging suspicion that there is more going on. As a priest in the Church of England, I often have conversations with parishioners and members of the public who are looking for language, or a framework, to describe an instinctive recognition: that something has changed, fairly recently, about the nature of the technology we use and the way it influences our lives; that modern technology is not simply a natural extension of the sorts of tools humans have been using since the Stone Age; and that modern technology is not neutral but has already had an effect on us in significant ways, regardless of how we might use it. How do we respond to articulate and thoughtful people like Steve Jobs, who make a compelling case that modern technology is neutral and natural?


Thinking back to that moment when my son giggles and I take a photo of him: at first glance it seems completely innocuous. But what resources are available if I do want to think more carefully about that moment, and the many like it that suffuse my daily life? Thankfully there is a growing body of literature from philosophers and theologians who are thinking about the impact of modern technology on the human condition. In the next two articles I would like to introduce the work of Martin Heidegger and outline his critique of modern technology, showing how he challenges the idea that technology is simply a natural extension of human capacity, or a neutral tool.

Heidegger is a complex character in philosophy and in Western history. There is no getting around the fact that he was a member of the Nazi Party from 1933, and his politics have been widely, and rightly, condemned. Nevertheless, his account of the nature of modern technology continues to offer useful insights. His claim is that modern technology essentially and inevitably changes our relationship with the world in which we live, and even with ourselves. It is this claim, and Heidegger’s suggested solution, that I will unpack in the next two articles.


The Nicene Creed: a 1,700-year-old game changer

Why we should celebrate the Council of Nicaea today.

Jane Williams is the McDonald Professor in Christian Theology at St Mellitus College.

Constantine and the council: an ink drawing of the Emperor Constantine on a throne, listening to people showing him books. Wikimedia Commons.

There are not many 1,700-year-old documents that are read out loud every week and known by heart by millions of people across the world. The Nicene Creed is one of them. In 2025 it will be 1,700 years since the Council of Nicaea, called by the Emperor Constantine, drew up the first version of the Creed. Next year will be full of conferences planned to interrogate and reassess, but mostly to thank God for, the Nicene Creed.

But many people will be bewildered by this outpouring of Nicaea-mania; ‘bewildered’ being a polite way of saying ‘indifferent’ or even ‘hostile’. Lots of people don’t know the Creed at all, or, if they do, they see it as dogmatic, exclusionary and couched in the arcane language of fourth-century classical philosophy, which seems to have little relevance to the world we live in today. Is it really worth celebrating? Let me suggest some reasons why I think it is.


First of all, 325 marked a moment of huge transition for the Christian faith. For the previous 300 years, since the time of Jesus, Christianity had been spreading surprisingly rapidly, but generally without support from the wealthy or powerful, and suffering regular persecution. But at the beginning of the fourth century, the Emperor Constantine declared himself to be a ‘Christian’. There is a lot of debate about what he meant by that – it didn’t stop him from murdering members of his own family, for example. But Constantine ascribed his victorious Imperial campaign to the protection of the Christian God, and began to offer safety and privilege to Christians and their leaders. It was Constantine who called the Council of Nicaea, wanting to assert his own authority but also wanting this nascent ‘institutional’ Church to get a grip and unite behind him. Suddenly, Christians had a chance to shape the world, to shape culture, from the top down as well as from the bottom up. Whether this is a good thing or a bad one, and what it did and does to the character of Christian faith in the 1,700 years since Nicaea, is undoubtedly something that 2025 will have to examine.

Secondly, the Council of Nicaea offered a model of decision-making that has been profoundly important in Christian life ever since. Nicaea was deliberately chosen as the place to hold this council because it sat roughly on the dividing line between the Eastern part of the Roman Empire, where Greek was the lingua franca, and the Western part, where Latin was the language of public discourse. Constantine was seeking to establish himself as sole emperor over both parts, and he called together at Nicaea Christian leaders from across the Empire. We have a good idea of who was there because of the signatories to the resolutions of the Council. 

Leaders came from some of the most sophisticated, wealthy and educated parts of the Roman Empire, like Alexandria with its famous school and library. But they also came from some of the simplest parts, where peasant life was the norm for both the bishop and his congregation. St Spiridion, now the patron saint of Corfu, was one of the signatories; he maintained his hard life as a shepherd while leading his human flock. St Nicholas of Myra, whom we now know as Santa Claus, was there too. Altogether, probably 200 to 300 bishops attended, highlighting the extraordinary spread of Christian faith across the Roman Empire. That is why the Council of Nicaea is called the First Ecumenical, or world-wide, Council. This was the first opportunity for the Church to take stock of itself and to notice and learn from its diversity.


This model of ‘conciliar’ discussion has remained key to the way in which Christians try to resolve conflict and make decisions, by meeting, discussing, praying and hearing from voices and experiences that represent the whole diversity of humanity. No one can pretend that the Council of Nicaea was exactly such a process – no women were part of the consultation, for one thing – but the intention was significant. In our own time of deep disagreement between Christians, a commitment to the Nicene method of consultative decision-making would be a good focus for examination of 1,700 years of trying to listen to each other, even if we often fail. 

Thirdly, and most importantly of all, of course, the Council of Nicaea produced the Nicene Creed, a succinct statement of what Christians affirm about God and the world because of the paradigm-changing life, death, resurrection and ascension of Jesus. The short, clear statements of faith in the Creed were hard-won and not accepted by everyone, then or now. They became necessary as people tried out different descriptions of who Jesus is in relation to God, which brought out more and more clearly how fundamental this question is for our understanding of God, and so for our understanding of our own purpose and destiny. Some suggested that Jesus was just an exceptionally gifted human being, favoured by God. But the world has been full of great prophets, most of whom receive lip-service at best and make no actual difference. Others proposed that Jesus was God wearing a disguise, not really human at all, which suggests that God can’t really commit to the created order. The most popular suggestion in the fourth century, put forward by a learned teacher called Arius, was that Jesus is something in between: not the eternal God, but not just a human being either. But that is the worst of all worlds: we can’t trust what Jesus shows us either about God or about human beings.

All of these ‘solutions’ protected God’s transcendence and otherness – God is above and beyond created existence and divinity cannot or will not sully itself with the earthly, historical lives that human beings live.  

The radical suggestion of the Nicene Creed, trying to be faithful to the witness of the Bible, is that Jesus is really God, living among us, but also really a human being, born into a particular time and place in history and dying a real, historical death. And that must mean that the Almighty God doesn’t think it compromises God’s power and majesty to come and share our lives. Imagine the dignity that gives us and our lives – God loves and honours the world and thinks that a human life is capable of showing us the nature of God. But it also means that the full life-giving power of God is not just ‘outside’ but ‘inside’ the world. 

This is a game-changing concept, both for theology and for anthropology.  


To find out more about the McDonald Agape Nicaea Project at St Mellitus College in London, come and join the public lectures, or look out for other Nicene celebrations in 2025.

Participants will hear from some of the world’s leading scholars on various issues related to Nicaea, including Professor Khaled Anatolios, Dr. Beverly Roberts Gaventa, Professor Ilaria Ramelli, Professor Bruce McCormack, Dr. Willie James Jennings, and many more.  

A significant part of the Nicaea conference in 2025 will be a call for papers, expanding dialogue on the topic and hearing from a wide array of voices.  

For more information, or to register for these events, visit the Nicaea Project website.