
Filterworld: algorithmic anxiety is flattening our culture

The rule of vanilla lets our unfeeling gadgets decide what’s best for us.

Simon is Bishop of Tonbridge in the Diocese of Rochester. He writes regularly on social, cultural and political issues.

Image: a podcast guest speaks in front of a mic. What's next on the playlist? (Sebastian Pandelache on Unsplash.)

Here’s another diagnosis to add to the modern malaise: algorithmic anxiety. It’s described by Kyle Chayka in his excellent book Filterworld (Heligo Books, 2024) as the:

 …awareness that we must constantly contend with automated technological processes beyond our understanding and control, whether in our Facebook feeds, Google Maps driving directions, or Amazon product promotions. 

We don’t understand algorithms. Even if we did, we wouldn’t know how they actually work on us, as every tech company keeps its algorithms secret, lest competitors learn from them. This has led to the algorithm becoming the century’s newest bogeyman, a phantom we can reference in conversation to make ourselves sound tech-savvy and culturally knowing even while we remain in the dark.


One of the oddest outcomes of the ascendancy of the algorithm is its seemingly diametric effects on politics and culture. In politics it has polarised people, sorting us into opposing camps and then ensuring we hear only good things about our ‘side’ and only maddening things about the ‘opposing’ side. Instead of calmly listening to a different view, we hurl insults, as performative as Prime Minister’s Question Time and about as enlightening.

Something different is happening with culture. Here, the algorithm makes culture more homogeneous; in the words of Kyle Chayka, it is ‘flattened’. The basic rule of what he calls Filterworld is that ‘the popular becomes more popular, and the obscure becomes even less visible’. It is a strange remix of Jesus for the digital age: ‘to all those who have, more will be given…but from those who have nothing, even what they have will be taken away.’

The life of an Instagram post is said to be determined in the first five minutes. If it gets engagement, it can be sure of more; if it gets none, it will sink. Visibility on social media is vital for artists of all kinds, because this is where all publicity begins. Artists try to game the system, figuring out what kind of content the algorithm will promote. In the process, their creative expression is subtly compromised. People begin to write in a style that gets attention, and what gets attention is decided by the algorithm. Those who tweet will know how the short, pared-back medium starts to influence their life away from X. Musicians know that art which is safe and mainstream – the public’s crowded middle, where performers like Ed Sheeran have thrived – is likely to succeed.

‘Much of culture now has the hollow, vacant feeling of having been made by algorithm’ according to the cultural commentator Dean Kissick.  Chayka observes that: ‘algorithmic has become a byword for anything that feels too slick, too reductive, or too optimised for attracting attention’.   


There is a valid counter to this development. Previously, what we read, heard and saw as cultural consumers was determined by a small set of experts who filtered content for us. These experts were often drawn from a narrow section of society and inevitably brought their own biases to bear. While this may be true, it is hardly a triumph for the public to have an unfeeling gadget decide what’s best for us, based on what we have liked before and what seems to appeal to most people. At the ice cream vendor, this is like reaching for vanilla every time.

The truth is, in necessarily surrendering to the algorithm (for what alternative is there online?), we miss huge volumes of culture that might appeal to us. It is about as effective as deciding which sea life we like based only on what pops up to the surface of the water.

The best art is not always the most popular, and there is a risk that the divine spark of invention that the creator God has placed within each of us – the unlimited potential of being made in the image of God – will not be fanned into flame as often as it could be. Chasing likes is no substitute for patient inspiration. It is often at the margins that breakthroughs emerge; art that makes us see this world in a new and divine light.

‘Behold, I am making all things new,’ says the one who sits on the throne in Revelation. That algorithms are making all things similar is the reality we are learning to live with.


Tech has changed: it’s no longer natural or neutral

The first in a three-part series exploring the implications of technology.

James is Canon Missioner at Blackburn Cathedral. He researches technology and theology at Oxford University.

Image: a caveman holding a hammer looks at a bench on which sit a broken bicycle and a laptop. (Nick Jones/Midjourney.ai.)

My son was born in February last year and it seems that every day he is developing new skills, facial expressions, and adorable quirks. Just the other day he was playing with some wooden blocks and, when they inevitably fell over, he let out the most adorable giggle. As you can guess, I immediately reached for my phone so that I could capture the moment. Moments like this happen all the time in the life of a modern parent: we want to share the moment with our spouse, family, and friends, or just capture it for ourselves because it’s something we treasure. And yet, in this series of articles I would like to consider this moment, and the thousands like it that take place in a technological society, and ask: is everything as benign as it seems?

There are two ideas that often come up whenever people talk about technology. The first is that technology is basically ‘neutral’: it only becomes good or bad depending on what you are doing with it. “Look at a hammer,” someone might say, “there is nothing intrinsically good or bad about this hammer; only the end result is good or bad, depending on whether I’m using it to hit nails or people!” On this reading of technology, the only important questions relate to the consequences of use.

If technology is neutral, then the primary concern for users, legislators and technologists is the consequences of technology, and not the technology itself. The only way to ensure that the technology is used for good is to ensure, somehow, that more good people use it for good things than bad people use it for bad things. Often this idea presents itself as a conversation about competing freedoms: very few people (with some important exceptions, see this article from Ezra Klein) are debating whether there is something intrinsically problematic about the app formerly known as Twitter; most discussion revolves around how to maintain the freedom of good users while curtailing the freedom of bad users.

We assume that these tools of social interaction, like Facebook and Instagram, are, in and of themselves, perfectly benign. We are encouraged to think this by massive corporations with a vested interest in maintaining our use of their platforms, and at first glance they seem completely harmless: what could possibly be the problem with a website on which grandma can share photos of her cat? And while the dark underbelly of these platforms has violent real-world consequences – like the rise of antisemitism and anti-Muslim hatred – the solution is primarily imagined as a matter of dealing with ‘bad actors’ rather than anything intrinsically problematic with the platforms themselves.


The second idea is related but somewhat different: advocates of modern technology suggest that humanity has been using technology for as long as there have been humans, and that all this modern technology is therefore nothing to worry about. “Yes, modern technology looks scary,” someone might say, “but it’s really nothing to worry about; humans have been using tools since the Stone Age, don’t you know!” This view proposes that because hammers are technology, and all technology is the same, there is no difference between a hammer and the internet, or between the internet and a cyborg.

This second idea tends to be accompanied by an emphasis on the slow and steady evolution of technology, and by the observation that at every major technological advance there have been naysayers decrying the latest innovation. (Even Plato was suspicious of writing when it was invented.) Taken as part of a very long view of human history, the technological innovations of the last 100 years seem a normal and natural part of the evolution of a species which has always set itself apart from the rest of the animal kingdom by its use of technology.

Steve Jobs gave a good example of this in an interview about the development of the PC:

“I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condors used the least energy to move a kilometer. And humans came in with a rather unimpressive showing about a third of the way down the list… not too proud of a showing for the crown of creation… But then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a human on a bicycle blew the condor away – completely off the top of the charts. 

And that’s what a computer is to me… It’s the most remarkable tool we’ve ever come up with… It’s the equivalent of a bicycle for our minds”  

Notice that Jobs draws a straight-line comparison between the bicycle and the PC. As far as Jobs is concerned, there is no difference in kind between the two tools: one is more complex than the other, but otherwise they are just technologies that expand human capacity. “A bicycle for our minds” is a fascinating way to describe a computer because it implies that nothing about our minds will be changed; they’ll just be a little bit faster.

And yet, despite the attempts of thought leaders like Jobs to convince us that modern technology is entirely benign, many of us are left with a natural suspicion that there is more going on. As a priest in the Church of England, I often have conversations with parishioners and members of the public who are looking for language, or a framework, to describe an instinctive recognition: that something has changed, fairly recently, about the nature of the technology we use and the way it influences our lives; that modern technology is not simply the natural extension of the sorts of tools humans have been using since the Stone Age; and that it is not neutral but has already had an effect on us in significant ways, regardless of how we might use it. How do we respond to articulate and thoughtful people such as Steve Jobs who make a compelling case that modern technology is neutral and natural?


Thinking back to that moment with my son, when he giggles and I take a photo of him: at first glance it seems completely innocuous. But what resources are available if I did want to think more carefully about that moment, and the many like it which suffuse my daily life? Thankfully, there is a growing body of literature from philosophers and theologians who are thinking about the impact of modern technology on the human condition. In the next two articles I would like to introduce the work of Martin Heidegger and outline his criticism of modern technology, showing how he challenges the idea that technology is simply a natural extension of human capacity or a neutral tool.

Heidegger is a complex character in philosophy and in Western history. There is no getting around the fact that he was a supporter of the Nazi Party during the Second World War. His politics have been widely condemned, and rightly so; nevertheless, his analysis of modern technology continues to provide useful insights. His claim is that modern technology essentially and inevitably changes our relationship with the world in which we live, and even with ourselves. It is this claim, and Heidegger’s suggested solution, that I will unpack in the next two articles.