
Zombies: a philosopher's guide to the purpose-driven undead

Don’t dismiss zombiecore as lowbrow.

Ryan is the author of A Guidebook to Monsters: Philosophy, Religion, and the Paranormal.

A Regency woman dabs her mouth with a bloody handkerchief.
Lily James in Pride and Prejudice and Zombies.
Lionsgate.

Writing from his new book, A Guidebook to Monsters, Ryan Stark delves into humanity’s fascination with all things monstrous. In the second of a two-part series, he asks what, and where, zombies remind us of, and why they caught the eyes of C.S. Lewis and Salvador Dali.

 

Nobody knows for sure how Frankenstein’s monster came to life, but he is more urbane than zombies tend to be. Nor do Jewish golems and Frosty the Snowman count as zombiecore. The latter sings too much, and both are wrongly formulated. Frosty comes from snow, obviously, and the golems—from mere loam, not what the Renaissance playwrights call “gilded loam,” that is, already pre-assembled bodies, which is a zombie requirement. Tolkien’s orcs function likewise as golem-esque monsters, cast from miry clay and then enlivened by the grim magic of Mordor. We do not, for instance, discover scenes with orc children.

And neither is Pinocchio a zombie, nor Pris from Blade Runner, but dolls, automatons, and C3POs border upon the land of zombies insofar as they all carry a non-human tint. Zombies, however, carry something else as well, a history of personhood, and so in their present form appear as macabre parodies of the human condition writ large. They are gruesome undead doppelgangers, reminding us of who we are not and perhaps—too—of where we are not. Hell is a place prepared for the Devil and his angels, Christ tells us in the book of Matthew. And maybe, subsequently, for zombies. 

Kolchak, in an episode of Kolchak: The Night Stalker aptly titled “The Zombie,” correctly discerns the grim scenario at hand: “He, sir, is from Hell itself!”  

C.S. Lewis pursues a similar line of thinking in The Problem of Pain: “You will remember that in the parable, the saved go to a place prepared for them, while the damned go to a place never made for men at all. To enter Heaven is to become more human than you ever succeeded in being on earth; to enter Hell is to be banished from humanity. What is cast (or casts itself) into Hell is not a man: it is ‘remains.’” Lewis makes an intriguing point, which has as its crescendo the now-famous line about the doors of Hell: “I willingly believe that the damned are, in one sense, successful, rebels to the end; that the doors of Hell are locked on the inside by zombies.” I added that last part about zombies. 


Not everyone believes in Hell, of course, yet most concede that some people behave worse than others, which also helps our cause. Indeed, part of zombiecore’s wisdom is to show that bad people often produce more horror than the zombies themselves. Such is the character Murder Legendre, a case in point from the film White Zombie. Not fortunate in name, Legendre runs a dark satanic mill populated by hordes of zombie workers, which is the film’s heavy-handed critique of sociopathic industrialization. The truth to be gleaned, here, is that zombies did not invent the multinational corporation; rather, they fell prey to it.

We might think, too, of Herman Melville’s dehumanized characters from Bartleby the Scrivener: Nippers, Turkey, Ginger Nut, and the other functionaries whose nicknames themselves indicate the functions. From an economic standpoint, their value becomes a matter of utility, not essence, which is Melville’s reproach of the despairingly corporate drive to objectify personhood—of which zombies are an example beyond the pale. They might as well be fleshy mannequins, in fact, and as such provide the perfect foil for the human being properly conceived. 

Here, then, is why we do not blame zombies for eating brains, nor do we hold them accountable for wearing white pants after Labor Day, as some inevitably do. They cannot help it—in ethics and in fashion. Perhaps especially in fashion. The best we can hope for in the realm of zombie couture is Solomon Grundy, the quasi-zombie supervillain who holds up his frayed pants with a frayed rope, a fashion victory to be sure, however small it might be, though “zombie fashion” is a misnomer in the final analysis. They wear clothes, but not for the same reasons we do. 

The point holds true for Salvador Dali’s zombies as well, most of whom find themselves in nice dresses. I make this point—in part—to correct those in the cognoscenti who dismiss zombies as a subject too lowbrow for serious consideration. Not so. Exhibit A: the avant-garde Dali, darling of the highbrow, or at least still of the middlebrow, now that his paintings appear on t-shirts and coffee mugs. Burning Giraffe. Mirage. Woman with Head of Roses. All zombies, too ramshackle and emaciated to live, never mind the missing head on the last one, and yet there they are posed for the leering eye, not unlike those heroin-chic supermodels from Vogue magazine in the late 1990s. Necrophilia never looked so stylish.


But never let it be said that zombies are lazy. They are tired, to be sure. Their ragged countenances tell us this, but they are not indolent. Zombies live purpose-driven undead lives. They want to eat brains, or any human flesh, depending on the mythos, and their calendars are organized accordingly. No naps. No swimming lessons. Just brains.  

But we quickly discern that no amount of flesh will satisfy. There is always one more hapless minimart clerk to ambush, one more sorority girl in bunny slippers to chase down the corridor. In this way, the zombie’s gloomy predicament bears a striking resemblance to that of the Danaids in the classical underworld, those sisters condemned to fill a sieve with water for all eternity, an emblem of the perverse appetite unchecked, which has at its core the irony of insatiable hunger. And as the pleasure becomes less and less, the craving becomes more and more. The law of diminishing returns. So it is with all vices. The love of money demands more money, and the love of brains, more brains.

And so, in conclusion, a prayer. God bless the obsessive-compulsive internet shoppers, the warehouse workers on unnecessarily tight schedules, and the machine-like managers of the big data algorithms. God bless the students who sedate themselves in order to survive their own educations, taking standardized test after standardized test. And God bless the Emily Griersons of the world, who keep their petrified-boyfriend corpses near them in the bedroom, an emblem of what happens when one tries too mightily to hold on to the past. And God help us, too, when we see in our own reflections a zombie-like affectation, the abyss who stares back at us and falsely claims that we are not the righteousness of God, as Paul says we are in 2 Corinthians. And, finally, Godspeed to Gussie Fink-Nottle from the P.G. Wodehouse sagas: “Many an experienced undertaker would have been deceived by his appearance, and started embalming on sight.”  

  

From A Guidebook to Monsters, Ryan J. Stark.  Used by permission of Wipf and Stock Publishers.   


Tech has changed: it’s no longer natural or neutral

The first in a three-part series exploring the implications of technology.

James is Canon Missioner at Blackburn Cathedral. He researches technology and theology at Oxford University.

A caveman holding a hammer looks at a bench on which are a broken bicycle and a laptop.
Nick Jones/Midjourney.ai.

My son was born in February last year, and it seems that every day he is developing new skills, facial expressions, and adorable quirks. Just the other day he was playing with some wooden blocks, and when they inevitably fell over, he let out the most adorable giggle. As you can guess, I immediately reached for my phone so that I could capture the moment. Moments like this happen all the time in the life of a modern parent: we want to share with our spouse, family, and friends, or just capture the moment for ourselves because it’s something we treasure. And yet, in this series of articles I would like to consider this moment, and the thousands like it that take place in a technological society, and ask: is everything as benign as it seems?

There are two ideas that often come up whenever people talk about technology. The first is that technology is basically ‘neutral’, that technology only becomes good or bad depending on what you are doing with it. “Look at a hammer,” someone might say, “there is nothing intrinsically good or bad about this hammer, only the end result is good or bad depending on whether I’m using it to hit nails or people!” On this reading of technology, the only important questions relate to the consequences of use.  

If technology is neutral, then the primary concern for users, legislators and technologists is the consequences of technology, and not the technology itself. The only way to ensure that the technology is used for good is to ensure, somehow, that more good people will use the technology for good things than bad people using it for bad things. Often this idea will present itself as a conversation about competing freedoms: very few people (with some important exceptions, see this article from Ezra Klein) are debating whether there is something intrinsically problematic about the app formerly known as Twitter, most discussion revolves around how to maintain the freedom of good users while curtailing the freedom of bad users. 

We assume that these tools of social interaction like Facebook and Instagram are, in and of themselves, perfectly benign. We are encouraged to think this by massive corporations who have a vested interest in maintaining our use of their platforms, and at first glance, they seem completely harmless: what could possibly be the problem with a website in which grandma can share photos of her cat? And while the dark underbelly of these platforms has violent real-world consequences – like the rise of antisemitism and anti-Muslim hatred – the solution is primarily imagined as a matter of dealing with ‘bad actors’ rather than anything intrinsically problematic with the platforms themselves. 


The second idea is related but somewhat different: advocates of modern technology will suggest that humanity has been using technology ever since there were humans, and that all this modern technology is therefore nothing new. “Yes, modern technology looks scary,” someone might say, “but it’s really nothing to worry about; humans have been using tools since the Stone Age, don’t you know!” This view proposes that because hammers are technology, and all technology is the same, there is no difference between a hammer and the internet, or between the internet and a cyborg.

This second idea tends to be accompanied by an emphasis on the slow and steady evolution of technology, and by pointing out that every major technological advancement has had its naysayers decrying the latest innovation. (Even Plato was suspicious of writing when it was invented.) Taken as part of a very long view of human history, the technological innovations of the last 100 years seem a normal and natural part of the evolution of our species, which has always set itself apart from the rest of the animal kingdom in its use of technology.

Steve Jobs gives a good example of this in an interview about the development of the PC:

“I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condors used the least energy to move a kilometer. And humans came in with a rather unimpressive showing about a third of the way down the list… not too proud of a showing for the crown of creation… But then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a human on a bicycle blew the condor away – completely off the top of the charts. 

And that’s what a computer is to me… It’s the most remarkable tool we’ve ever come up with… It’s the equivalent of a bicycle for our minds”  

Notice that Jobs here draws a straight-line comparison between the bicycle and the PC. As far as Jobs is concerned, there is no difference in kind between the two tools: one is more complex than the other, but otherwise they are just technologies that expand human capacity. “A bicycle for our minds” is a fascinating way to describe a computer because it implies that nothing about our minds will be changed; they’ll just be a little bit faster.

And yet, despite the attempts of thought leaders like Jobs to convince us that modern technology is entirely benign, many of us are left with a nagging suspicion that there is more going on. As a priest in the Church of England, I often have conversations with parishioners and members of the public who are looking for language, or a framework, to describe an instinctive recognition: that something has changed, fairly recently, about the nature of the technology we use and the way it influences our lives; that modern technology is not simply a natural extension of the sorts of tools humans have used since the Stone Age; and that it is not neutral but has already shaped us in significant ways, regardless of how we might use it. How do we respond to articulate and thoughtful people like Steve Jobs, who make a compelling case that modern technology is neutral and natural?


Thinking back to that moment when my son giggles and I take a photo of him, at first glance it seems completely innocuous. But what resources are available if I did want to think more carefully about that moment, and the many like it, which suffuse my daily life? Thankfully, there is a growing body of literature from philosophers and theologians who are thinking about the impact of modern technology on the human condition. In the next two articles I would like to introduce the work of Martin Heidegger and outline his criticism of modern technology, showing how he challenges the idea that technology is simply a natural extension of human capacity or a neutral tool.

Heidegger is a complex character in philosophy and in Western history. There is no getting around the fact that he was a supporter of the Nazi Party during the Second World War. His politics have been widely condemned, and rightly so; nevertheless, his analysis of the nature of modern technology continues to provide useful insights. His claim is that modern technology essentially and inevitably changes our relationship with the world in which we live, and even with ourselves. It is this claim, and Heidegger’s suggested solution, that I will unpack in the next two articles.