
Zombies: a philosopher's guide to the purpose-driven undead

Don’t dismiss zombiecore as lowbrow.

Ryan is the author of A Guidebook to Monsters: Philosophy, Religion, and the Paranormal.

A Regency woman dabs her mouth with a bloody handkerchief.
Lily James in Pride and Prejudice and Zombies.
Lionsgate.

Writing from his new book, A Guidebook to Monsters, Ryan Stark delves into humanity’s fascination with all things monstrous. In the second of a two-part series, he asks what, and where, zombies remind us of, and why they caught the eye of C.S. Lewis and Salvador Dali.

 

On how Frankenstein’s monster came to life nobody knows for sure, but he is more urbane than zombies tend to be. Nor do Jewish golems and Frosty the Snowman count as zombiecore. The latter sings too much, and both are wrongly formulated. Frosty comes from snow, obviously, and the golems—from mere loam, not what the Renaissance playwrights call “gilded loam,” that is, already pre-assembled bodies, which is a zombie requirement. Tolkien’s orcs function likewise as golem-esque monsters, cast from miry clay and then enlivened by the grim magic of Mordor. We do not, for instance, discover scenes with orc children. 

And neither is Pinocchio a zombie, nor Pris from Blade Runner, but dolls, automatons, and C3POs border upon the land of zombies insofar as they all carry a non-human tint. Zombies, however, carry something else as well, a history of personhood, and so in their present form appear as macabre parodies of the human condition writ large. They are gruesome undead doppelgangers, reminding us of who we are not and perhaps—too—of where we are not. Hell is a place prepared for the Devil and his angels, Christ tells us in the book of Matthew. And maybe, subsequently, for zombies. 

Kolchak, in an episode of Kolchak: The Night Stalker aptly titled “The Zombie,” correctly discerns the grim scenario at hand: “He, sir, is from Hell itself!”  

C.S. Lewis pursues a similar line of thinking in The Problem of Pain: “You will remember that in the parable, the saved go to a place prepared for them, while the damned go to a place never made for men at all. To enter Heaven is to become more human than you ever succeeded in being on earth; to enter Hell is to be banished from humanity. What is cast (or casts itself) into Hell is not a man: it is ‘remains.’” Lewis makes an intriguing point, which has as its crescendo the now-famous line about the doors of Hell: “I willingly believe that the damned are, in one sense, successful, rebels to the end; that the doors of Hell are locked on the inside by zombies.” I added that last part about zombies. 

Not everyone believes in Hell, of course, yet most concede that some people behave worse than others, which also helps our cause. Indeed, part of zombiecore’s wisdom is to show that bad people often produce more horror than the zombies themselves. Such is the character of Murder Legendre, a case in point from the film White Zombie. Not fortunate in name, Mr. Murder runs a dark satanic mill populated by hordes of zombie workers, which is the film’s heavy-handed critique of sociopathic industrialization. The truth to be gleaned, here, is that zombies did not invent the multinational corporation; rather, they fell prey to it.

We might think, too, of Herman Melville’s dehumanized characters from Bartleby the Scrivener: Nippers, Turkey, Ginger Nut, and the other functionaries whose nicknames themselves indicate the functions. From an economic standpoint, their value becomes a matter of utility, not essence, which is Melville’s reproach of the despairingly corporate drive to objectify personhood—of which zombies are an example beyond the pale. They might as well be fleshy mannequins, in fact, and as such provide the perfect foil for the human being properly conceived. 

Here, then, is why we do not blame zombies for eating brains, nor do we hold them accountable for wearing white pants after Labor Day, as some inevitably do. They cannot help it—in ethics and in fashion. Perhaps especially in fashion. The best we can hope for in the realm of zombie couture is Solomon Grundy, the quasi-zombie supervillain who holds up his frayed pants with a frayed rope, a fashion victory to be sure, however small it might be, though “zombie fashion” is a misnomer in the final analysis. They wear clothes, but not for the same reasons we do. 

The point holds true for Salvador Dali’s zombies as well, most of whom find themselves in nice dresses. I make this point—in part—to correct those in the cognoscenti who dismiss zombies as a subject too lowbrow for serious consideration. Not so. Exhibit A: the avant-garde Dali, darling of the highbrow, or at least still of the middlebrow, now that his paintings appear on t-shirts and coffee mugs. The Burning Giraffe. Mirage. Woman with a Head of Roses. All zombies, too ramshackle and emaciated to live, never mind the missing head on the last one, and yet there they are, posed for the leering eye, not unlike those heroin-chic supermodels from Vogue magazine in the late 1990s. Necrophilia never looked so stylish.

But never let it be said that zombies are lazy. They are tired, to be sure. Their ragged countenances tell us this, but they are not indolent. Zombies live purpose-driven undead lives. They want to eat brains, or any human flesh, depending on the mythos, and their calendars are organized accordingly. No naps. No swimming lessons. Just brains.  

But we quickly discern that no amount of flesh will satisfy. There is always one more hapless minimart clerk to ambush, one more sorority girl in bunny slippers to chase down the corridor. In this way, the zombie’s gloomy predicament bears a striking resemblance to that of the Danaids in the classical underworld, those sisters condemned to fill a sieve with water for all eternity, an emblem of the perverse appetite unchecked, which has at its core the irony of insatiable hunger. And as the pleasure becomes less and less, the craving becomes more and more. The law of diminishing returns. So it is with all vices. The love of money demands more money, and the love of brains, more brains.

And so, in conclusion, a prayer. God bless the obsessive-compulsive internet shoppers, the warehouse workers on unnecessarily tight schedules, and the machine-like managers of the big data algorithms. God bless the students who sedate themselves in order to survive their own educations, taking standardized test after standardized test. And God bless the Emily Griersons of the world, who keep their petrified-boyfriend corpses near them in the bedroom, an emblem of what happens when one tries too mightily to hold on to the past. And God help us, too, when we see in our own reflections a zombie-like affectation, the abyss who stares back at us and falsely claims that we are not the righteousness of God, as Paul says we are in 2 Corinthians. And, finally, Godspeed to Gussie Fink-Nottle from the P.G. Wodehouse sagas: “Many an experienced undertaker would have been deceived by his appearance, and started embalming on sight.”  

  

From A Guidebook to Monsters, Ryan J. Stark. Used by permission of Wipf and Stock Publishers.


Why I teach over my students’ heads

Successful teaching is a work of empathy that stretches the mind.
A blackboard covered in chalk writing and highlights.
James's chalkboard.

I’ve been teaching college students for almost 30 years now. As much as I grumble during grading season, it is a pretty incredible way to make a living. I remain grateful. 

I am not the most creative pedagogue. My preference is still chalk, but I can live with a whiteboard (multiple colors of chalk or markers are a must). Over the course of 100 minutes, various worlds emerge that I couldn’t have anticipated before I walked into class that morning. (I take photos of what emerges so I can remember how to examine the students later.) I think there is something important about students seeing ideas—and their connections—unfold in “real time,” so to speak.  

I’ve never created a PowerPoint slide for a class. I put few things on Moodle, and only because my university requires it. I’ve heard of people who use “clickers” in class, and I have no idea what they mean. I find myself skeptical whenever administrators talk about “high impact” teaching practices (listening to lectures produced the likes of Hegel and Hannah Arendt; what have our bright shiny pedagogical tricks produced?). I am old and curmudgeonly about such “progress.”

But I care deeply about teaching and learning. I still get butterflies before every single class. I think (hope!) that’s because I have a sense of what’s at stake in this vocation.  

I am probably most myself in a classroom. As much as I love research, and imagine myself a writer, the exploratory work of teaching is a crucial laboratory for both. I love making ideas come alive for students—especially when students are awakened by such reflection and grappling with challenging texts. You see the gears grinding. You see the brow furrowing. Every once in a while, you sense the reticence and resistance to an insight that unsettles prior biases or assumptions; but the resistance is a sign of getting it. And then you see the light dawn. I’m a sucker for that spectacle.  

Successful teaching is, fundamentally, a work of empathy. As a teacher, you have to try to remember your way back into not knowing what you now take for granted. You have to re-enter a student’s puzzlement, or even apathy, to try to catalyze questions and curiosity. Because I teach philosophy, my aim is nothing less than existential engagement. I’m not trying to teach them how to write code or design a bridge; I’m trying to get them to envision a different way to live. But, for me, it’s impossible to separate the philosophical project from the history of philosophy: to do philosophy is to join the long conversation that is the history of philosophy. So we are always wrestling with challenging, unfamiliar texts that arrive from other times that might as well be other planets for students in the twenty-first century.

So successful teaching requires a beginner’s mindset on the part of the teacher, a charitable capacity to remember what ignorance (in the technical sense) feels like. To do so without condescension is absolutely crucial if teaching is going to be an art of invitation rather than an act of alienation. (The latter, I fear, is more common than we might guess.) 

Such empathy means meeting students where they are. But successful teaching is also about stretching students’ minds and imaginations into new territory and unfamiliar habits of mind. This is where I find myself especially skeptical of pedagogical developments that, to my eyes, run the risk of infantilizing college students. (I remember a workshop in which a “pedagogical expert” explained that the short attention span of students required changing the PowerPoint slide every 8 seconds. This does not sound like a recipe for making students more human, I confess.) 

That’s why I am unapologetic about trying to teach over my students’ heads. I don’t mean, of course, that I’m satisfied with spouting lectures that elude their comprehension. That would violate the fundamental rule of empathy. But such empathy—meeting students where they are—is not mutually exclusive with also inviting them into intellectual worlds and conversations where they won’t comprehend everything.  

This is how the hunger sets in. If you can invite a student to care about the questions, to grasp their import, and experience the unique joy of joining the conversation that is philosophy, then part of the thrill, I think, is being admitted into a world where you don’t “get” everything.  

When I’m teaching, I think of this in a couple of ways. At the same time that I am trying to make core ideas and concepts accessible and understandable, I don’t regret talking about attendant ideas and concepts that will, to this point, still elude students. For the sharpest students, this registers as something to learn, something to be curious about. Or sometimes when we’re focused on, say, Pascal or Hegel, I’ll plant little verbal footnotes—tiny digressions about how Hannah Arendt engaged their work in the 20th century, or how O.K. Bouwsma’s reading of Anselm is akin to something we’re talking about. The vast majority of students won’t be familiar with either, but it’s another indicator of how big and rich and complicated the intellectual cosmos of philosophy is. For some of these students (not all, certainly), this becomes tantalizing: they want to become the kind of people for whom a vast constellation of ideas and thinkers is as familiar and present as their friends and cousins. This becomes a hunger to belong to such a world, to join such a conversation.

This gambit—every once in a while, talking about ideas and thinkers as if students should know them—is, I maintain, still an act of empathy. To both meet students where they are and, at the same time, teach “over their heads,” is an invitation to stretch into new terrain and thereby swell the soul into the fullness for which it was made. The things that skitter just over their heads won’t be on the exam, of course; but I’m hoping they’ll chase some of them for a lifetime to come. 

  

This article was originally published on James K.A. Smith’s Substack, Quid Amo.