
For the knowing of the how: creating at Christmas

Learning a new craft unfolds the layers of meaning Christmas is clothed in.

Andrew works at the intersection of theology, science and philosophy. He is Canon and Regius Professor of Divinity at Christ Church, Oxford.

A white crocheted angel decoration against a dark background.
Kelly Sikkema via Unsplash.

Childhood Christmas was for me a time of craft and productivity, of baking and decorating, of paper chains and printing cards with dissected potatoes. Christmas was all about making, so homemade presents outshone everything else.  

That was fine if you were a painter, knitter, sculptor, seamstress, or woodworker, and each member of my family was at least one of those things. I was the odd one out: at least until the autumn before last, when I took up crochet.  

My inspiration came from John Milbank: theologian, philosopher, political theorist, poet, and general ruffler of feathers. Not, I have to say, because he sets an example with hook and yarn. Rather, he’d written an essay, an essay that spoke to me, as someone often in art galleries but rarely making. We get so invested in fine art, he wrote, that we forget the priority of applied art, of craft and decoration. That’s the foundation. The art we go to see in museums is great only if it succeeds in ‘intensifying this art which is proper to humanity as such.’ So, I took up crochet.

Crochet, as I hoped, is rather like playing the recorder. It’s not too difficult, even at the beginning, but has plenty of scope for complexity and skill. I’m now three blankets in, plus six cushion covers and a hat. Even my first efforts were gratefully received as presents, and some of my recent work is much more intricate, and not half bad.

I’ve finally joined the ‘Christmas is about making’ project: and Christmas really is about making. John Donne put it like this, addressing the Virgin Mary: 

… yea thou art now 
Thy Maker’s maker, and thy Father’s mother; 
Thou hast light in dark, and shutst in little room, 
Immensity cloistered in thy dear womb. 

Mary becomes her ‘Maker’s maker’. In a further twist, which Donne would appreciate, Mary’s child grew up to be a carpenter, or – as the Greek would better be translated – an all-round, general purpose village maker: from hearths to homes, from shelves to structures. 

In thinking about how God took up a human life, writers have often turned to the language of making. In the same poem, Donne has God weaving himself a kind of garment in Mary’s womb: ‘He will wear, / Taken from thence, flesh’. Thomas Pestel (1586–1667) opens an unjustly forgotten Christmas hymn like this: 

Behold, the great Creator makes 
Himself a house of clay, 
a robe of virgin flesh He takes 
which He will wear for aye. 

More familiar still is Charles Wesley’s ‘Hark the Herald Angels Sing’, with its lines: 

 ‘Veiled in flesh the Godhead see, / Hail the incarnate Deity!’  

The language of wearing, of robes and veils, hasn’t always fared well among theologians. I heard of one stern tutor in doctrine who would look round the chapel whenever Wesley’s carol was sung, reserving a sharp word for any student who failed to fall silent at that line. He didn’t like the implication that God was merely draped in humanity, making only an outward show of being human.

Writing in the thirteenth century, Thomas Aquinas saw that worry but argued for charity. The language of clothing isn’t perfect, but we shouldn’t expect it to be. Illustrations gesture towards the truth; they aren’t identical with it, all the more so when we’re talking about God. As long as we don’t expect the clothing image to say all that needs to be said, there’s mileage in it. For one thing, clothing can make someone visible (as the late Queen knew very well): ‘veiled in flesh, the Godhead see’. Moreover, Christ’s humanity was shaped by his divinity, as a garment is shaped by the body of the one who wears it, yet the wearer remains unchanged (and so does the garment), just as God became human without becoming any less divine.

Alongside clothing, Pestel also suggested God working with clay:

‘Behold, the great Creator makes / Himself a house of clay’.

That takes up, and reworks, another textile image. John’s Gospel gets to the heart of the Christmas message with a line so solemn that Christians have been accustomed to drop to their right knee on hearing it read: ‘And the Word was made flesh, and dwelt among us’. That’s how we know it, but a more accurate translation is that the Divine Word ‘pitched his tent among us’. The houses that Pestel knew, however, were made of bricks, not cloth, which is to say of clay, so he adapted the image. Or, just as likely, with that clay he had the ‘house’ of the human body in mind. That would recall lines in Genesis, where God makes Adam out of clay, or ‘the dust of the ground’. In fact, the Hebrew word ‘Adam’ means just that – something like ‘earthling’ – just as ‘human’ is related to the Latin ‘humus’, meaning soil.

Whether weaving and wearing, or building, or sculpting, or potato printing, this is the message to stop us in our tracks at Christmas: that the Maker made himself human. There is something beautiful in greeting that with homemade presents, with printing cards, with decorating and baking, with craft and productivity, with paint and cloth, paper, wood, and yarn, and with that sublime sort of making that is music. As Pestel puts it in the closing verse of ‘Behold, the great Creator makes’:

Join then, all hearts that are not stone, 
and all our voices prove, 
to celebrate this holy One, 
the God of peace and love. 


Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-Truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

In the style of a Raeburn portrait, a set of young people lounge around on their phones, looking diffident.
Enlightened disagreement (with apologies to Henry Raeburn).
Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking—such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being canceled or marginalized. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns. 

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of navigation of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognizing a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilizing mission intended to improve the world. 

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the south Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths. 

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me that it is essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task—yet we need to find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.” 
