
Why Christmas Day is Christmas Day 

The seasons, festivals, historians and emperors have all influenced the date of Christmas.

Ryan Gilfeather explores social issues through the lens of philosophy, theology, and history. He is a Research Associate at the Joseph Centre for Dignified Work.

A hand holds out a small wrapped Christmas present.

I came of age in the 2000s, a decade quite alien to us now. We saw ourselves as pioneers of technology, as the internet emerged in its prehistoric form. There was great optimism about the economy until it all went wrong in 2008. The New Atheism movement was roaring into public view, only to wane just as quickly the decade after. Growing up as a Christian, I remember spirited debates with my peers about whether science disproved Christianity and whether God could be disproven. These questions have fallen out of view, as have many of the movement's main proponents. Richard Dawkins rarely darkens the door of our TV screens anymore.

However, one such moment of conflict sticks out in my mind. A friend announced to me that he had disproved the origins of Christianity. The night before, he had discovered that in the third-century Roman Empire, before Christianity became a legal and then the official religion, there was already a festival celebrating a god on the 25th of December. Instead of the birth of the Son of God, the Romans celebrated the rebirth of the Unconquered Sun — Sol Invictus. Then, in 336, under the first Christian emperor, Constantine, that date was first celebrated as the day of the birth of Christ. My friend considered the case closed; surely the birth of Christ was merely a repurposing of an existing festival?

Thankfully this shocking revelation did not chip away at the foundations of my faith. I continued, and still continue, to believe in and love the Christian God, as revealed in the Bible. Indeed, since that moment I have trained as a scholar of the history of the early Church, and have begun to see this question for what it is: a quirk of history.


The first claim that Christ was born on the 25th December appears in the third century. Sextus Julius Africanus, a Roman Christian historian, wrote an entire chronology of the world from creation to AD 221. He considered March 25th to be the date of creation, because it was the spring equinox in the Roman calendar, a day which represents new life and new birth. For this reason, he likewise considered it to be the date of Christ's conception in the womb. Crucially, December 25th falls nine months after that date. Although I admire his logic, it is hardly a sound basis for establishing the date of our Lord's birth. Indeed, other Christians didn't accept this claim at the time either.

As already mentioned, December 25th was a significant date in the Roman calendar. It was the winter solstice, the shortest day of the year, after which the days begin to lengthen. It also shortly followed the popular Roman festival of the Saturnalia. Since the date was already endowed with significance, it is unsurprising that the Romans began to celebrate the rebirth of Sol Invictus, and the birth of another god, Mithras, on that day.

At this time Christianity was an illegal religion, persecuted in some parts of the Roman Empire. However, in 312 the emperor Constantine converted to Christianity, and in 313 he made it a legally tolerated religion. From that point he began to invest the church and Christians with powers, wealth and privileges. Evidence from the Chronography of AD 354 suggests that Christmas was first celebrated on the 25th December in 336, during the reign of Constantine. Perhaps this was an attempt to dislodge an existing pagan holiday and replace it with a Christian one. Or maybe the significance of the winter solstice made that date most plausible. Indeed, it is easy to see how the commemoration of Christ coming into the world is particularly salient as the darkness begins to recede. The true answer is, of course, lost to history.

We, therefore, celebrate Christ's birth on the 25th December on account of a quirk of history, a result of the way that Romans mapped significant events on to the waxing and waning of the light. The true lesson here, though, is that it simply doesn't matter what the actual date of Christ's birth was. Our records, and those available to the early church, were simply not good enough for us ever to know. What matters is that God loves us so much that he became human to bring us back to his side in everlasting joy and peace. We have no idea on what date Christ was born. But each year the 25th December presents a time for us to remember that God became man, so that we might have everlasting life.


Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won't save us from post-truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

In the style of a Raeburn portrait, a group of young people lounge around on their phones, looking diffident.
Enlightened disagreement (with apologies to Henry Raeburn).
Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking—such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalized. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns.

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of navigation of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognizing a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilizing mission intended to improve the world. 

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the south Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths. 

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me that it is essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task—yet we need to find a way of doing this if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.”
