
Creator or creature – a centuries-old question of identity

Why does a 1,700-year-old creed still matter?

Frances Young is Emeritus Professor of Theology at the University of Birmingham. 

The Creation, James Tissot. Public domain, via Wikimedia Commons.

2025 will mark the 1,700th anniversary of the Nicene Creed. In October 2024, Prof. Frances Young gave the inaugural lecture of the McDonald Agape Nicaea Project at St Mellitus College.


In the year 325 CE the first ever “ecumenical” (meaning “worldwide”) council of bishops assembled at Nicaea, near Constantinople (now Istanbul). It was summoned by Constantine, the first Roman Emperor to convert to Christianity and patronize the Church. Why does this seventeenth centenary of an obscure discussion around complex words matter to us today?

The outcome of the Council was agreement on the text of a creed, and the banishment of a pesky priest named Arius, whose bishop disapproved of his teaching. Unfortunately, some other bishops remained sympathetic to something like Arius’ viewpoint, and for political reasons Constantine was desperate for Church unity. Argument over the issues went on for half a century, until another Council in 381 CE reaffirmed the position established in 325 CE and agreed the version labelled “the Nicene Creed,” which is still used in Church liturgies across the world today.

The controversy was basically about the identity of the pre-existent Word or Son of God incarnate in Jesus Christ. Nicaea established that the Son was “of one substance” (homoousios) with the Father – in other words, he was fully God in every sense of the word. But for many traditional believers at the time this was difficult to accept. 

The common sense of the culture ran in terms of a “chain of being.” Most people in the Roman Empire were polytheists – there were loads of gods: Mars, god of war, Neptune, god of the sea, and so on. Each city, each ethnic group, had its own god, as did every family, every interest group, every burial society – you name it. But generally there was a sense that above all these stood the Supreme God, worshipped indirectly through the worship of these lower gods. Below them were all sorts of nature spirits, daemons benign and malign, then souls incarnate in human persons, then animals, even vegetables as living entities, and finally inert matter like earth and stones at the bottom of the hierarchy or chain of being.

Jews identified their God with the Supreme God and insisted the one God alone should be worshipped. But they also imagined a heavenly court of archangels and angels, then below that the souls of the righteous, and so on in a somewhat parallel hierarchy. No surprise then that Christians assumed a similar picture: God, then the Son of God, then the Holy Spirit, then archangels and angels, then souls, and so on in a hierarchical ladder. 

But in the second century Christians had argued their way to the idea of “creation out of nothing.” Many non-Jewish thinkers, including some early Christians, followed Plato, conceiving creation as the outcome of Mind (the Demiurge or Craftsman) shaping Matter into whatever Forms or Ideas were in mind. But other Christian thinkers argued that God was not a mere Craftsman who needed stone or wood to work on like a sculptor – God produced the Matter in the first place. This then triggered a full-blown critique: God did not create out of pre-existent Matter or there would be two first principles; God did not create from God’s own self or everything would be divine; so God must have created out of nothing. 

Now try to fit that to the chain of being: where do you draw the line between God the Creator and everything else made out of nothing? This was the issue which surfaced in the so-called Arian controversy. What we might call the “mainstream” remained wedded to the hierarchy, not least because of earlier controversies about God’s monarchia. The word did mean “monarchy” – single sovereignty; but arche could mean “rule” or “beginning,” so monarchia also referred to the single first principle of all that is. It was natural to attribute monarchia to God the Father, a view that worked well enough with the hierarchy. But some had suggested that the one God “changed mode,” as it were, appearing now as Father, now as Son, now as Holy Spirit, taking different roles in the overarching scriptural story. This suggestion was mocked as all too similar to the pagan god Proteus, who in mythology kept changing shape. It is even possible that the key word homoousios had been condemned along with this “Modalist” view.

Traditionalists were suspicious. The first historian of the Church, Eusebius of Caesarea, was present at Nicaea, and wrote a somewhat embarrassed letter to his congregation explaining how he had come to agree to this formula. Even Athanasius – the one who would come to be regarded as the staunch defender of Nicaea – largely avoided the term for a quarter of a century, though that does not mean he missed the principal issue. He campaigned hard and ended up in exile five times over. The fundamental issue was whether Christ was God incarnate or some kind of divinised superman, or a semi-divine mediating figure, a created Creator. Arius is supposed to have said that “there was a ‘when’ he was not,” even though the Son was “the first and greatest of the creatures” through whom God created everything else.

So why does it still matter? Four simple reasons:

Because it was basically about identity, and the question of Christ’s identity still matters. 

Because we still find people treating Jesus Christ as superhuman (not really one of us), or semi-divine (not God in the same sense as God the Father). If we are to be ecumenical, across different denominations today but also across time, we need to affirm that God’s Son and Spirit are truly of the one God. As early as the second century the first great Christian theologian, Irenaeus, characterized the Word and the Spirit as God’s two hands – we can imagine the Trinity reaching out first to create and then to embrace us with God’s redeeming love.

Because it means we can look to Jesus and there catch a glimpse of God’s very own loving face – not just a dim image but the reality itself.

And because only God could recreate us in God’s own image and raise us to new life. 


To find out more about the McDonald Agape Nicaea Project at St Mellitus College in London, come and join the public lectures, or look out for other Nicene celebrations in 2025.

For more information, or to register for these events, visit the Nicaea Project website.



Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

Enlightened disagreement (with apologies to Henry Raeburn). Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking – such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalized. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns.

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of navigation of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognizing a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilizing mission intended to improve the world. 

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the south Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths. 

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task – yet we must find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.”
