
Should you be ashamed of yourself?

Shame powers cancel culture, yet its historic role is guarding community boundaries. Henna Cundill takes an in-depth look at shame – and empathy.
The word 'SHAME' spray painted onto a grey hoarding in lime green paint.
Anthony Easton/flickr: PinkMoose, CC BY 2.0, via Wikimedia Commons.

“Put on this dunce’s cap and go and stand in the corner!” cries the teacher, and immediately we are transported to a scene that takes place in a schoolroom of centuries past. Likewise, if nowadays we were to see a woman being led down the street wearing a scold’s bridle, we might assume that there was a very odd sort of party going on; we might even intervene or phone the police. Why? Because these are not the scenes of 21st-century Britain. We don’t do public shaming anymore – at least, we like to think we don’t.

But the truth is we very much do; in fact, shame is essential, at least to a certain degree. For a group to survive with any sense of collective identity and purpose, something has to prevent each person within that group from becoming too greedy, or too lazy, or too dishonest. That something is often the fear of being shamed, not even punished – just shamed. It doesn’t feel nice to be judged and found wanting, or to fear that you might be. 

Think back to the last windy day when your recycling bin blew over – did you experience a passing moment of concern about the public pavement acrobatics of your wine-bottles, cake boxes and ready-meal trays? No need to blush – your neighbours probably rushed out ahead of you to hide their own multifarious sins. Studies have long shown that installing self-checkouts at supermarkets dramatically increases the purchase rates of “stigma items” such as alcohol and unhealthy foods. Oh, the things we do when we think no one is watching… 

So, shame is, on one level, a functional tool which does the essential job of guarding the life and boundaries of a community. Perhaps one or two of us still eat a little too much and drink a little too much, but shame is one of the things that keeps most of us from going too far, too often – or at least the threat of shame tends to discourage us. As Graham Tomlin has recently explored, we still live in a society that equates over-indulgence with a lack of virtue.

It’s one thing for shame to guard certain moral boundaries (as long as we can all agree what they are) but we’re in a troubling place with the social ones. 

However, when an individual does step out of line, then the shaming process has two modes of presentation: exposure or exclusion, sometimes both. This is most clearly seen in a court of law, where an offender is first ceremonially declared to be guilty (exposure) and then is subsequently sentenced (exclusion) – often “removed” from society, at least for a while, via a custodial sentence or a curfew. In this very clear way, shaming plays a functional role for the well-being of society as a whole.  

But these two prongs of the shaming process can also happen in rather dysfunctional ways, some of which are dangerously subtle. We fear the recycling bin disgorging its contents because there is a certain social shame in being seen to consume too much junk. Fine. But what about the teenager who is compelled into a cycle of disordered eating because a schoolfellow has pointed the finger and said the dreaded word, “fat”? Likewise, many people love a chit-chat, and the fear of being excluded from a social group usefully prevents most of us from being too fixed on one topic, or from appearing inattentive or impolite. But in my research with autistic people, some have shared that they feel shamed out of social groups entirely simply because “chit-chat” is not right for them. Some have a language processing delay, others find “small talk” a bit confusing and inane and would rather talk about something specific. It’s one thing for shame to guard certain moral boundaries (as long as we can all agree what they are) but we’re in a troubling place with the social ones. Some of this shaming doesn’t sound very functional, not if the wellbeing of society is supposedly the goal.  

The inverse of shame is empathy. Where shame excludes, empathy shows attentiveness. 

Perhaps the saltiest example of this problem is the now infamous “cancel culture”. I know – even I can’t believe I would risk bringing that up as a writer, that’s how charged this debate has become. But de-platforming, boycotting, or publicly castigating someone for the views that they express – these are shaming activities, an attempt to render an individual exposed and excluded. It can be a very tricky argument as to whether this counts as functional shame, guarding the wellbeing of society, or dysfunctional shame, guarding little more than social norms.  

We ought to try and take it on a case-by-case basis, but even then, sometimes what one person takes as a moral absolute another person sees as a social choice. At the same time, those who hold dearly to certain moral absolutes sometimes lose sight of the societal impact of what they say. The result can be a strange kind of war, one where there is virtually no engagement between two opposing factions, and the only weapons are a string of press releases and a whole lot of contempt. Eventually, despite there being no engagement and no progress, both sides vigorously declare themselves to be the winner.

Jesus once said a strange thing when he was talking to a crowd. He said: “Settle matters quickly with your adversary who is taking you to court. Do it while you are still together on the way.” In other words, “Just have a chat first,” says Jesus, “and see if you can’t come to terms.” It was part of a much longer discourse where he also told the crowd to “love your enemies” – and this with the kind of love called agape, a love which favourably discriminates and chooses someone – very much the opposite of shaming them.  

For my own research I have looked in depth at the shaming experience, and one of the conclusions that I come to is that the inverse of shame is empathy. Where shame excludes, empathy shows attentiveness. Where shame exposes an individual, empathy draws them into discussion. To empathise with someone is not to agree with them, but it is to recognise they are human just the same, and that through openness and dialogue it is possible for people, even those who have very different experiences of the world, to explore each other’s perspectives. The end point of that exploration may not be agreement – it might still be everyone back to their corners. But in the process no one has been shamed, no one exposed or excluded, no one othered or dehumanised.

Of course, it is far easier to point the finger, to expose someone to the court of public opinion, and then to turn one’s face away, nose in the air, mouth clamped shut in an apparently dignified silence. On the surface this seems like the elegant response – live and let live – but in fact it is not: to designate someone as not worthy of attention is to very publicly inflict shame. We might as well clamp them into a scold’s bridle and lead them down the street. And, as we do so, let’s hope it’s not a windy day – or if it is, let’s be sure that we have firmly tied down the lids of our recycling bins.   


Creator or creature – a centuries old question of identity

Why does a 1,700-year-old creed still matter?

Frances Young is Emeritus Professor of Theology at the University of Birmingham. 

An abstract depiction of The Creation shows an aperture in a cloud like formation over water.
The Creation, James Tissot.
James Tissot, Public domain, via Wikimedia Commons.

2025 marks the 1,700th anniversary of the Nicene Creed. In October 2024, Prof. Frances Young gave the inaugural lecture of the McDonald Agape Nicaea Project at St Mellitus College.
In the year 325 CE the first ever “ecumenical” (= “worldwide”) council of bishops assembled at Nicaea near Constantinople (now Istanbul). It was summoned by Constantine, the first Roman Emperor to convert to Christianity and patronize the Church. Why does this seventeenth centenary of an obscure discussion around complex words matter to us today?

The outcome of the Council was agreement to the text of a creed, and banishment of a pesky priest named Arius, whose bishop disapproved of his teaching. Unfortunately, some other bishops remained sympathetic to something like Arius’ viewpoint, and for political reasons Constantine was desperate for Church unity. Argument over the issues went on for half a century, until another Council in 381 CE reaffirmed the position established in 325 CE and agreed the version labelled “the Nicene Creed” and still used in Church liturgies across the world today.

The controversy was basically about the identity of the pre-existent Word or Son of God incarnate in Jesus Christ. Nicaea established that the Son was “of one substance” (homoousios) with the Father – in other words, he was fully God in every sense of the word. But for many traditional believers at the time this was difficult to accept. 

The common sense of the culture thought in terms of a “chain of being.” Most people in the Roman Empire were polytheists – there were loads of gods: Mars, god of war, Neptune, god of the sea, and so on. Each city, each ethnic group, had its own god, as did every family, every interest group, every burial society – you name it. But generally there was a sense that above all these was the Supreme God, who was worshipped indirectly through worship of these lower gods, and below them were all sorts of nature spirits, daemons, benign and malign, then souls incarnate in human persons, then animals, even vegetables as living entities, and finally inert matter like earth and stones, at the bottom of the hierarchy or chain of being.

Jews identified their God with the Supreme God and insisted the one God alone should be worshipped. But they also imagined a heavenly court of archangels and angels, then below that the souls of the righteous, and so on in a somewhat parallel hierarchy. No surprise then that Christians assumed a similar picture: God, then the Son of God, then the Holy Spirit, then archangels and angels, then souls, and so on in a hierarchical ladder. 

But in the second century Christians had argued their way to the idea of “creation out of nothing.” Many non-Jewish thinkers, including some early Christians, followed Plato, conceiving creation as the outcome of Mind (the Demiurge or Craftsman) shaping Matter into whatever Forms or Ideas were in mind. But other Christian thinkers argued that God was not a mere Craftsman who needed stone or wood to work on like a sculptor – God produced the Matter in the first place. This then triggered a full-blown critique: God did not create out of pre-existent Matter or there would be two first principles; God did not create from God’s own self or everything would be divine; so God must have created out of nothing. 

Now try to fit that to the chain of being: where do you draw the line between God the Creator and everything else made out of nothing? This was the issue which surfaced in the so-called Arian controversy. What we might call the “mainstream” remained wedded to the hierarchy, not least because of earlier controversies about God’s monarchia. The word did mean “monarchy” – single sovereignty; but arche could mean “rule” or “beginning,” so monarchia also referred to the single first principle of all that is. It was natural to attribute monarchia to God the Father, a view that worked OK with the hierarchy. But some had suggested that the one God 'changed mode', as it were, appearing now as Father, now as Son, now as Holy Spirit, taking different roles in the overarching scriptural story. This suggestion was mocked as all too similar to the pagan god, Proteus, who in mythology kept changing shape. It is even possible that that key word homoousios had been condemned along with this “Modalist” view.  

Traditionalists were suspicious. The first historian of the Church, Eusebius of Caesarea, was present at Nicaea, and wrote a somewhat embarrassed letter to his congregation explaining how he had come to agree to this formula. Even Athanasius – the one who would come to be regarded as the staunch defender of Nicaea – largely avoided the term for a quarter of a century, though that does not mean he did not identify the principal issue. He campaigned hard and ended up in exile five times over. The fundamental issue was whether Christ was God incarnate or some kind of divinised superman, or a semi-divine mediating figure, a created Creator. Arius is supposed to have said, “there was a when he was not,” even though he was “the first and greatest of the creatures” through whom God created everything else.

So why does it still matter? Four simple reasons:

Because it was basically about identity, and the question of Christ’s identity still matters. 

Because we still find people treating Jesus Christ as superhuman – not really one of us – or as semi-divine – not God in the same sense as God the Father. If we are to be ecumenical, across different denominations today but also across time, we need to affirm that God’s Son and Spirit are truly of the one God. As early as the second century the first great Christian theologian, Irenaeus, characterized the Word and the Spirit as God’s two hands – we can imagine the Trinity reaching out first to create and then to embrace us with God’s redeeming love.

Because it means we can look to Jesus and there catch a glimpse of God’s very own loving face – not just a dim image but the reality itself.

And because only God could recreate us in God’s own image and raise us to new life. 
To find out more about the McDonald Agape Nicaea Project hosted by St Mellitus College in London, come and join the public lectures, or look out for other Nicene celebrations in 2025.

For more information or to register for these events, you can visit the Nicaea Project website.