
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it topped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining”, would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began obviously opening up in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative side’ of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as an antidote to a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, who learned to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is a space that a good quarter of American adults are now quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But these happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 


Assisted dying’s problems are unsolvable

The rhetoric about keeping people safe from coercion rings hollow.
A parliamentary committee scrutinises the bill. Parliament TV.

One in five people given six months to live by an NHS doctor is still alive three years later, data from the Department for Work and Pensions shows. This is good news for these individuals, and bad news for ‘assisted dying’ campaigners. Two ‘assisted dying’ Bills are being considered by UK parliamentarians at present, one at Westminster and the other at the Scottish Parliament. And both rely on accurate prognosis as a ‘safeguard’ – they seek to cover people with terminal illnesses who are not expected to recover.

An obvious problem with this approach is the fact, evidenced above, that doctors cannot be sure how a patient’s condition will develop. They try their best to gauge how much time a person has left, but they often get the prognosis wrong. People can go on to live months and even years longer than estimated. They can even make a complete recovery. This happened to a man I knew who was diagnosed with terminal cancer and told he had six months left, but went on to live a further twelve years. Prognosis is far from an exact science.

All of this raises the disturbing thought that if the UK ‘assisted dying’ Bills become law, people will inevitably end their lives due to well-meaning but incorrect advice from doctors. Patients who believe their condition is going to deteriorate rapidly — that they may soon face very difficult experiences — will choose suicide with the help of a doctor, when in fact they would have gone on to a very different season of life. Perhaps years of invaluable time with loved ones, new births and marriages in their families, and restored relationships. 

Accurate prognosis is far from the only problem inherent to ‘assisted dying’, however, as critics of the practice made clear at the – now concluded – oral evidence sessions held by the committees scrutinising the UK Bills. Proponents of Kim Leadbeater’s Terminally Ill Adults (End of Life) Bill and Liam McArthur’s Assisted Dying for Terminally Ill Adults (Scotland) Bill have claimed that their proposals will usher in ‘safe’ laws, but statements by experts show this rhetoric to be hollow. These Bills, like others before them, are beset by unsolvable problems.

Coercion 

Take, for example, the issue of coercion. People who understand coercive control know that it is an insidious crime that’s hard to detect. Consequently, there are few prosecutions. Doctors are not trained to identify foul play and, even if they were, these busy professionals with dozens if not hundreds of patients could hardly be counted on to spot every case. People would fall through the cracks. The CEO of Hourglass, a charity that works to prevent the abuse of older people, told MPs on the committee overseeing Kim Leadbeater’s Bill that “coercion is underplayed significantly” in such cases, and stressed that it takes place behind closed doors.

There is also nothing in either UK Bill that would rule out people acting on internal pressure to opt for assisted death. In evidence to the Scottish Parliament’s Health, Social Care and Sport Committee last month, Dr Gordon MacDonald, CEO of Care Not Killing, said: “You also have to consider the autonomy of other people who might feel pressured into assisted dying or feel burdensome. Having the option available would add to that burden and pressure.” 

What legal clause could possibly remove this threat? Some people would feel an obligation to ‘make way’ in order to avoid inheritance money being spent on personal care. Some would die due to the emotional strain they feel they are putting on their loved ones. Should our society really legislate for this situation? As campaigners have noted, it is likely that a ‘right to die’ will be seen as a ‘duty to die’ by some. Paving the way for this would surely be a moral failure. 

Inequality 

Even parliamentarians who support assisted suicide in principle ought to recognise that people will not approach the option of an ‘assisted death’ on an equal footing. This is another unsolvable problem. A middle-class citizen who has a strong family support network and enough savings to pay for care may view assisted death as needless, or a ‘last resort’. A person grappling with poverty, social isolation, and insufficient healthcare or disability support would approach it very differently. This person’s ‘choice’ would be shaped by a dearth of support.

As disability studies scholar Dr Miro Griffiths told the Scottish Parliament committee last month, “many communities facing injustice will be presented with this as a choice, but it will seem like a path they have to go down due to the inequalities they face”. Assisted suicide will compound existing disparities in the worst way: people will remove themselves from society after losing hope that society will remove the inequalities they face.

Politicians should also assess the claim that assisted deaths are “compassionate”. The rhetoric of campaigners vying for a change in the law has led many to believe that an assisted death is a “good death” – a “gentle goodnight”, compared to the agony of a prolonged natural death from terminal illness. However, senior palliative medics underline the fact that assisted deaths are accompanied by distressing complications. They can also take wildly different amounts of time: one hour; several hours; even days. Many people would not consider a prolonged death by drug overdose, watched by anguished family members, to be compassionate.

Suicide prevention 

It is very important to consider the moral danger involved in changing our societal approach to suicide. Assisted suicide violates the fundamental principle behind suicide prevention — that every life is inherently valuable, equal in value, and deserving of protection. It creates a two-tier society where some lives are seen as not worth living, and the value of human life is seen as merely extrinsic and conditional. This approach offers a much lower view of human dignity than the one we have subscribed to historically, which has benefited our society so much.

Professor Allan House, a psychiatrist who appeared before the Westminster committee that’s considering Kim Leadbeater’s Bill, described well the danger of taking this step: “We’d have to change our national suicide prevention strategy, because at the moment it includes identifying suicidal thoughts in people with severe physical illness as something that merits intervention – and that intervention is not an intervention to help people proceed to suicide.”

Professor House expressed concern that this would “change both the medical and societal approach to suicide prevention in general”, adding: “There is no evidence that introducing this sort of legislation reduces what we might call ‘unassisted suicide’.” He also noted that in the last ten years in the State of Oregon – a jurisdiction often held up as a model by ‘assisted dying’ campaigners – “the number of people going through the assisted dying programme has gone up five hundred percent, and the number of suicides have gone up twenty per cent”.

The evidence of various experts demonstrates that the problems associated with assisted suicide are unsolvable. And this practice does not provide a true recognition of human dignity. Instead of changing the law, UK politicians must double down on existing, life-affirming responses to the suffering that accompanies serious illness. The progress we have made in areas like palliative medicine, and the talent and technology available to us in 2025, make another path forwards available to leaders if they choose to take it. I pray they will.
