
Understand what we thirst for

Whether for water or meaning, it’s a primal force.

Helen is a registered nurse and freelance writer, writing for audiences ranging from the general public to practitioners and scientists.

A child wearing a wool hat holds a glass and drinks water from it.
Johnny McClung on Unsplash

Quenching thirst is a global problem. It can also be a profoundly personal one, when illness impairs the ability to drink. For nurses treating dying patients, it can be ethically and emotionally fraught. But is there ultimate relief?

Thirst is the subjective sensation of a desire to drink, one that cannot be ignored. The world is thirsty; globally, 703 million people lack access to clean water. That’s 1 in 10 people on the planet.

Thirst is a life-saving warning system that tells your body to seek satisfaction through swallowing fluid. It works in partnership with other body processes - such as changes in blood pressure, heart rate and kidney function - to restore fluid and salt levels to where they belong. Failure of any part of this beautifully balanced system leads to dehydration (or its opposite, water intoxication), and perhaps to seizures, swelling of the brain, kidney failure, shock, coma and even death.

Sometimes it’s difficult to quench thirst because of problems with supply. According to the World Health Organization, at least 1.7 billion people used a drinking water source contaminated with faeces in 2022. In war, water is sometimes weaponised, with the systematic destruction of water sources and pipes. Water laced with rat fur, arsenic and copper has meanwhile been reported in prisons across the USA.

At other times, there may be “water everywhere, but not a drop to drink” because of individual problems with swallowing. Some of my most heartbreaking moments as a nurse have been when I could not fulfil a need as basic as a patient’s thirst; when even thickened fluids led to intense coughing and distress, and to the realisation that I could only moisten mouths and give so-called “taste for pleasure”: very small amounts of a favourite liquid or taste, given with a soft toothbrush or with a circular brush gently swept around the mouth and lips to release some of the liquid - even, and especially, at the end of life when the patient is unconscious.

Difficulties in drinking are common in dementia, when fluid can seem foreign and swallowing a surprise to the system. It’s thought that over 50 per cent of people in care homes have an impaired ability to eat or drink safely; 30 to 60 per cent of people who have had a stroke, and 50 per cent of those living with Parkinson’s, may struggle to swallow.

Other conditions that may affect swallowing include multiple sclerosis, cerebral palsy, and head and neck cancers. Diabetes is characterised by a raging thirst, owing either to problems with insulin (diabetes mellitus) or to an imbalance in antidiuretic hormone levels (diabetes insipidus). In intensive care, patients are predisposed to thirst by mechanical ventilation, by receiving nothing by mouth, and by the side effects of some medications. But thirst is a “neglected area” in healthcare, writes palliative care researcher Dr Maria Friedrichsen.

“Knowledge of thirst and thirst relief are not expressed, seldom discussed, there are no policy documents nor is thirst documented in the patient’s record. There is a need for nurses to take the lead in changing nursing practice regarding thirst.” 

Is there another thirst that is also being missed in nursing, and in life in general – a spiritual thirst, beyond the physical desire to drink? In his book Living in Wonder, writer Rod Dreher argues that humans are made to be spiritual, and that a critical sixth sense has been lost in a “society so hooked on science and reason”. We humans crave love in our deepest selves; we have an insatiable thirst for everything that lies within – and beyond – ourselves. Auschwitz survivor Viktor Frankl, who was later appointed professor of psychiatry at the University of Vienna, became convinced that human beings have a basic “will to meaning”. “The striving to find a meaning in one’s life,” he wrote, “is the primary motivational force in man.”

In the harsh sun of a Middle Eastern day, an ancient story of a man and a woman encountering each other at a water well illustrates this dual thirst for water and meaning. The man, Jesus, thankful for a drink of water given to him at the well by an outcast Samaritan woman, said: “Everyone who drinks this water will be thirsty again, but whoever drinks the water I give them will never thirst. Indeed, the water I give them will become in them a spring of water welling up to eternal life.” In that midday sun, such imagery made a powerful statement.

Being mindful of spiritual thirst when drinking water is also captured in a Ghanaian proverb, and pictured perfectly by the many birds that drink by gravity, tipping their heads back as they swallow.

“Even the chicken, when it drinks, 

Lifts its head to heaven to thank God for the water”. 

Unsatisfied thirst is part of the human condition; we long for something more. It is living proof of our immortality, says French poet Charles Baudelaire. Despite his Olympic success, athlete Adam Peaty said that society didn’t have the answers he was seeking, and that a gold medal was the coldest thing to wear. He “discovered something that was missing” when attending church for the first time, and now has a cross with the words “Into the Light” tattooed across his abdomen, symbolising his spiritual awakening. We are more than machines with physical needs. We are rather gardens to tend in a dry and thirsty land, with souls in need of intensive care.


Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

In the style of a Raeburn portrait, a group of young people lounge around on their phones, looking diffident
Enlightened disagreement (with apologies to Henry Raeburn).
Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking—such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalised. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns.

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of discovery of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognising a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilising mission intended to improve the world.

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the South Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths.

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me that it is essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task, yet we need to find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.”
