
Time to strike a match under the social media titans

Smartphone boycotters can learn from the match girl strike and other historical protests.

Abigail is a journalist and editor specialising in religious affairs and the arts. 

Sarah Chapman and the match girls strike committee. Wellcome Collection, public domain, via Wikimedia Commons.

Meta's assets totalled nearly US$230 billion last year; Pinterest’s were over $3.5 billion; and Beijing-based ByteDance, parent company of TikTok, was valued at $220 billion. Between them they have attracted billions of users and, enabled by the spread of smartphones, transformed the way that young people especially communicate, spend time alone and conduct friendships.

But parents’ concern at the impact of what their children are viewing, and the tech companies’ slow responses to a drip-drip-drip of teenage deaths linked to harmful online content, have pushed parents’ patience to the limit. In the last month an estimated 20,000 have joined a grassroots protest group – Parents United for a Smartphone-Free Childhood – whose founders are hastily developing a campaign strategy.  

This pushback against the march of big tech cannot come soon enough, and if well co-ordinated it could finally give parents a weapon: their numbers.  

The stats are becoming all too familiar – half of nine-year-olds own a smartphone and 68 per cent of children as young as three use one to get online. Drill down and it gets more startling: according to a Statista survey of 13-17-year-olds, 30 per cent of TikTok users had seen sexualised images or been trolled anonymously on the platform in the previous month; 14 per cent of respondents who visited YouTube had recently seen “violent or gory” content; and 10 per cent of respondents “had seen images of diet restriction on Instagram”.

But these aren’t the only forms of online harm. I attended a meeting recently in the Palace of Westminster where one speaker argued that just as damaging, especially for teenagers, are the algorithms that promote content fuelling peer comparison, which niggles away at contentment and self-confidence.


Some of these hi-tech problems will require hi-tech solutions and new laws. But might there also be some far older wisdom that could help us as a society chart a course to a safer online experience for young people?

The Online Safety Act, finally passed by the Government, marks a pushback, making the tech companies legally responsible for keeping children and young people safe online. It requires platforms to protect children from “harmful or age-inappropriate” content such as porn, depictions of violence, bullying and sites promoting anorexia, and platforms will face tougher scrutiny of the measures they take to ensure under-13s can’t have social media accounts.

However, these changes won’t take effect until part way through 2025; the age checks on which they rely will be brought in by the tech companies, who thus far haven’t proven the most trustworthy partners on child safeguarding.   

What are parents to do? And increasingly, employers and economists? After all, youth mental health experts were quick to point the finger at social media following the Resolution Foundation research that found five per cent of 20 to 24-year-olds were economically inactive due to ill health last year, and that 34 per cent of 18 to 24-year-olds reported symptoms of mental health conditions such as depression or anxiety – a reversal from two decades ago, when young people had the lowest incidence of such disorders at only 24 per cent.

The Department for Education wants heads to ban mobiles in school, which some already do. But what about outside school hours? As one participant and parent at the meeting asked, “Isn’t the genie already out of the bottle?” 


A couple of voices suggested young people needed an engaging real-life alternative to their screens that involved learning to take risks, such as rock-climbing. Another added that young people are too protected in the real world and not protected enough online.  

One woman who has felt the sharpest cost of this inadequate protection is Esther Ghey. I would have hoped that the tech companies would be quick to change the way their platforms work once they knew about the harmful material that her teenage daughter Brianna was able to view online and the violent material her killers were able to discuss online.

But then I hoped the same after 14-year-old Molly Russell took her own life in 2017 having viewed content promoting self-harm and suicide on Instagram. Instead, her family were made to wait two years for Meta, parent company of Instagram, to provide evidence for her inquest. Representatives from Meta and Pinterest apologised at the inquest, five years after her death. Big deal.  

Parents can – in theory – enable all the parental controls offered by their internet provider, limit screen time and ban phones from their children’s bedrooms at night, although setting and enforcing boundaries can be exhausting. Esther Ghey said Brianna’s phone usage “was a constant battle between me and her”. Other parents may lack the capacity, or just not feel the need, to carry out such measures. And it only takes one child to share material for it to become a problem for a whole peer group.

It’s a good step that phones designed to be safe for children – with parental controls and minimal access to the internet – are entering the market. But they don’t get kids rock-climbing (or your wholesome outdoor team activity of choice), they still normalise children’s phone use, and they require parents to spend more time monitoring their own phones to check their children’s usage.

So what’s to be done?  

Molly and Brianna were not just vulnerable teenagers – they were victims of the powerful machinery of the Third and Fourth Industrial Revolutions, the rapid advances in tech that have taken computers from the office to the pocket and loaded them up with the capability of dozens of devices combined.

Molly’s father Ian has teamed up with Esther Ghey to hold the tech companies to account. And thanks to Parents United for a Smartphone-Free Childhood, other parents now have a co-ordinated way of voicing their fears, to try to prevent the next disaster. Organisers Clare Fernyhough and Daisy Greenwell estimate that some 20,000 people have already joined, from every county across Britain. This is an online campaign for an online age: it was sparked by a post by Greenwell in the fertile soil of Instagram, and communities are organised into WhatsApp groups. Nonetheless, the pair are encouraging parents not to give children smartphones until 14 and social media access until 16, and they have put together resources to help members urge headteachers to restrict, and other parents to delay, smartphone usage.


But what if these steps aren’t enough? History recalls some impressive David-vs-Goliath campaign victories that could be of use here. In the first Industrial Revolution, exhausted and overworked women and children lost limbs and even lives in the newly invented machinery. According to a landmark report commissioned by the House of Commons in 1832, these workers were often "abandoned from the moment that an accident occurs; their wages are stopped, no medical attendance is provided, and whatever the extent of the injury, no compensation is afforded." Years passed between the creation of these voracious machines and the laws by which reformers such as Lord Shaftesbury, a politician driven by his Evangelical Christian faith, capped children’s working hours at 58 a week and introduced other safeguards.

A few decades later, the Bryant and May match company was employing hundreds of East End women to make matches using white phosphorus, which can cause phosphorus necrosis of the jaw, or ‘phossy jaw’. The employees formed a union and went on strike; the Salvation Army, led by William Booth, another social reformer inspired by his Christian beliefs to help people in poverty, set up its own factory in 1891, offering better working conditions including the use of less toxic red phosphorus. Although the factory ran for only ten years, the episode spelt bad publicity for Bryant and May, and a ban on the use of white phosphorus in matches followed shortly after.

A Salvation Army match box: ‘Light in Darkest England.’

Examples like William Booth are a reminder that, when it comes to systemic challenges, individuals are not without agency. But other chapters in history underline that one person’s vision or persistence may need to be amplified by scale to be taken seriously. Had the civil rights activist Rosa Parks, who in 1955 refused to give up her seat to a white passenger, boycotted the buses alone, the authorities in Montgomery would have shrugged their shoulders. But when 40,000 other Black passengers, led by Rev Martin Luther King, joined her, the authorities could not afford to ignore them.

So how do these three stories relate to young people’s social media use?  

The harmful effects of social media are a global issue, and if tech companies boast revenues greater than the GDP of several countries, governments may need to work together to get them to listen. And any calls from governments for better regulation and self-policing will be amplified if backed up by millions of parents.   

Perhaps we’re seeing the start of this: if the thousands of members of Parents United for a Smartphone-Free Childhood can grow in number and start conversations with schools and other parents, then the demand for smartphones and their dominance of some young people’s lives can be challenged. Such conversations can’t come soon enough. But how can parents make themselves heard? And what do nineteenth-century industrialists, East End match girls or 1950s African Americans have to do with it?

The parallel, in Christian jargon, is the undervaluing of the human person. The tech companies do not just exist to help us stay in touch with our friends or look cooler; they exist to turn our attention into profit. So bear with me, if you will, for a thought exercise.


The Shaftesburies of our day need to ensure that existing laws are applied, that the tech companies’ promised age controls are water-tight, and that harsh penalties follow for platforms that fail to take down harmful or illegal content. The William Booths need to provide alternatives to dopamine-inducing social media that affirm the value of each young person and teach them to manage real-world, appropriate levels of risk. Hence the suggestion of rock-climbing or similar. And could we also imagine a social network being conceived, funded and constructed on European soil that takes the wellbeing of its users seriously? And some form of online policing?

In the meantime, the Rosa Parks of our age – which is all of us social media users, and Parents United for a Smartphone-Free Childhood could lead the way – must consider investing in a dumbphone and staging a smartphone boycott, at least outside our professional lives. The half of nine-year-olds who reportedly own smartphones can’t buy or fund them themselves; therefore, pretty much half of parents of nine-year-olds have passed theirs on or bought new ones and kept paying the bills. That gives them leverage.

Leading by example would also mean parents swapping their own smartphones for dumbphones – at least in front of their children. An old laptop could be kept in the kitchen for searches that then become public, functional and brief – just like twentieth-century dips into the Phone Book or Yellow Pages. Smartphone ownership could be seen as a privilege of maturity like drinking, learning to drive and (previously) smoking, and doom-scrolling in front of children blacklisted. In short, and I wince: adults’ relationship with smartphones needs to be rethought just as much as children’s.  

The reforms of Shaftesbury and others and the ban on white phosphorus helped lay the foundation for today’s health and safety laws. The bus boycott was a key step in the Civil Rights Movement’s long and hard-fought journey towards equality.  

The tragic, needless loss of the lives of Brianna and Molly (and, sadly, others) must lead to laws and a wider social rethink that lay the foundations for a safer, more grown-up, properly regulated internet age. We need to set ourselves on a course from which future generations will look back aghast, just as we do on child labour or white phosphorus or racial segregation, and ask, ‘What were they thinking?’


Will AI’s attentions amplify or suffocate us?

Keeping attention on the right things has always been a problem.

Mark is a research mathematician who writes on ethics, human identity and the nature of intelligence.

Robots - always cuter than AI. Alex Knight on Unsplash.

Taking inspiration from human attention has made AI vastly more powerful. Can this focus our minds on why attention really matters? 

Artificial intelligence has been developing at a dizzying rate. Chatbots like ChatGPT and Copilot can automate everyday tasks and effortlessly summarise information. Photorealistic images and videos can be generated from a couple of words, and medical AI promises to revolutionise both drug discovery and healthcare. The technology (or at least the hype around it) gives an impression of boundless acceleration.

So far, 2025 has been the year AI has become a real big-ticket political item. The new Trump administration has promised half a trillion dollars for AI infrastructure and UK prime minister Keir Starmer plans to ‘turbocharge’ AI in the UK. Predictions of our future with this new technology range from doom-laden apocalypse to techno-utopian superabundance. The only certainty is that it will lead to dramatic personal and social change. 

This technological impact feels even more dramatic given the relative simplicity of its components. Huge volumes of text, images and videos are converted into vast arrays of numbers. These arrays are then pushed through repeated processes of addition, multiplication and comparison. As more data is fed into this process, the numbers (or weights) in the system are updated and the AI ‘learns’ from the data. With enough data, meaningful relationships between words are internalised and the model becomes capable of generating useful answers to questions.
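For readers who like to peek under the bonnet, here is a deliberately tiny sketch of that loop in Python – illustrative only, with made-up numbers standing in for real data, and nothing like the scale of a production system. It shows the humble arithmetic the paragraph above describes: multiply, add, compare with the right answer, nudge the weights, repeat.

```python
import numpy as np

# Toy data: each "document" has already been converted into three numbers,
# and the target is the value we want the model to predict.
inputs = np.array([[0.2, 1.0, 0.5],
                   [0.9, 0.1, 0.3],
                   [0.4, 0.8, 0.7]])
targets = np.array([1.2, 0.6, 1.1])

weights = np.zeros(3)        # the numbers the model will 'learn'
learning_rate = 0.1

for step in range(500):
    predictions = inputs @ weights        # multiplication and addition
    errors = predictions - targets        # comparison with the right answers
    gradient = inputs.T @ errors / len(targets)
    weights -= learning_rate * gradient   # nudge the weights: the model 'learns'

print(weights)  # the trained weights now capture the pattern in the toy data
```

Real systems repeat the same kind of update across billions of weights and oceans of text, but the principle is unchanged.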

So why have these algorithms become so much more powerful over the past few years? One major driver has been taking inspiration from human attention. An ‘attention mechanism’ allows very distant parts of texts or images to be associated with one another. This means that when processing a passage of conversation in a novel, the system is able to take cues on the mood of the characters from earlier in the chapter. This ability to attend to the broader context of the text underpins the success of the current wave of ‘large language models’ or ‘generative AI’. In fact, the models technically known as ‘Transformers’ were developed by stripping away other features and concentrating only on attention mechanisms. This was first set out in the memorably named ‘Attention Is All You Need’ paper, written by scientists working at Google in 2017.
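The mechanism itself is compact enough to sketch. Below is a simplified, self-contained Python version of the ‘scaled dot-product attention’ described in that 2017 paper, with random numbers standing in for learned word representations: every position scores its relevance to every other position, and those scores decide how much of each position’s information flows into the result.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Simplified attention: each position weighs every other position."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # relevance of each word to every other word
    scores -= scores.max(axis=-1, keepdims=True)    # subtract the max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: scores become proportions
    return weights @ values                         # blend information according to relevance

# Six 'words', each represented by eight random numbers (stand-ins for learned embeddings).
rng = np.random.default_rng(0)
words = rng.normal(size=(6, 8))
contextualised = scaled_dot_product_attention(words, words, words)  # self-attention
print(contextualised.shape)  # (6, 8): each word now carries context from the whole passage
```

In a full Transformer the queries, keys and values are themselves produced by learned weights, and many such attention layers are stacked, but this is the core trick.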

If you’re wondering whether this machine replication of human attention has much to do with the real thing, you might be right to be sceptical. That said, this attention-imitating technology has profound effects on how we attend to the world. On the one hand, it has shown the ability to focus and amplify our attention, but on the other, to distract and suffocate it. 


A radiologist acts with professional care for her patients. Armed with a lifetime of knowledge and expertise, she diligently checks scans for evidence of malignant tumours. New AI tools can amplify her expertise and attention: they can automatically detect suspicious patterns in the image, including very fine detail that a human eye could miss. These additional pairs of eyes free her professional attention for other aspects of the scan, or other aspects of the job.

Meanwhile, a government acts with obligations to keep its spending down. It decides to automate welfare claim handling using a “state of the art” AI system. The system flags more claimants as being overpaid than the human employees used to. The politicians and senior bureaucrats congratulate themselves on the system’s efficiency and resolve to extend it to other types of payments. All the while, hundreds of thousands of people are being forced to pay non-existent debts. With echoes of the British Post Office Horizon scandal, the Australian Robodebt scandal of 2017-2020 was due to flaws in the algorithm used to calculate the debts. A properly functioning welfare safety net needs public scrutiny, and a misplaced deference to machines and algorithms suffocated the attention that was needed.

These examples illustrate the interplay between AI and our attention, but they also show that human attention has a broader meaning than just being the efficient channelling of information. In both cases, attention is a moral act, directed towards care for others. There are many other ways algorithms interact with our attention – how social media is optimised to keep us scrolling, how chatbots are being touted as a solution to loneliness among the elderly, but also how translation apps help break language barriers. 

Algorithms are not the first thing to get in the way of our attention, and keeping our attention on the right things has always been a problem. One of the best stories about attention and noticing other people is Jesus’ parable of the Good Samaritan. A man lies badly beaten on the side of the road after a robbery. Several respectable people walk past without attending to the man. A stranger stops. His people and the injured man’s people are bitter enemies. Despite this, he generously attends to the wounded stranger. He risks the danger of stopping – perhaps the injured man will attack him? He then tends the man’s wounds and uses his money to pay for an indefinite stay in a hotel. 

This is the true model of attention. Risky, loving “noticing” which is action as much as intellect. A model of attention better than even the best neuroscientist or programmer could come up with, one modelled by God himself. In this story, the stranger, the Good Samaritan, is Jesus, and we all sit wounded and in need of attention. 

But not only this: we are born to imitate the Good Samaritan’s attention to others. Just as we can receive God’s love, we can also attend to the needs of others. This mirrors our relationship to artificial intelligence: just as our AI toys are conduits of our attention, we can be conduits of God’s perfect loving attention. This is what our attention is really for, and if we remember this while being prudent about the dangers of the technology, then we might succeed in elevating our attention-inspired tools into amplifiers of real attention.
