Technology is making us miserable – the time has come for government to intervene
The global population is becoming more and more reliant on technology, even to the detriment of our health. Sarah Steele, Tyler Shores and Christopher Markou tell us why the government must act now before we face a mental health crisis
Social media and screens are omnipresent. Many are concerned about the amount of time we – and our children – spend on devices. Soon to be a father, Prince Harry recently suggested “social media is more addictive than drugs and alcohol, yet it’s more dangerous because it’s normalised and there are no restrictions to it”.
But worries are not just limited to personal use. Many schools and workplaces are increasingly delivering content digitally, and even using game-playing elements like point scoring and competition with others in non-game contexts to drive better performance.
This “always on” lifestyle means many can’t just “switch off”. There are now suggestions that many of us are at risk of “digital burnout” as we find ourselves chronically stressed by hyperconnectivity. But is there evidence that screen time is, in fact, bad for us? Or worse: is it making us miserable?
To answer this, the UK government recently summarised what we know about the impact of technology use on children, drawing from a nascent but robust body of academic research exploring these questions. The Australian government has done the same, with a focus on screen time’s link to physical inactivity. Governments around the world are drawing together evidence.
We know, for example, that there is a connection between screen use and poorer attention span, weaker academic performance and delayed development in children; increased loneliness, greater stress and depressive symptoms among teenagers; and increased blood pressure and risk of diabetes.
When to act
While there are clearly correlations between increased screen use and psychosocial and physical health issues, correlation doesn’t mean causation. But without definitive scientific evidence, can we afford to ignore them? Should we refrain from making recommendations or regulations until there is direct proof, as the UK’s Royal College of Paediatrics and Child Health recently suggested?
From a public health perspective, the answer is a firm no. While evidence-based public health policy remains the gold standard, we have enough information to know action is required. Definitive scientific evidence of a causal link between technology use, or screen time, and negative health outcomes is not necessary to justify appropriate action. What is ultimately at stake is public safety, health and wellbeing. And, of course, we might never find such definitive evidence.
The “precautionary principle” gives us a basis to act. It argues that, even without scientific consensus, governments have a duty to protect the public from harm. Policy interventions are justifiable where there exists even a plausible risk of harm. With correlations mounting, harm is more than plausible. The UK and Australia are already acting. But what should be done? A few obvious actions stand out.
Moving forward
YouTube, to start with, has been described as the “great radicaliser” because its recommendation engine leads people towards ever more extreme content. Its algorithms have “learned” that viewers tend to be drawn to material more extreme than whatever they started out searching for. We are all chasing that dopamine “fix”, hoping the next video will provide it. This problem could be addressed by regulating content recommendation systems and disabling YouTube’s “autoplay” feature by default.
We also know tech companies use elaborate strategies to keep eyes on screens. By exploiting the brain’s reward system, they have mastered how to keep people scrolling, clicking and liking – potentially to the point of addiction. The “gamification” of online marketing and of product or service engagement weaponises neuroscience, using the brain’s reward system to drive continuous engagement.
Gamification is also turned on workers, where competition and game-like mechanisms such as targets or step counters are used to drive ever higher performance levels. Amazon warehouses exemplify these strategies. This is something employment and human rights law will need to address – and government should investigate, especially as children are thought to be particularly susceptible.
A broader problem is, as tech writer Shoshana Zuboff has masterfully illustrated, the way big data is collected and used against us. We know that Google, Facebook, Amazon and other tech giants constantly collect our data and then use this data to target individuals and drive particular behaviours and responses.
With “surveillance capitalism” now the business model of the internet, there are no easy solutions. What we urgently need is courage from government to rein in big tech’s excesses and most insidious harms. Of course tech companies will act like their industrial predecessors: lobbying and advocacy will be their weapons of choice for influencing laws and sustaining profitability. But it is critical that politicians and professional organisations prioritise public health over industry money.
An issue for governments
Thankfully, several governments have indicated a desire to “make the online world a safer place” and take concrete steps towards regulating what techniques big tech can use on the public. A major step will be restricting behavioural advertising, as Germany recently has.
Of course, given that advertising accounted for the majority of Google’s revenue in 2018, we shouldn’t expect it to respond with anything but hostility when its core business model is threatened. It is encouraging that the UK government is taking the lead, with a forthcoming proposal calling for a new regulator and for social media bosses to be made legally liable for harms their platforms cause. This would be a bold step in the right direction.
We can also restrict what personal data can be used to sell products to people, and how advertisements are presented – allowing users greater control over what they see. A return to contextual advertising, where users only see ads related to what they’re searching or browsing for, would be a more modest but nonetheless important step.
We should expect these tech firms to use the playbook established by big tobacco, food and pharma. And so transparency mechanisms and robust reporting requirements must be put in place. We must also discuss our options for – and with – regulators.
It is key that we take a precautionary approach to industry-funded research by these tech giants, in the same way we have done with tobacco industry-funded research and bodies. While technology is part of our lives, how we understand it and how we regulate it must be in the interests of public health at large.
Sarah Steele is a senior research associate at the University of Cambridge. Christopher Markou is a Leverhulme fellow and lecturer at the University of Cambridge. Tyler Shores is a PhD candidate in social media and online culture at the University of Cambridge. This article first appeared on The Conversation