How do you monitor social media when it is so popular, complex and fast-mutating? It’s a serious and complicated question – and the tragic case of 14-year-old Molly Russell, who took her own life, reminds us that we must find the answers.
In an effort to respond, and to tackle the cyber abuse that affects so many teenagers, the picture-sharing app Instagram has now announced two new features. The first is an anti-bullying tool, which will use artificial intelligence to recognise when new posts are similar to those that have been reported as inappropriate. So far, so good. But once the nefarious or abusive language has been identified, the perpetrator is not served with a reminder of the social and legal ramifications of online abuse. No, they’re merely asked a single, feeble question.
In an example scenario, Instagram shows a user typing “you are so ugly and stupid” (a catch-all insult). They are met with a notice asking, “Are you sure you want to post this? Learn more.”
This is designed to prompt users to reassess what they post online. It is certainly a positive step, but the placid question feels like a weak response to behaviour that, as we have seen, can lead to suicide.
The second tool, which will be implemented at some undisclosed point in the future, is called “Restrict”. Users who are reluctant to block accounts outright (for teenagers especially, blocking can have ramifications in real life) can instead restrict them: comments left by a restricted account will be visible to others only once the user who applied the restriction has read and approved them, and restricted users will not know that they have been restricted.
Both new measures shift the responsibilities of moderation onto young social media users, expecting them to curate content – and handle abuse with detachment and maturity. Vulnerable teens who may already have an unhealthy relationship with social media cannot be expected to take the issue of cyberbullying into their own hands.
The new features are yet another indicator that tech giants, while keen to pay lip-service to the issues children face online, simply aren’t taking them seriously.
A few weeks ago, Facebook announced plans to fully encrypt its services. The company, which owns Instagram, intends to use end-to-end encryption on its Facebook Messenger service, a move which has been heavily criticised by the NSPCC. “It places privacy and secrecy ahead of accountability and transparency,” argued the charity’s chief executive, Peter Wanless. “It’s really disappointing that the reaction to the NSPCC’s and young people’s call for a safer internet is to make it a lot more secret and more dangerous for them.”
So Instagram is not alone in failing to tackle the big issues comprehensively, and it is difficult to know where the boundaries of free expression should begin and end. But the app’s toothless updates do nothing to counter the online culture that fosters and feeds teenage insecurity. Everything from the use of “face filters” to the practice of targeting users with deliberately provocative, aspirational images creates a toxic landscape that leaves young people in a very vulnerable position.
The political, social and economic power of the social media tech giants is almost unfathomable. Their level of influence is becoming dystopian. They must be reminded that no company, however large and however globally influential, is beyond accountability.
The issue of children’s mental health is a battle worth fighting, especially given that tech companies have a known history of putting online safety too low down their list of priorities. How the tools of social media are employed is, of course, down to the user – but when those users are young and socially inexperienced, they cannot be expected to police themselves. The organisations that design these tools mustn’t be allowed to shirk their social responsibilities; they have the time and the resources to solve the problems they have created, but to do so they’ll need some better ideas than this.