Instagram wants you to think it’s acted fast over Molly Russell’s suicide – but these small steps were already overdue

Removing self-harm images is welcome, of course, but how exactly will that be done? Social media users need to be told

Madeline Palacz
Tuesday 29 October 2019 17:27 GMT

On Sunday, Instagram announced that it would be extending its pledge to remove all graphic images of self-harm from its platform to incorporate “fictional depictions of self-harm”. That includes drawings, cartoons and memes about suicide, in addition to any other method “promoting” self-harm. It is a welcome move from a key player in the debate over online harms, but it represents a small drop in what is fast becoming an even murkier ocean.

The regulation of harmful yet non-illegal content remains a concept devoid of clarity. Even the government seems perplexed by the issue – the plans set out in the online harms white paper, published earlier this year, were widely criticised as lacking definition. The question, it seems, is proving more complicated than the answer. In the meantime, and in the absence of any form of regulation, tech giants have taken action in response to public pressure, which has been growing since the tragic death of British teenager Molly Russell. By exposing the scale of the issue, her death has rightly lent the debate a new sense of urgency.

In February this year, Instagram pledged to remove graphic self-harm related content and not to show non-graphic self-harm related content in its search, hashtags and the explore tab. Instagram says that between April and June it removed 834,000 pieces of content from its site. Yet the lack of transparency over the specific nature of this content, who (or what) made the decision to remove it, and on what basis, should give us all cause for concern as Instagram moves into this new phase.

Memes, for example, which will now be the subject of Instagram’s scrutiny, are a relatively new phenomenon. They are widely considered to be an expression of a cultural idea or practice. They can be interpreted as being humorous, critical, divisive – even dangerous. The multifaceted meanings attached to one small box on the internet will require careful consideration by Instagram if it wishes to be consistent in its decision making.

By extending its pledge to remove harmful content to cover more creative forms of self-expression, Instagram is moving into grey territory. It will need to take care to avoid any unnecessary infringement of a person’s right to speak about their personal experiences. How might Instagram resolve a case where a meme’s image shows non-graphic, self-harm related content, but the text which accompanies it encourages survival? When non-graphic images of self-harm such as healed scars were blurred by Instagram, the hashtag #youcantcensormyskin ignited debate about why such content is considered harmful.

A balance must be struck between the rights of those individuals affected by self-harm or thoughts of suicide who wish to share their experiences, and the potential harm which sharing those stories might cause to others. Since February, algorithms have helped to remove a large quantity of graphic self-harm content from the site. Yet Instagram has so far offered no public explanation of why its algorithms deemed certain content to require censorship. This sets a dangerous precedent. It remains to be seen how an algorithm will deal with the nuances involved in more creative forms of self-expression, such as memes and drawings. Transparency and accountability around such decision-making will be vital to establish and maintain public trust in the platform.

Instagram was right to acknowledge the complexity of the issue and that it will need time to fully implement its new strategy. “It’s not going to be the last step we take,” Instagram’s chief Adam Mosseri confirmed.

Instagram’s next step is already overdue. It will need to be a crucial one. While the removal of graphic self-harm related content is certainly welcome, it leaves a central problem largely untouched: the influence of Instagram’s algorithms.


Instagram’s popularity comes from its ability to give a user’s most “relevant posts” the most visibility. What shows up first in your feed is determined by what posts and accounts you engage with the most, as well as other contributing factors such as the timeliness of posts, how often you use Instagram, and how many people you follow. This is undoubtedly a valuable tool, which allows a user to cultivate their own personal space on the internet. However, according to her father, it was Instagram’s algorithm which allowed “similar content” to be “pushed” on Molly as she viewed images of self-harm.
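How that kind of ranking tends to feed on itself is easier to see with a toy example. The sketch below is a minimal, hypothetical model of an engagement-weighted scoring rule; Instagram’s real system is proprietary, so every signal, weight and name in it is an assumption made purely for illustration:

```python
# A purely illustrative sketch of engagement-weighted feed ranking.
# Instagram's real ranking system is proprietary: every signal, weight and
# name here is an assumption, not a description of the actual algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    hours_old: float

def score(post: Post, topic_engagement: dict, author_engagement: dict) -> float:
    """Rank a post higher the more the user has engaged with its topic and
    author, discounted by how old it is (a simple recency decay)."""
    topic_affinity = topic_engagement.get(post.topic, 0)
    author_affinity = author_engagement.get(post.author, 0)
    recency = 1.0 / (1.0 + post.hours_old)
    return (1 + topic_affinity) * (1 + author_affinity) * recency

# A user who has interacted heavily with one kind of content sees that kind
# of content rise to the top of the feed -- more of the same gets "pushed".
posts = [
    Post(author="friend_a", topic="travel", hours_old=2),
    Post(author="stranger_b", topic="self_harm_adjacent", hours_old=5),
    Post(author="friend_c", topic="sport", hours_old=1),
]
topic_engagement = {"self_harm_adjacent": 60, "travel": 3}
author_engagement = {"friend_a": 5}

for post in sorted(posts, key=lambda p: score(p, topic_engagement, author_engagement), reverse=True):
    print(post.topic, round(score(post, topic_engagement, author_engagement), 2))
```

In a toy model like this, the content a user has engaged with most is precisely the content the ranking keeps surfacing – the dynamic Molly’s father described.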

As users of the platform, we seem to have little control over the alien concept of the algorithm. There is certainly a lack of available information on whether it is even possible for a user to substantially alter the algorithm once it becomes established on a page, should they wish to do so. It remains to be seen how Instagram will address this thorny issue. It will need to admit that its algorithm, its core appeal, might be just as harmful as the content which it is pledging to remove.

If you are experiencing feelings of distress and isolation, or are struggling to cope, The Samaritans offers support; you can speak to someone for free over the phone, in confidence, on 116 123 (UK and Ireland), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch.

For services local to you, the national mental health database – hubofhope.co.uk – allows you to enter your postcode to search for organisations and charities who offer mental health advice and support in your area.

Madeline Palacz is The Independent’s editorial compliance manager
