
Tumblr app disappears on iPhone after child sex abuse images found on blogs

'We really appreciate your patience'

Andrew Griffin
Tuesday 20 November 2018 10:32 GMT
Tumblr says it was targeted by Russian hackers for the spread of 'fake news' during the 2016 US presidential election (Reuters)

Tumblr's app has disappeared from the iPhone App Store after child sex abuse images were found on its platform.

The company says it is acting urgently to remove the content and to have the app restored to the App Store.

The app mysteriously disappeared last week, removed from the App Store without any announcement. While that did not take it off existing phones, it meant the app would no longer receive updates and that new users could not download it.

Since then, speculation has been rife about why Apple removed the app from its store.

Tumblr has always had a relatively permissive attitude towards explicit content, and some had speculated that it was the availability of pornography and other explicit images and videos that led to the app being taken off the App Store.

But Tumblr said the removal was actually the consequence of finding "child sexual abuse material" being traded around the platform. Ordinarily, all content uploaded to Tumblr and other sites is scanned against a database of known images of such material, to stop it from being uploaded. But some of the content apparently was not in that database and so made it onto the site, Tumblr said.
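In broad terms, this kind of screening compares a fingerprint of each uploaded file against a shared database of fingerprints of known material, and blocks any match before it is published. The sketch below is an illustration only, not Tumblr's actual pipeline: it uses an exact SHA-256 digest and a hypothetical hash set, whereas production systems rely on perceptual hashing (such as Microsoft's PhotoDNA) so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests standing in for an industry database
# of known abuse material (e.g. hashes shared via NCMEC). A cryptographic
# hash only catches byte-identical copies; real systems use perceptual
# hashes that survive resizing and re-encoding.
KNOWN_BAD_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00112233445566778899aabbccddeeff",
}


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def screen_upload(path: Path) -> bool:
    """Return True if the upload may proceed, False if it matches a known-bad entry."""
    return file_digest(path) not in KNOWN_BAD_HASHES
```

As the company's statement notes, the weakness of any such system is coverage: content that has not yet been added to the shared database will pass the check, which is why Tumblr says its routine audits look for material the scan missed.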

"We’re committed to helping build a safe online environment for all users, and we have a zero tolerance policy when it comes to media featuring child sexual exploitation and abuse," it said in a new statement posted to a page devoted to updates on its efforts to make the app available again.

"As this is an industry-wide problem, we work collaboratively with our industry peers and partners like NCMEC to actively monitor content uploaded to the platform. Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform.

"A routine audit discovered content on our platform that had not yet been included in the industry database. We immediately removed this content. Content safeguards are a challenging aspect of operating scaled platforms. We’re continuously assessing further steps we can take to improve and there is no higher priority for our team."

It also thanked users for their patience and said it was doing all it could to get the app back online.
