Australia will require social media platforms to act to prevent online harm to users

Rod McGuirk
Thursday 14 November 2024 07:12 GMT

Australia plans to require social media platforms to act to prevent online harms to users such as bullying, predatory behavior and algorithms pushing destructive content, the government said Thursday.

“The Digital Duty of Care will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms,” Communications Minister Michelle Rowland said in a statement.

The proposed changes to the Online Safety Act were announced ahead of world-first legislation, to be introduced to Parliament next week, that would ban children younger than 16 from platforms including X, Instagram, Facebook and TikTok.

Critics have argued that removing children from social media would reduce incentives for platforms to provide safer online environments.

Social media has been blamed for an increase in children taking their own lives and developing eating disorders due to bullying and exposure to negative body images.

Rowland said making tech companies legally responsible for keeping Australians safe was an approach already adopted by Britain and the European Union.

Digital businesses would be required to take reasonable steps to prevent foreseeable harms on their platforms and services. The duty of care framework would be underpinned by risk assessment and risk mitigation, and informed by safety-by-design principles, the minister said.

Legislating a duty of care would mean services can’t “set and forget.” Instead, they would be obliged to continually identify and mitigate potential risks as technology and service offerings change and evolve, she said.

The categories of harm covered by the legislation include harm to young people, harm to mental well-being, the promotion of harmful practices, and illegal activity.

The government has not said when the duty of care legislation will be introduced to Parliament or outlined the punishment for breaches.

The Digital Industry Group Inc., an advocate for the digital industry in Australia better known as DIGI, welcomed government efforts to “future-proof” the Online Safety Act.

“DIGI’s members together represent some of the safest sections of the Internet, and their work to keep people safe on their services never stops,” DIGI managing director Sunita Bose said in a statement.

“While we wait for further details about this announcement, DIGI’s members will continue to deliver safety-by-design on their services and work constructively with the government to keep Australians safe online,” Bose added.

Swinburne University digital media expert Belinda Barnet described the duty of care as a “great idea.”

“It’s quite pioneering to expect that platforms that host Australian users would have a duty of care responsibility in terms of the content they show and the experiences they offer,” Barnet said.

“It’s making the platforms take responsibility and that just simply doesn’t happen at the moment. There’s an assumption that they’re a neutral third party. They’re not responsible for the impact of that content,” Barnet added.
