X chief Linda Yaccarino admits at Senate hearing platform needs more moderators

‘We’ve increased the number of trust-and-safety employees across the world,’ the X boss said

Vishwam Sankaran
Thursday 01 February 2024 06:48 GMT

X chief Linda Yaccarino told a US Senate committee that the company – formerly Twitter – is expanding its trust and safety staff, acknowledging that it needs more content moderators.

The chief executives of several big tech companies, including X, Meta, Snap, Discord and TikTok, gathered in Washington, DC, on Wednesday to testify before the Senate about child safety policies.

Ms Yaccarino, who was subpoenaed to attend the hearing, said X had increased its trust and safety staff by 10 per cent over the past year and planned to hire 100 new moderators to curb child sexual exploitation content.

Ms Yaccarino's claim that the platform is hiring more moderators raised eyebrows, given that its owner, Elon Musk, fired thousands of employees, including outsourced content moderators, just weeks after buying the company in 2022.

Senator Peter Welch, a Democrat from Vermont, asked Ms Yaccarino how many of those kinds of employees X had “before.”

She replied that “the company is just coming through a significant restructuring, so we’ve increased the number of trust-and-safety employees across the world in the last 14 months.”


Although Mr Musk claimed after his takeover of the company that "[r]emoving child exploitation is priority #1", reports revealed that the global team responsible for tackling such content on the site was overwhelmed after the mass layoffs at X.

Australia's online safety regulator, the eSafety Commissioner, reported last month that X had slashed its global trust and safety staff by 30 per cent since Mr Musk's takeover of the company in 2022, including an 80 per cent reduction in the number of safety engineers.

The watchdog, which describes itself as the world's first government agency dedicated to keeping people safer online, said the cuts were even steeper in the Asia-Pacific region, at 45 per cent.

The report pointed out that the number of full-time content moderators had been cut by more than half, from 107 to 51, while the number of content moderators employed on contract fell by over 10 per cent, from 2,613 to 2,305.

Even if the company adds 100 more content moderators, as Ms Yaccarino told the Senate it would, that may not be nearly enough to curb content related to child exploitation.

In her testimony, Ms Yaccarino repeated that fewer than 1 per cent of X users in the US are under 18.

It is unclear how X arrived at this figure, as a 2023 Pew Research Center survey suggested that nearly a fifth of 13- to 17-year-olds in the US use the microblogging platform.

X did not immediately respond to The Independent’s request for comment.
