YouTube, TikTok, Snap execs face senators on kids’ safety
The leaders of a Senate panel have called executives from YouTube, TikTok and Snapchat to face questions on what the companies are doing to ensure young users’ safety
Bearing down on hugely popular social media platforms and their impact on children, the leaders of a Senate panel have called executives from YouTube, TikTok and Snapchat to face questions on what their companies are doing to ensure young users’ safety.
The Senate Commerce subcommittee on consumer protection is fresh off a highly charged hearing with a former Facebook data scientist, who laid out internal company research showing that Facebook's Instagram photo-sharing service appears to seriously harm some teens.
The panel is widening its focus to examine other tech platforms, with millions or billions of users, that also compete for young people’s attention and loyalty.
The three executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc. — are due to appear at a subcommittee hearing Tuesday.
The three platforms are woven into the fabric of young people’s lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.
“We need to understand the impact of popular platforms like Snapchat, TikTok and YouTube on children and what companies can do better to keep them safe,” Sen. Richard Blumenthal, D-Conn., the subcommittee’s chairman, said in a statement.
The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy, Blumenthal says. The aim is to develop legislation to protect young people and give parents tools to protect their children.
The video platform TikTok, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.
TikTok denies allegations, most notably from conservative Republican lawmakers, that it operates at the behest of the Chinese government and provides it with users’ personal data. The company says it stores all TikTok U.S. data in the United States. The company also rejects criticisms of promoting harmful content to children.
TikTok says it has tools in place, such as screen time management, to help young people and parents moderate how long children spend on the app and what they see. The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users.
Early this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for the under-18 crowd.
A separate House committee has investigated video service YouTube Kids this year. Lawmakers said the YouTube offshoot feeds children inappropriate material in “a wasteland of vapid, consumerist content” so it can serve ads to them. The app, with both video hosting and original shows, is available in about 70 countries.
A panel of the House Oversight and Reform Committee told YouTube CEO Susan Wojcicki that the service doesn’t do enough to protect children from potentially harmful material. Instead it relies on artificial intelligence and self-policing by content creators to decide which videos make it onto the platform, the panel’s chairman said in a letter to Wojcicki.
Parent company Google agreed in 2019 to pay $170 million to settle allegations by the Federal Trade Commission and New York state that YouTube collected personal data on children without their parents’ consent.
Despite changes made after the settlements, the lawmaker’s letter said, YouTube Kids still shows ads to children.
YouTube says it has worked to provide children and families with protections and parental controls, such as time limits, and to limit viewing to age-appropriate content. It emphasizes that the 2019 settlements involved the primary YouTube platform, not the kids’ version.
“We took action on more than 7 million accounts in the first three quarters of 2021 when we learned they may belong to a user under the age of 13 — 3 million of those in the third quarter alone — as we have ramped up our automated removal efforts,” Miller, the Google vice president, says in written testimony prepared for the hearing.
Snap Inc.'s Snapchat service allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers. Hence its “Ghostface Chillah” faceless (and word-less) white logo.
Only 10 years old, Snapchat says an eye-popping 90% of 13- to 24-year-olds in the U.S. use the service. It reported 306 million daily users in the July-September quarter.
The company agreed in 2014 to settle the FTC’s allegations that it deceived users about how effectively the shared material vanished and that it collected users’ contacts without telling them or asking permission. The messages, known as “snaps,” could be saved by using third-party apps or other ways, the regulators said.
Snapchat wasn’t fined but agreed to establish a privacy program to be monitored by an outside expert for the next 20 years — similar to oversight imposed on Facebook, Google and Myspace in privacy settlements in recent years.
__
Follow Marcy Gordon at https://twitter.com/mgordonap