Instagram is building an app for kids - here’s what you need to know
The Facebook-owned app will be targeted at children under 13 with greater restrictions
Social media was not built with children in mind. Facebook was originally made for college kids, Instagram stemmed from its founder’s love of bourbon, and YouTube started as a video dating site.
But teens are active on social media, and many children under 13 already have online social lives. They build worlds together in Minecraft, FaceTime with friends, and send texts and emoji through tools like Facebook Messenger Kids. But they also use apps and browse an Internet that wasn’t designed with them in mind.
Before letting children fall in love with TikTok, fall into YouTube holes or start their own Instagram accounts, parents need to weigh which social media, if any, is right for their family. It’s a complicated question, especially when other forms of socialization for kids are still on hold in many parts of the country.
Adding to the conundrum, companies are increasingly making tools specifically for the Internet’s youngest users, who are old enough to type words on a smartphone or computer, but too young for existing social media apps. There are already YouTube Kids and Facebook Messenger Kids. Now Facebook is working on a version of Instagram specifically for children who are under 13.
In the United States, federal laws limit the tracking and targeting of people younger than 13, which companies have often gotten around by using weak age verification. To get on popular sites and apps, children might borrow an adult’s account, have their parents make one for them or lie about their age and start their own. Or in the case of YouTube, just open it in a browser - maybe even on a school-provided Chromebook.
With a looming threat of privacy regulation, increasing competition for young users and a desire to hook children on an online ecosystem before they enter middle school, social media companies are branching out. Here are some questions parents should ponder before signing their children up.
- What are the biggest worries about letting kids on social media?
Parents’ main concern about allowing young kids on social media is exposure to sexual content and predators, according to Titania Jordan, the chief parenting officer at online-monitoring company Bark.
She worries that giving kids a screen-based alternative to in-person interaction is a bad idea no matter what precautions are included. Screen time concerns, however, have been put on a back burner by many during the pandemic, as parents and children have more pressing things to worry about and fewer options for in-person socialization.
Not all online interactions are the same. While some parents might be okay with text-based communication, something like Instagram would raise different issues. A photo-based social experience could affect self-esteem and mental health more than just one-on-one texts.
The experts we spoke to are specifically worried about the companies behind these apps and, in the case of Instagram’s plans, bristle at Facebook’s track record. “Facebook’s priority isn’t protecting children; they’re a for-profit company looking to monetize time spent,” Jordan said.
Common Sense Media’s CEO, Jim Steyer, agrees: “This is basically Facebook digging back into their old bag of tricks to get young kids hooked when they’re most vulnerable.”
- Why are tech companies making apps for young kids?
Children are one of the next big untapped online markets, and major tech companies may be interested in appealing to people before they are 13. That requires making a product that parents approve of so they’re not worried about issues like predators or radicalization.
“At Disney, we called it cradle to cane. If you got a kid excited about the Disney brand - excited about princesses at age 3, 4 and 5 - and you could keep that engagement . . . you’ve created a lifelong attachment,” said KC Estenson, a former Disney executive and current CEO of GoNoodle, an app that makes videos, music and games for young kids.
There is also growing pressure on legislators to regulate how Big Tech companies track and handle younger users. By creating apps that claim to be safer on their own, companies like Facebook could be trying to hold off any additional laws that would force them to be even stricter about things like data collection.
- What features should I look for before letting my kids sign up for a social media app?
If you end up considering a social network for your child, here are features and policies you should check first. Jordan recommends looking out for ephemeral features that make it harder to monitor communications, such as a vanishing mode or, in the case of Instagram, its Stories feature, which removes posts after 24 hours.
Review direct messaging features, and make sure only approved contacts can communicate with your child. Look for options that let a parent approve contacts, like in Facebook’s Messenger Kids. Check the parental monitoring features, and see how much control you would really have - and if your kid can turn them off without you getting notified.
“Ask, is the app specifically designed for kids? If not, you should totally be on alert,” said Steyer, whose Common Sense Media, a nonprofit advocacy group, reviews kids’ content.
He recommends looking at an app’s business model, avoiding anything that is based on targeted ads and being wary of businesses that make their money on in-app purchases. See whether there is an associated adult app, like with Messenger Kids, and consider whether the children’s version is just a way to sign up users and get them on the main site when they’re old enough, Steyer said.
Look beyond promises of safety to see how much data an app is collecting about your child. Does it track or share a device’s geolocation? If it does, look to see how much of those settings you can turn off, and avoid anything that won’t let you opt out.
- And if I do decide to let my kids use adult or kid versions of social media, then what?
Have an honest conversation with your kids about what to look out for online, including bullying, predatory behavior and inappropriate content. Also, keep close tabs on their mental health. Bark, which is used to track the online activity of 5.4 million children, says an annual survey showed Instagram was frequently flagged for suicidal ideation, depression and body image concerns.
Don’t make social media education a one-time conversation, either. “Get to know the platforms they’re on, the games they play, the people they follow and regularly edit those things. Kids are able to have structured adult conversation about what’s right and wrong,” Estenson said.
- Why not just keep them offline?
Taking away phones and computers, banning screen time and video games, and forbidding social media are also options. But the pandemic has shown us that children, with the right guidance and a little space, can find fulfillment and friendships online. If you accept that they’re going to be on the Internet in some way or another, the next step is trying to make it ready for them.
“We need to build a world for them online; it takes a lot of people to do that,” Estenson said. “We need to try and do it with noble intention, not just to make a buck. We need it because the kids are already there.”
- What is Instagram for kids, and does it even exist?
Last week, BuzzFeed News reported that Facebook was working on a children’s version of Instagram - the popular photo-sharing app it bought in 2012. The new app, which was announced by the company internally, would be specifically for users 12 and younger. Officially, Instagram proper is only for people who are 13 and older, but there is no strict age verification, and many younger kids have their own accounts, often with their parents’ permission.
There is no release date for the Instagram app for children. In a statement, the company said kids were asking to “keep up” with their friends, and that’s why it was working on additional social media apps to be “suitable for kids, managed by parents.” Instagram recently hired Pavni Diwanji, a Google executive who oversaw the development of YouTube Kids.
- How would it be different from regular Instagram?
There aren’t many details on what an Instagram for kids would look like, or what would make it different from or safer than grown-up Instagram. But some clues can be found in a recent blog post from the company.
Last week, Instagram outlined some ways it was trying to make its main app safer for teenage users, including using artificial intelligence to make age-checks more accurate and harder to fake. It added a restriction that prevents adults from sending messages to users who have said they are under 18, unless the younger person already follows them. The company is also adding safety notices for teens when it detects an adult is acting suspiciously, say by mass messaging younger people.
Instagram has also been experimenting with downplaying likes on photos. Limiting that kind of feedback could be key in an under-13 Instagram offering, which would want to avoid the kind of FOMO and pressure to look good that is common on the main app.
We can also look at Facebook’s existing product for kids, Facebook Messenger Kids. Released in 2017 to a flood of criticism, the app had some early issues that the company addressed and is now widely used without much attention. It does not require kids to sign up for Facebook and is controlled through a parent’s account.
- Do kids even want to use apps made for their age group?
Well-designed children’s apps that put privacy first and have strict safeguards to protect young users from harassers and predators might sound great to parents, but not always to the target users. Many children might prefer the less restricted, adult versions and find ways to access them. YouTube Kids, for example, has been that company’s attempt to create a safer space away from the problematic, untamed world of regular YouTube, but kids of all ages are still flocking to the main site. And TikTok, which has no option for people under 13 in the United States, is enormously popular with young users and creators.
© Washington Post