
Instagram makes teen accounts private as app faces growing pressure to protect kids

Instagram is making teen accounts private by default in an effort to make the platform safer for children amid growing criticism of the impact of social media on young people’s lives.

Starting Tuesday in the US, UK, Canada and Australia, anyone under 18 who signs up for Instagram will be placed on restrictive teen accounts, and those with existing accounts will be migrated over the next 60 days. Teen accounts in the European Union will be adjusted later this year.

Meta, Instagram’s parent company, acknowledges that teens may lie about their age and says it will require them to verify their age in more cases, such as when they try to create a new account with an adult’s birthdate. The Menlo Park, California-based company also said it is developing technology that proactively finds accounts of teens pretending to be adults and automatically places them in restricted teen accounts.

Teen accounts will be private by default. Private messages are restricted, so teens can only receive messages from people they follow or are already connected to. “Sensitive content,” such as videos of people fighting or promoting beauty treatments, will be restricted, Meta said. Teens will also receive notifications if they’re on Instagram for more than 60 minutes, and a “sleep mode” will be enabled that turns off notifications and sends automatic replies to direct messages from 10 p.m. to 7 a.m.

While these settings will be enabled for all teens, 16- and 17-year-olds will be able to turn them off. Children under 16 will need parental consent to do so.

“The three concerns we hear from parents are that their teens are seeing content they don’t want to see, that they’re being contacted by people they don’t want to be contacted by, or that they’re spending too much time on the app,” said Naomi Gleit, Meta’s head of product. “So the teen accounts are really focused on addressing those three concerns.”

The statement comes after the company faced lawsuits from dozens of U.S. states accusing it of harming young people and contributing to the youth mental health crisis by knowingly and intentionally designing features on Instagram and Facebook that make children addicted to those platforms.

While Meta did not provide details on how the changes might affect its business, the company said the changes could mean that teens will be using Instagram less in the short term. Emarketer analyst Jasmine Enberg said the impact on revenue “will likely be minimal.”

“Even if Meta continues to put teen safety first, it’s unlikely to make radical changes that could take a major financial hit,” she said, adding that teen accounts are unlikely to deter teens’ engagement with Instagram “in the slightest, as there are still plenty of ways to get around the rules, and it may even make them more motivated to bypass the age restrictions.”

New York Attorney General Letitia James said the announcement by Meta was “an important first step, but much more needs to be done to ensure our children are protected from the harms of social media.” James’ office is working with other New York officials on how to implement a new state law aimed at limiting children’s access to what critics call addictive social media channels.

Others were more critical. Nicole Gil, co-founder and executive director of the nonprofit Accountable Tech, called Instagram’s announcement “the latest attempt to avoid real independent oversight and regulation and instead pursue self-regulation, putting the health, safety and privacy of young people at risk.”

“Today’s PR efforts fall short of the safety and accountability that young people and their parents deserve, and that only meaningful policy can provide,” she said. “Meta’s business model is based on addicting users and mining their data for profit; no amount of parental and teen controls like Meta is proposing will change that.”

Sen. Marsha Blackburn (R-Tenn.), co-sponsor of the Kids Online Safety Act that recently passed the Senate, questioned the timing of the announcement “on the eve of the House of Representatives’ consideration” of the bill.

“Like clockwork, just as the Kids Online Safety Act is moving forward, the industry comes up with a new set of self-enforced guidelines,” she said.

Meta’s past efforts to address teen safety and mental health on its platforms have also drawn criticism that the changes don’t go far enough. For example, while kids will get a notification when they spend 60 minutes on the app, they can still dismiss it and continue scrolling, unless their parents enable “parental supervision” mode, which lets them cap the teen’s time on Instagram at a set amount, such as 15 minutes.

With its latest changes, Meta is giving parents more options to supervise their children’s accounts. Those under 16 will need permission from a parent or guardian to change their settings to less restrictive ones. They can do this by setting “parental supervision” on their accounts and connecting them to a parent or guardian.

Nick Clegg, Meta’s president of global affairs, said last week that parents are not using the parental controls the company has introduced in recent years.

Meta’s Gleit believes teen accounts will encourage parents to start using those supervision tools.

“Parents will be able to see, through the family center, who is messaging their teen and hopefully have a conversation with their teen,” she said. “If there is bullying or harassment, parents will have insight into who their teen is following, who is following their teen, who their teen has been messaging in the last seven days and hopefully be able to have some of those conversations and help them navigate those really difficult online situations.”

U.S. Surgeon General Vivek Murthy said last year that tech companies place too much responsibility on parents when it comes to keeping children safe on social media.

“We’re asking parents to steward technology that’s evolving rapidly and that’s fundamentally changing how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that previous generations never had to deal with,” Murthy said in May 2023.