Why Utah Governor Says New Teen Instagram Accounts Aren’t Enough

NEW YORK — Instagram announced Tuesday that it has launched its most dramatic initiative yet to protect young users from dangers on its platform, introducing new “teen account” settings that will automatically make millions of teen accounts private and limit the types of content those users can view on the app.

The change in how Instagram allows teenagers to use its platform comes almost three years after the Facebook Papers scandal first brought to public attention the dangers the platform posed to young users.

The new restrictions are also intended to encourage teens to adopt parental supervision through the app. Instagram will automatically apply new “teen account” settings to all users under the age of 18. After the update, users aged 16 and 17 will be able to manually change the app back to their preferred settings, but users aged 13 to 15 will need to get parental consent for any such changes.

The new “teen accounts” settings build on the more than 30 wellness and parental control tools that parent company Meta has introduced in recent years, such as “take a break” prompts and restrictions on “age-inappropriate” content like posts about eating disorders. Despite these earlier updates, the company has still faced criticism for placing too much responsibility for safety in the hands of parents and, in some cases, teens themselves. The parental control tools, for example, relied on teens to let their parents know they were on the app.

Pressure on Meta to do more to protect teens intensified again after former Facebook engineer and whistleblower Arturo Bejar told a Senate subcommittee in November that top Meta executives, including CEO Mark Zuckerberg, had for years ignored warnings about the dangers its platforms posed to young users.

Court documents from recent lawsuits against the company also allege that Zuckerberg repeatedly blocked initiatives aimed at protecting the health of teens, that Meta knowingly refused to close accounts belonging to children under 13, and that its platforms enabled pedophiles to reach minors.

During a Senate hearing in January, Zuckerberg apologized to families who said their children had been harmed by social media.

Meta says the latest changes are designed to “address parents’ biggest concerns: who their teens are talking to online, what content they’re watching, and whether they’re using their time well.”

But for Utah Gov. Spencer Cox, the changes don’t go far enough. Utah has lawsuits against social media companies, including Instagram, accusing them of harming the mental health of Utah’s youth.

Cox responded to the changes by saying, “Utah has always been at the forefront of protecting our children in the digital age, and we appreciate Meta taking a step in the right direction by announcing teen accounts. Many of these new features mirror our recently passed laws, demonstrating a growing awareness of the responsibility social media companies have to their younger users. However, while these are positive steps, we believe they do not go far enough to ensure the safety and well-being of Utah’s children online. We encourage Meta and all social media platforms to continue to innovate and implement even stronger protections for minors.”

The “teen accounts” update means that accounts for users under 18, both new and existing, will automatically be set to private and placed on the strictest messaging settings, allowing teen users to receive messages only from people they are already connected to. Instagram will also restrict tagging teens in photos and mentioning them in comments to people those teens follow.

Additionally, teens will be placed in Instagram’s most restrictive content control settings. This change limits the types of “sensitive” content teens can see on their Explore page and in Reels, such as posts promoting beauty treatments.

Instagram began implementing this strategy on a more limited basis earlier this year.

Teens will also receive time-out reminders prompting them to exit the app after 60 minutes of use each day. The app will go into “sleep mode” by default, silencing notifications and sending automatic replies to direct messages between 10 p.m. and 7 a.m.

Instagram plans to begin rolling out the changes to teen accounts in select countries, including the United States, starting next week.

The app will also add new features to its parental supervision tool, allowing parents to see which accounts their teen has recently messaged, set an overall daily limit on how much time their teen can spend on Instagram, prevent teens from using Instagram at night or during certain times, and see what topics their teen has chosen to watch on the app.

The changes are expected to reach all teen accounts in the US, UK, Canada and Australia within 60 days, and to other countries later this year and into next year.

But the effectiveness of some of the changes could be limited by a simple truth: Meta has no way of knowing for sure whether a parent is actually monitoring a teen’s account, as opposed to, say, an older friend. Meta does not conduct formal parental verification; instead, a company spokesperson said, it relies on signals such as an adult user’s date of birth and the number of other accounts that user supervises to determine whether the user should be allowed to supervise a teen’s account.

Meta has long faced criticism for failing to prevent teenagers from lying about their age when creating new accounts in order to bypass safety restrictions.

The company says it is implementing artificial intelligence technology designed to identify teen accounts that list an adult date of birth.

Meta says the new features were developed in consultation with its Security Advisory Board, which includes independent experts and organizations focused on online safety, as well as a group of youth advisors. It also took into account the opinions of other teens, parents and government representatives.