Instagram will force millions of teens into protected accounts

Clare Duffy, CNN | 9/17/2024, 12:57 p.m.
Instagram, pictured here in 2021 on a smartphone in Berlin, is rolling out new safety features for teen users. (Fabian Sommer/dpa/picture alliance/Getty Images via CNN Newsource)

 Instagram on Tuesday announced its most dramatic effort yet to protect young users from dangers on its platform, implementing new “teen account” settings that will automatically make millions of teen accounts private and restrict what kinds of content those users can view on the app.

The change to how Instagram lets teens use its platform comes nearly three years after the explosive “Facebook Papers” first drew mass attention to the risks the platform poses for young users.

The new restrictions are also designed to push teens to adopt parental supervision through the app. Instagram will automatically apply the new “teen accounts” settings to all users under the age of 18. After the update, 16- and 17-year-old users will be able to manually revert to their preferred settings, but 13- to 15-year-old users will be required to obtain parental approval for any such changes.

The new “teen accounts” settings build on more than 30 well-being and parental oversight tools parent company Meta has rolled out in recent years, such as “take a break” nudges and restrictions on “age inappropriate” content like posts about eating disorders. Despite those earlier updates, the company has continued to face criticism for placing too much responsibility for safety in the hands of parents and, in some cases, teens themselves. The parental supervision tools, for example, have relied on teens telling their parents that they are on the app.

Pressure on Meta to do more to protect teens ramped up again after another Facebook employee-turned-whistleblower, Arturo Bejar, testified at a November Senate subcommittee hearing that Meta’s top executives, including CEO Mark Zuckerberg, had ignored warnings for years about harms to teens on its platforms.

Court documents from recent lawsuits against the company have also alleged that Zuckerberg repeatedly thwarted teen well-being initiatives, that Meta knowingly refused to shut down accounts belonging to children under the age of 13 and that the company has enabled child predators.

At a Senate hearing in January, Zuckerberg apologized to families who said their children had been harmed by social media.

Meta says the most recent changes aim to “address parents’ biggest concerns: who their teens are talking to online, the content they’re seeing and whether their time is being well spent.”

The “teen accounts” update means that accounts for users under 18, both new and existing, will automatically be set to private and placed in the strictest messaging settings. The revision will allow teen users to receive messages only from people they are already connected to. Instagram will also limit who can tag teens in photos or mention them in comments to only people they follow.

Additionally, teens will be placed into Instagram’s most restrictive content control settings. The shift limits the types of “sensitive” content teens can see on their Explore page and in Reels, such as posts promoting cosmetic procedures.

Instagram had already begun implementing that strategy in a more limited fashion earlier this year.

Teen users will also receive time-limit reminders nudging them to leave the app after spending one hour on it each day. And the app will default to “sleep mode,” muting notifications and sending auto-replies to direct messages between 10 p.m. and 7 a.m.

Instagram plans to begin applying the changes to all teen accounts in select countries, including the United States, starting next week.

The app will also add new features to its parental supervision tools, allowing parents to see which accounts their teen has recently messaged, set total daily time limits for their teen’s Instagram use, block teens from using Instagram at night or during other specific time periods, and see the topics their teen has chosen to see content from on the app.

The changes are expected to be made to all teen accounts in the United States, the United Kingdom, Canada and Australia within the next 60 days, before rolling out to other countries later this year and next.

But the effectiveness of some of the changes may be hampered by a simple truth: Meta has no way to know for sure whether it is actually a parent, rather than, say, an older friend, monitoring a teen’s account. Meta does not formally verify parents; instead, a spokesperson said, it relies on signals, such as the adult user’s birthdate and how many other accounts they supervise, to determine whether that user should be allowed to oversee a teen’s account.

Meta has also long faced criticism for failing to do more to prevent teens from lying about their age when they create a new account to bypass safety restrictions.

The company says it is implementing artificial intelligence technology that will aim to identify teen accounts that may have wrongly listed an adult birthdate.

Meta says the new features were developed in consultation with its Safety Advisory Council, a group of independent online safety experts and organizations, and a group of youth advisers, as well as with feedback from other teens, parents and government officials.