Parents have long been frustrated by internet platforms that treat teen accounts much the same as adult accounts. Data-collection laws protecting children under 13 don’t extend to older minors. Social media’s biggest operator is addressing this with a shift in content filtering.
Meta Platforms plans to automatically restrict teen Instagram and Facebook accounts from harmful content including videos and posts about self-harm, graphic violence and eating disorders. The changes are expected to roll out in the coming weeks.
This marks the biggest change the tech giant has made to ensure younger users have a more age-appropriate experience on its social-media sites. The new content restrictions come as more than 40 states are suing Meta, alleging the tech company misled the public about the dangers its platforms pose to young people.
State attorneys general, in their October lawsuit against Meta, cited internal Meta documents showing that the company designed its products to capitalize on young users’ predisposition to peer pressure and potentially risky behavior. Meta in November told the Journal it didn’t design its products to be addictive for teens.
Teen accounts, meaning accounts of users under 18 based on the birth date entered at sign-up, will automatically be placed into the most restrictive content setting, known as Sensitive Content Control on Instagram and Reduce on Facebook. Teens under 16 also won't be shown sexually explicit content. Previously, teens could choose less stringent settings; they can't opt out of the new ones.
The new restricted status of teen accounts means teens won't be able to see or search for harmful content, even if it is shared by a friend or someone they follow. For example, if a teen's friend has been posting about dieting, those posts will no longer be visible to the teen. However, teens might still see content related to a friend's recovery from an eating disorder.
Teens won’t necessarily know what they aren’t seeing, a company spokeswoman said, because the content simply won’t be available to them. Meta says it consulted with experts in adolescent development to determine what types of content are inappropriate for teens.
Meta said its algorithms already avoid recommending harmful content to teens in its video Reels and Explore page. With the new changes, such content will no longer be shown to teens in their Feed and Stories.
The changes will be applied automatically to existing teen accounts starting this week, and newly created teen accounts will be restricted to age-appropriate content from the outset.
Resources for help
When teens search for terms related to suicide, self-harm and eating disorders, Instagram and Facebook will hide related results and direct them to expert resources for help. The company already hides results for suicide and self-harm search terms that break the platforms’ rules; now Meta is extending the protection to include additional terms.
No system is perfect. Teens have found workarounds on social media to search for restricted content by misspelling words or creating new search terms.
Meta is also introducing a tool to make teens’ sharing settings more private on Instagram. A notification will surface that offers teen users the option to “turn on recommended settings” with one tap. The notification pops up in situations where the teen account is tagged by, or has some other interaction with, an unknown account.
Once account holders turn on the recommended settings, their accounts will restrict who can repost their content, tag or mention them, or include their content in Reels Remixes. With those settings, only teens’ followers can message them as well.
—For Family & Tech columns, advice and answers to your most pressing family-related technology questions, sign up for my weekly newsletter.
Write to Julie Jargon at Julie.Jargon@wsj.com