Three years ago, Meta restricted adults from sending messages to children who were not on their friends list; now it is extending those rules to the children's peers as well.
Users under 16 (or under 18, depending on the country) will no longer receive direct messages from third parties — that is, from people they do not follow. The setting is enabled by default.
“We want kids to have a safe experience with our apps,” Meta said in a statement.
Earlier this month, Meta announced that it would begin hiding content related to self-harm, violent imagery, eating disorders and other harmful topics on Instagram and Facebook from children. If a user is under 16, they won't see posts about these topics in their feeds and stories, even if friends have shared them. The company also recently introduced nighttime notifications that prompt users under 18 to close the app and go to bed if they have been scrolling for more than 10 minutes.
Meta has previously faced a number of lawsuits and complaints over how it protects children. In particular, one lawsuit filed by 33 states accused the company of "actively targeting" children under 13 as users of its apps and websites, and of "continuing to collect their data" even after those users disclosed their age.
A report by The Wall Street Journal also alleged that Instagram recommends "dangerous footage of children" as well as "explicitly sexual adult videos" to accounts that follow teenagers. In December 2023, the state of New Mexico sued Meta, alleging that Facebook and Instagram's algorithms recommended sexual content to minors. According to an internal company presentation, employees estimated that about 100,000 children were harassed on Facebook and Instagram every day.