Some teens will now have their Facebook and Instagram accounts defaulted to a setting that blocks strangers from sending direct messages, Meta announced on Thursday.
This default setting is designed to stop teens under 16 (“or under 18 in certain countries”) from receiving “unwanted contact,” Meta said. In addition to restricting “adults over the age of 19 from messaging teens who don’t follow them” on Instagram, the new policy also blocks teens from receiving direct messages from other teens they do not follow. On Facebook, it restricts any accounts from contacting teens on Messenger who appear neither on their Facebook friends list nor in their phone’s contacts.
This change comes after a whistleblower, Arturo Bejar—a senior engineer who formerly led online security, safety, and protection efforts at Meta—told Congress last November that he returned to work for Meta as a consultant after discovering that his 14-year-old child and her friends “repeatedly faced unwanted sexual advances, misogyny, and harassment” on Instagram. According to Bejar, his subsequent research documented “staggering levels of abuse” targeting young users, with at least 13 percent of users aged 13–15 reporting that they received unwanted sexual advances in a single week.
Teens currently on Instagram will receive a notification “at the top of their feed” alerting them to the change in their settings. Teens can then change this default setting, but any teens using “supervised accounts” must get a parent’s permission to do so. More changes will be coming “later this year,” Meta said, when the company is “planning to launch a new feature designed to help protect teens from seeing unwanted and potentially inappropriate images in their messages from people they’re already connected to, and to discourage them from sending these types of images themselves.”
Meta’s policy change also comes after the company announced other changes earlier this year that block teens from seeing sensitive content. But “radically” improving the experience of teens on Meta platforms must be done “without eliminating the joy and value that they otherwise get from using such services,” Bejar said.
Efforts by federal and state governments to pass laws imposing similar restrictions on social media platforms to protect the mental health of teens have been met with resistance. Last August, more than 90 organizations opposed the Kids Online Safety Act, a bill introduced in the US Senate, concerned that platforms imposing “broad content filtering to limit minors’ access to certain online content” could impact what content everyone sees online.
This month, a teenager named Hannah Zoulek sued Utah in an attempt to block the enforcement of a social media law imposing restrictions on teen users. Her complaint alleged that the law was flawed in many ways. Perhaps most significantly, by subjecting teens’ accounts to parental surveillance, it could allegedly censor teens’ speech. And by preventing direct messaging between any “account held by a Utah minor” and “any other user that is not linked to the account through friending,” the state’s law could hinder “minors’ ability to find support and make connections with people outside their existing circle, a key feature of social media—particularly for vulnerable youth.” It’s likely that Zoulek and co-plaintiffs would see the same flaws in Meta’s recent policy updates.
So what should Meta be doing? Bejar’s recommendations for Meta went beyond updating policies in response to limited Facebook and Instagram data. He told Congress that identifying the best approaches to harm reduction for teens on Meta platforms would require Meta to meticulously and transparently gather teen user experience data to find out how teens are being harmed over time and whether existing tools actually prevent harm. With this data, Meta could measure the effectiveness of solutions like the policy change announced today and detect when new problems requiring new solutions arise.
In Zoulek’s complaint, plaintiffs suing Utah suggested that Meta’s policy of providing parental supervision tools that enable parents to restrict their kids’ social media accounts is at least preferable to Utah making those decisions for all families in the state.