In the aftermath of a damaging report led by a whistleblower and former employee, Instagram has unveiled a new “PG-13” safety mode for its teenage users. The move by parent company Meta is a clear attempt to address the report’s conclusion that “Kids are not safe on Instagram.”
The new system will place all users under 18 into a more restrictive “13+” setting by default. The setting filters out a wider range of content, including strong language and risky stunts, and teens will need a parent’s permission to switch it off.
The announcement follows an independent review led by Arturo Béjar, a former senior engineer at Meta. The review found that roughly two-thirds of the platform’s new safety tools were ineffective, lending significant weight to long-standing criticism of the company.
While Meta publicly rejected the report’s findings, the launch of such a comprehensive new system suggests the criticism landed internally. The PG-13 framework directly targets the kinds of harmful content that critics and whistleblowers have flagged for years.
As the feature rolls out, the whistleblower report’s legacy looms large. Safety advocates, including the Molly Rose Foundation, which contributed to the report, are now demanding that Meta allow independent testing to ensure the new mode is not just another ineffective tool.