Meta, the parent company of Instagram and Facebook, is implementing new measures to protect teenagers from harmful content.
Users under 18 will be placed in stricter content settings to limit their exposure to self-harm content and explicit material.
This comes in response to legal actions by several states and pending federal legislation aimed at regulating tech companies’ impact on minors.
Rachel Rodger, an associate professor in the Department of Applied Psychology at Northeastern University, said, “Meta is evolving its policies around content that could be more sensitive for teens, which is an important step in making social media platforms spaces where teens can connect and be creative in age-appropriate ways.”
Sen. Marsha Blackburn wrote on X, “Make no mistake: last minute announcements of new kids online safety features by tech companies are nothing but a distraction from historic efforts to hold them accountable for the young lives they have taken.”
The move has received mixed reactions from lawmakers, some of whom view it as a distraction from broader accountability efforts.
Additionally, Meta’s policy changes coincide with upcoming hearings at which the company’s CEO is expected to face questions about its platforms’ treatment of teenagers.