Meta is under
pressure in both the United States and Europe over allegations that its apps
are addictive and have helped fuel a youth mental health crisis.
Attorneys general
in U.S. states including California and New York sued the company in October 2023,
claiming it repeatedly misled the public about the dangers of its platforms. More
recently, a former engineering director and consultant for the social media
company testified at a congressional hearing in November 2023 that the company was
aware of harassment and other harms facing teens on its platforms but failed to
act against them. In Europe, the European Commission has sought
information on how Meta protects children from illegal and harmful content.
Meta Platforms
said it would hide more content from teens on Instagram and Facebook, after
regulators around the globe pressed the social media giant to protect children
from sensitive and illegal content on its apps.
All teens will now be placed into the most restrictive
content control settings on the apps, and additional search terms will be
limited on Instagram. Meta will remove content about self-harm, suicide, and
eating disorders from teen users' feeds on Instagram and Facebook, even if it is shared
by a user they follow, stating that while content on self-harm “can help
destigmatize these issues,” “it’s a complex topic and isn’t necessarily
suitable for all young people.” Additionally, when teen users search for terms
related to self-harm, suicide, and eating disorders, Meta will hide related
results and redirect the search to a helpline or other resources for users to
seek support.
While some of the restrictions are already applied to
new teen users who sign up for Instagram and Facebook, Meta said the additional
protections will roll out to all teen users in the coming weeks and months.