
Meta to restrict more content for teenagers on Facebook and Instagram


By PA News





Facebook and Instagram are to start hiding more types of content for teenagers as part of an effort to better protect younger users from harmful material online.

As part of the changes, teenage users will no longer see posts from others discussing their personal struggles with thoughts of self-harm or suicide – even if they follow the user in question.

Meta said it was placing all under-18s into the most restrictive content control setting on Instagram and Facebook, and was restricting additional terms in Search on Instagram.

This setting already applies to new users when they join the apps, but is now being expanded to all teenagers using them.

The new measures will be rolled out on the two platforms over the coming months (Alamy/PA)

Meta said the settings make it more difficult for people to come across potentially sensitive content or accounts across the apps, including in the Explore sections.

The new measures will be rolled out on the two platforms over the coming months.

On self-harm and suicide content on Instagram, Meta said it was “focused on ways to make it harder to find”, while also offering support to those who post about it.

“While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find,” the social media firm said in a blog post.

“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help.

“We already hide results for suicide and self harm search terms that inherently break our rules and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.”

In addition, Meta said it would also begin sending notifications to teens, reminding them to check and update their privacy settings.

In response to the measures, Andy Burrows, adviser to online safety group the Molly Rose Foundation, said Meta’s changes were welcome, but failed to address the issue.

He added: “Our recent research shows teenagers continue to be bombarded with content on Instagram that promotes suicide and self-harm and extensively references suicide ideation and depression.

“While Meta’s policy changes are welcome, the vast majority of harmful content currently available on Instagram isn’t covered by this announcement, and the platform will continue to recommend substantial amounts of dangerous material to children.

“Unfortunately this looks like another piecemeal step when a giant leap is urgently required.”

The charity claimed that much of the harmful content it identified came from meme-style accounts and was not covered by Meta’s announcement.



