Meta has announced that it will start to hide more types of age-inappropriate content for teens on Instagram and Facebook, in line with expert guidance.

The company said it is automatically placing teens into the most restrictive content control setting on both platforms.

“We already apply this setting for new teens when they join Instagram and Facebook and are now expanding it to teens who are already using these apps,” Meta said in a blog post on Tuesday.

“Our content recommendation controls — known as ‘Sensitive Content Control’ on Instagram and ‘Reduce’ on Facebook — make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore,” it added.

Moreover, the tech giant stated that it will start hiding results related to suicide, self-harm, and eating disorders when people search for these terms and will direct them to expert resources for help.


“We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms,” Meta said.

This update will roll out for everyone over the coming weeks.

To help make sure teens regularly check their safety and privacy settings on Instagram, and are aware of the more private options available, Meta said it is sending new notifications encouraging them to switch to a more private experience with a single tap.


If teens choose to “Turn on recommended settings”, Meta will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes, the company explained.

The settings will also ensure that only their followers can message them and help hide offensive comments.