Meta has announced its latest bundle of tools and resources for Facebook and Instagram to support age-appropriate experiences on its apps and encourage healthier online habits.
Continuing its efforts to prioritise young people’s safety and wellbeing on its apps, Meta said this significant update underscores its dedication to creating a safer digital environment for teenagers.
Meta said this provides valuable insight for teens and parents seeking to keep up with advancements in online safety.
“We want teens to have safe, age-appropriate experiences on our apps. We’ve developed more than 30 tools and resources to support teens and their parents, and we’ve spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive.”
Meta said it regularly consults with experts in adolescent development, psychology, and mental health to help make the company’s platforms safe and age-appropriate for young people, including improving its understanding of which types of content may be less appropriate for teens.
“Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people.
“Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content. We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow,” Meta said.
“We’re automatically placing teens into the most restrictive content control setting on Instagram and Facebook. We already apply this setting for new teens when they join Instagram and Facebook, and are now expanding it to teens who are already using these social media apps.
“Our content recommendation controls – known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook – make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore,” Meta said.
Meta said that while it allows people to share content discussing their own struggles with suicide, self-harm and eating disorders, its policy is not to recommend this content, and it has been focused on ways to make it harder to find.
“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help.
“We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks,” it said.
Meta said that to help make sure teens regularly check their safety and privacy settings on Instagram, and are aware of the more private settings available, it is sending new notifications encouraging them to update their settings to a more private experience with a single tap.
“If teens choose to “Turn on recommended settings,” we will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes. We’ll also ensure only their followers can message them and help hide offensive comments.”
Meta added that it is starting to roll these changes out to users under 18 now, and they will be fully in place on Instagram and Facebook in the coming months.