Meta announced it is introducing additional protections focused on the types of content teens see on Instagram and Facebook. The measures are part of the company's ongoing commitment to safe, age-appropriate experiences for young users on its platforms. In a post on its website, the company says it has been collaborating with experts in adolescent development, psychology, and mental health to improve the safety of its platforms.
Meta is automatically placing teens in the most restrictive content control setting on Instagram and Facebook. This setting, known as “Sensitive Content Control” on Instagram and “Minimize” on Facebook, makes it harder for users to come across potentially sensitive content or accounts in places like Search and Explore.
Hiding Search Results on Instagram Related to Suicide, Self-Harm and Eating Disorders
While Meta allows people to share content discussing their struggles with suicide, self-harm, and eating disorders, its policy is not to recommend such content and to make it harder to find. Now, when people search for terms related to suicide, self-harm, or eating disorders, the related results will be hidden and users will be directed to expert resources for help.
Vicki Shotbolt, CEO of ParentZone.org, commented that these policies, along with Meta's parental supervision tools, will give parents greater peace of mind knowing that their teens are viewing age-appropriate content online.
These changes are already rolling out to teens under 18 and will take full effect across Instagram and Facebook in the coming months. Meta will continue to share resources from expert organizations when someone posts content related to personal struggles with self-harm or eating disorders, reflecting its commitment to supporting those who need help.