Meta, Snap, and TikTok have teamed up to launch a new initiative called “Thrive,” aimed at reducing the spread of graphic content related to self-harm and suicide on their platforms.
Through Thrive, these tech giants will be able to share “signals” that alert one another to violating content, making it easier to remove harmful material across multiple platforms.
The program was developed in collaboration with the Mental Health Coalition, an organization dedicated to breaking down the stigma surrounding mental health conversations. Meta provides the technical infrastructure for Thrive, allowing these “signals” to be securely shared between companies.
Thrive uses the same technology as Meta’s Lantern program, which helps fight the spread of child abuse content online. By sharing hashed versions of problematic media, participating companies can quickly identify and remove content that violates their guidelines.
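The exact signal format used by Thrive and Lantern is not public, but the general hash-matching idea can be sketched roughly as follows. All names in this sketch are hypothetical, and a production system would use a perceptual hash rather than a plain cryptographic digest so that re-encoded copies still match:

```python
import hashlib

def media_hash(data: bytes) -> str:
    """Return a hex digest that can be shared instead of the media itself."""
    return hashlib.sha256(data).hexdigest()

class SignalRegistry:
    """Toy registry of hashes flagged by participating platforms.

    In a real cross-platform program, this set would live in shared,
    access-controlled infrastructure rather than in process memory.
    """

    def __init__(self) -> None:
        self._flagged: set[str] = set()

    def share_signal(self, data: bytes) -> str:
        """Flag a piece of media by sharing only its hash."""
        h = media_hash(data)
        self._flagged.add(h)
        return h

    def is_flagged(self, data: bytes) -> bool:
        """Check newly uploaded media against the shared hash set."""
        return media_hash(data) in self._flagged

registry = SignalRegistry()
registry.share_signal(b"example violating media")
print(registry.is_flagged(b"example violating media"))  # True
print(registry.is_flagged(b"unrelated media"))          # False
```

The key property of this approach is that platforms never exchange the harmful media itself, only opaque fingerprints, so matching can happen without redistributing the content.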
Meta has emphasized that while it is making it more difficult to find self-harm and suicide-related content, it still allows users to openly discuss mental health issues and share personal stories—as long as these discussions do not promote or graphically describe harmful behaviors.
Meta’s data shows that the company takes action on millions of pieces of self-harm and suicide-related content every quarter. In the most recent quarter alone, it restored about 25,000 posts, most of them following user appeals. Thrive represents a significant step toward creating safer online spaces while still allowing room for critical mental health conversations.
Bijay Pokharel