
Meta, TikTok, Snap, YouTube, Roblox, and Discord will participate in an external grading process that evaluates social platforms on how well they protect adolescent mental health.
The program was created by the Mental Health Coalition's Safe Online Standards (SOS) initiative. SOS comprises roughly two dozen standards covering areas such as platform policy, functionality, governance, transparency, and content oversight.
Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention, leads the SOS initiative. Participating companies will voluntarily submit documentation on their policies, tools, and product features, which an independent panel of global experts will evaluate.
The SOS initiative aims to establish user-informed data on how digital platforms design products, protect users aged 13–19, and address exposure to suicide and self-harm content.
After evaluation, each platform will receive one of three ratings.
The Mental Health Coalition, founded in 2020, has collaborated with Meta (formerly Facebook) since its early days; the Coalition's website lists Meta as a "creative partner."
Last year, allegations surfaced that Meta concealed internal research, known as "Project Mercury" (started in 2020), that showed negative effects of its products on users' mental health. Meta has since introduced measures such as Instagram teen accounts, and it currently faces a class-action lawsuit in California alleging child harm from addictive products.
Roblox has faced accusations over child well-being on its platform, and Discord has strengthened its age-verification processes in response to child endangerment concerns.