More than ever, social media is providing connections and support for people with mental illness or those who want to improve their lives. But with increased discussion about mental health on the Internet, what responsibility do social media platforms have when it comes to helping people who suffer from mental illness?
Platforms have responded to the problem in a variety of ways. Instagram, for example, has banned certain hashtags that promote unhealthy behaviors like self-harm. The platform even has a page dedicated to educating users about how to recognize symptoms of eating disorders in other users and how to connect them with help.
Other popular sites, like Tumblr, have chosen not to censor content, reasoning that simply removing material will not help users. Instead, when you search for a word like “depression” on the site, a prompt appears asking if you’re okay. It provides a link to a hotline as well as to 7cupsoftea.com, where users can speak anonymously with a trained active listener.
As behavioral health content becomes increasingly interactive, ethical questions will continue to arise about what is allowable as content and what merits directing a person toward the help they deserve. Though web content can save lives, the best way to intervene is to educate people about how to spot signs and symptoms in friends’ and family members’ social media use.
With loved ones looking out and web platforms taking responsibility for helping those with mental health problems, future crises can be prevented and the public educated. Often a single phone number or one friendly intervention is all it takes to direct people toward the knowledge and help that can change their lives.