Why Won’t Social Media Let Us Talk About Our Feelings?
Content creators and mental health pros share what the platforms are getting wrong and why it matters.

A while back, like years and years ago, Christina Wolfgram watched a video from one of her favorite YouTubers, Anna Akana, who casually dropped that she was on an antidepressant. “Not to be dramatic, but that changed my life,” Wolfgram tells Wondermind. At that point, Wolfgram was roughly two years into experimenting her way through different coping tools to deal with her depression diagnosis, but, “It was pretty clear I wasn’t gonna be able to just, uh, get functional by myself.” That prescription Akana talked about happened to be the exact one Wolfgram’s doctor had recommended, so she decided to give it a shot—and she’s glad she did.
Since then, Wolfgram has gone on to use her own social media accounts to be super honest about her mental health highs and lows with funny and relatable content that makes her nearly 50,000 Instagram followers feel seen.
But, more recently, she’s noticed that talking about things like depression, medication, and other mental health issues on the internet has become a minefield of code words and shadowbans. “People would censor themselves while they were talking [to the camera],” she says. “Saying like, ‘I’m a victim of child bleep, and this is my story,’ when referring to abuse.” People have started swapping vowels for symbols or using code words to discuss certain topics: “Suicide” translates to “su!cide” or “unaliving.” “Eating disorder” becomes “ed.” Even medications are written out in code: “Adderall” has evolved to “@dderall.”
At first, she thought people were being overly cautious for their audience’s sake. “Some of that stuff is triggering, and I thought of it as a courtesy.” But when those same people started asking followers to support them so that their posts and videos could get past shadowbans on TikTok and Instagram, she realized, “It wasn’t necessarily for the audience—but for the algorithm.”
How did we get here?
Platforms like Instagram and TikTok don’t have a public list of the terms or phrases that keep a post from showing up in your feed, but creators who talk about their mental health, like Wolfgram, have long suspected that using certain words, like the ones above, will get their content banned or deprioritized (aka shadowbanned).
Instagram did not respond to our request for comment, but a spokesperson for TikTok said that their platform provides community guidelines for what you can and can’t share. The goal, according to that page, is to “welcome people coming together to find connections, participate in shared experiences, and feel part of a broader community … in a supportive space that does not negatively impact people’s physical or psychological health.”
On TikTok, things like promoting self-harm and encouraging eating disorders are understandably a no-go, according to these guidelines, but discussing “emotionally complex topics in a supportive way without increasing the risk of harm” is fair game.
Similarly, Instagram’s community guidelines acknowledge that their platform can be a space for people experiencing or recovering from mental health issues to “create awareness or find support,” without encouraging self-harm.
Yeah, all of this sounds pretty damn reasonable and might even lead you to believe that openly sharing experiences and science-backed facts about mental health is very much OK or even encouraged by these platforms.
The problem is that, anecdotally, content that mentions mental health meds, suicidal ideation, and other mental health conditions—even when it comes from licensed mental health pros who know how to mitigate the risks platforms are trying to avoid—is maybe, probably, definitely being suppressed by the algorithm.
Juan Romero-Gaddi, MD, psychiatrist and TikTok creator with nearly 60,000 followers, has taken notice. “When you post hundreds of videos about [mental health], you start to see a pattern,” says Dr. Romero-Gaddi, who’s a member of Wondermind’s Advisory Committee. He adds that if he doesn’t censor his wording or captions to get past the algorithm, that video will get fewer views than the rest of his content. “Definitely don't even say suicide. Even if you’re talking to someone during a [TikTok] live, you may get banned for a few days.”
Still, the spokesperson from TikTok insists, “Our moderation decisions rely on a range of factors, including a video, any caption, [or] audio to understand the context of a post, rather than focussing on a specific use of a single word.” But, you know, the math isn’t mathing here.
Is this helpful?
Honestly, all of this might leave you shrugging. You might think, Better safe than sorry! And you’re not wrong. It’s definitely possible that any content about any mental health topics—even the helpful, expert-backed kind—could trigger trauma responses or harmful behavior in social media users, says therapist Alo Johnston, LMFT, author of Am I Trans Enough?, who posts about mental health regularly under the handle @thetranstherapist. “Something that can feel empowering to someone can be triggering for someone else and what they’re dealing with,” he explains. “It’s not possible to avoid all triggers for everyone,” adds Johnston, who’s also a member of Wondermind’s Advisory Committee.
So suppressing mental health content could, in theory, prevent some people from being triggered, especially if they’re not looking for that stuff. That’s totally possible. But, in reality, these algorithmic safety nets may just make it harder to find actually supportive content while also letting through tons of posts and videos that do encourage people to hurt themselves or engage in disordered eating habits—the exact thing TikTok and Instagram say they want to keep off of your feed.
That’s partly because the people posting harmful content are also using code words to get around the algorithms, Dr. Romero-Gaddi explains.
In December 2022, a report from the Center for Countering Digital Hate found that TikTok likely didn’t distinguish videos that may help users in crisis from those that could harm them. The report’s authors said they found 21 hashtags that “contained healthy discussion of eating disorders as well as harmful pro-eating disorder videos.” They write, “TikTok does not appear to label or moderate content on these hashtags, making them a place where recovery content mingles freely with content that promotes eating disorders.”
This coded language isn’t exactly stopping people from finding content that validates their eating disorder or self-harm, if that’s what they’re looking for. They can just type whatever code words into the search bar to get it.
At the same time, these creative workarounds make it harder for people who aren’t fluent in social media to search for help, explains Dr. Romero-Gaddi. He feels conflicted about censoring his content because, for example, someone searching for Adderall might not see his posts or videos unless they search for @dderall. But if he doesn’t use a code word, his post might be suppressed. “[Mental health professionals] could have a bigger impact if we used the right terms and people can find the content that they want and need,” he explains.
Then there’s the issue of stigma. As you probably know, anytime a subject is censored, even on social media, it can start to feel embarrassing, offensive, shameful, or even dangerous to talk about. And when you’re personally experiencing that thing, welp, you’re bound to feel some of those ways about yourself. “People don’t know if they’re alone or if there are ways to address the problem or how prevalent it is, and that makes them feel isolated,” says Johnston.
Take suicidal ideation. While romanticized or sensationalized coverage of deaths by suicide has been shown to increase the risk of self-harm in people experiencing suicidality, responsible reporting of those same events can minimize the mental health toll. Research also suggests that acknowledging and asking about suicidality doesn’t increase the risk of someone acting on those thoughts, and it may even reduce suicidal ideation.
Sure, these studies don’t directly address what’s happening on TikTok and Instagram, but they’re examples of how dangerous it can be to villainize all content related to heavy mental health issues, especially when that means silencing expert-led messages of hope. “Suicidal ideation is a complicated problem, and instead of figuring out the nuance, [social platforms] just try to avoid it entirely,” explains Johnston. That includes making it harder for licensed experts like Johnston and Dr. Romero-Gaddi to share resources and information that could encourage someone to get help.
Could we do this better?
It’s not like mental health professionals and content creators claim to be app developers or social media scientists who can create a perfectly safe algorithm that delivers on well-intentioned community guidelines. That said, they have some ideas.
First, Dr. Romero-Gaddi thinks that it would be really helpful if TikTok and Instagram could verify the credentials of licensed mental health pros posting content. That way, advice from professionals can be prioritized above the riff-raff and misinformation from people who don’t really know what they’re talking about or those who intend to do harm. “Based on my experience, people who work in the medical field or mental health professions, they're really responsible for using these terms when they actually need it and when it's required,” he explains.
And to prevent people from being unpleasantly surprised by suicide prevention resources or eating disorder experiences, deploying trigger warnings over that content could definitely help, says Johnston.
Allowing people to filter out certain topics on their feed, or their children’s feeds, the way you can mute certain words on Twitter, could also be helpful.
Obviously, those are just a few ideas that might make a difference in a large, complex issue. But perhaps the bigger problem is that many of us rely on the internet and influencers to teach us about our mental health—including the younger version of Wolfgram. “There have been times when I haven’t had health insurance, so I couldn’t see a therapist,” she says. “During those times, I definitely turned to the internet.”
While there’s nothing wrong with a TikTok full of fun facts or finding community with people dealing with the same things as you, those are no substitute for professional mental health help, which just so happens to be hard to access for a lot of us. “I don’t know what the solution is,” says Wolfgram. “But I guess better health care and more therapy for everyone would be great.”
Wondermind does not provide medical advice, diagnosis, or treatment. Any information published on this website or by this brand is not intended as a replacement for medical advice. Always consult a qualified health or mental health professional with any questions or concerns about your mental health.