Dylan Thomas Doyle was a college junior traveling abroad when he got word that his friend Jack had taken his own life back home. Shaken but reluctant to talk about it with his friends in person, he turned to online grief support spaces like Facebook and Reddit.
A decade later, while serving as a hospital chaplain and Unitarian Universalist minister, he lost two more loved ones to suicide. He found solace online again—this time in a subreddit specifically created for suicide bereavement.
“All grief is hard, but suicide is often sudden, traumatic and has a lot of social stigma around it. No one knows what to say, so you can feel really isolated,” said Doyle, now a doctoral candidate in the Department of Information Science at CU Boulder. “It’s comforting to go to these spaces and have people say, ‘I’ve been through that. I know what you’re feeling.’”
But as Doyle reports in two new studies, such spaces also have the potential to do harm, exposing emotionally vulnerable people, including children, to graphic stories, unhelpful comments and other potentially re-traumatizing content.
The studies, published in the Proceedings of the ACM on Human-Computer Interaction, are among the first to explore what goes on in suicide bereavement groups.
“It’s great that these communities exist,” said Doyle, who is now working to make them safer. “But right now, it’s sort of a free-for-all.”
The power of sharing stories
On average, 132 people in the U.S. die by suicide daily. More than half of the population will, at some point, grieve a loved one who has died this way. Professional help can be hard to find because suicide bereavement is specialized. One recent study found that 62% of people grieving a loved one who died by suicide turn to social media for support.
For the first study, Doyle and his co-authors examined nearly 2,600 posts and 16,502 comments in the r/SuicideBereavement subreddit.
The team used natural language processing (NLP), a form of AI, to gain insight into the emotional states of users and to identify different kinds of posts, from lengthy stories to short questions or requests for resources.
They found that nearly half of the content posted was narrative storytelling, and many of those stories were extremely graphic.
When the team noticed a large subset of users were writing letters to the deceased, they launched a companion study in which they read through 189 such posts and 652 comments.
The posts were anonymized, and the research team took care to safeguard their own mental health along the way.
“Even as researchers, we struggled to read some of these,” said Doyle.
Some letter-writers shared how they had found out and how it affected them. Others asked for explanations or sought forgiveness for not doing enough. One shared a story about a final trip they and the deceased had taken to the mountains, and how much they laughed afterward. Many commenters responded with comfort, reassurance, gratitude and offers of direct support outside the platform.
But some shared detailed descriptions of the way they had found their loved ones or the way their death had been carried out. Some expressed rage and hatred for being left behind.
The team was heartened to find almost no deliberately abusive comments, but they did find some they deemed “unsupportive,” in which commenters replied with their own graphic stories.
“Some people come there just seeking resources or asking factual questions, and don’t expect to find people sharing narratives of really tough images,” Doyle said.
Because of the way social media ranking algorithms operate, the most graphic comments tended to rise to the top and to draw more replies.
“If you’re, say, 13 years old and you come upon this and start taking it all in, that could really be harmful,” he said. “And for people who are already in a vulnerable emotional state, it can be damaging to their grieving process.”
If you or someone you know is struggling or in crisis, call or text 988 or chat 988lifeline.org. Read about suicide prevention resources at CU Boulder.
Building a more supportive platform
Doyle stressed that he is not specifically critiquing Reddit, but rather raising questions about how to more effectively support people using social media platforms for suicide bereavement support. He believes more research is needed and does not think banning narrative storytelling on platforms is the answer. (Previous research shows that in offline support groups, such storytelling can be extremely therapeutic.)
He does believe platforms could serve users better.
At present, r/SuicideBereavement subreddit moderators are not required to be certified or trained in mental health.
On its homepage, the subreddit clearly prohibits “actively suicidal content” and advises that it is reserved only for those bereaved by suicide. But, just like an NFL or travel subreddit, it operates with few guardrails.
Doyle imagines a day when the AI tool his team developed could categorize narrative posts, letting users opt in or out of seeing them when they log on.
He also suggests that moderators receive training in grief support and that users have an opportunity to customize what they see at the top of their feeds.
“Social media platforms in general don’t really know what to do with death or the bereaved,” he said. “We believe that more needs to be done to make these spaces customized to the unique needs of the grieving.”