UXR Case Study – Designing for Suicide Bereavement Support

Social Media User Safety

Improving Platform Design for Online Suicide Bereavement Support

Design Challenge

How might we support diverse expressions of grief in online suicide bereavement communities while protecting users from harm?

Expressions of grief—especially intense or stigmatized ones—often go unsupported or are policed in offline and online spaces. This project explored platform interventions to support authentic, complex expressions of grief through computational tools and qualitative insights.

Methods

  • Computational Linguistic Analysis using LIWC, VADER, and Syuzhet (see the scoring sketch below)
  • Machine Learning Classification to identify narrative patterns in grief posts
  • Qualitative Thematic Analysis of 189 grief letters and 652 user comments
  • Axial Coding to identify emotional expression types and support dynamics

This mixed-methods approach enabled us to triangulate grief expression types, understand emotional impact, and design content-sensitive support structures. Our collaborators included data scientists, HCI researchers, grief therapists, Reddit moderators, and trauma-informed care advocates.
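
To make the computational pass concrete, the sketch below scores letter text with VADER in Python. The example letters are invented, not study data, and the vaderSentiment package choice is illustrative; LIWC is commercial software and Syuzhet is an R package, so each ran through its own tooling.

    # Minimal sketch: score grief-letter text with VADER (pip install vaderSentiment).
    # The letters below are invented examples, not real study data.
    from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

    analyzer = SentimentIntensityAnalyzer()

    letters = [
        "Writing this letter helps me feel like I am still talking to you.",
        "I am angry and exhausted, and nothing about this makes sense.",
    ]

    for text in letters:
        scores = analyzer.polarity_scores(text)  # keys: neg, neu, pos, compound
        print(f"compound={scores['compound']:+.3f}  {text[:50]}")

Scores like these are one signal for triangulating grief expression types, not a verdict on any individual post.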

Addressing the Challenge

We developed and deployed a narrative tagging model on grief support subreddits, then contextualized our findings through close qualitative reading. This allowed us to:

  • Identify linguistic markers of therapeutic narratives vs. risk-prone content
  • Design interface prompts to guide users toward agentic, healing-centered storytelling
  • Propose moderation protocols that respond to grief intensity rather than defaulting to censorship
  • Iterate with the design team on new platform features
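
The tagging model itself is not detailed in this summary, so the sketch below only illustrates the general shape of such a classifier: a TF-IDF plus logistic-regression pipeline over post text. The example posts and the binary labels (1 = healing-centered narrative, 0 = risk-prone content) are hypothetical placeholders.

    # Sketch of a narrative-tagging classifier; posts and labels are invented placeholders.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "Writing this letter helped me say goodbye in my own words.",
        "I keep replaying that night and cannot see a way forward.",
    ]
    labels = [1, 0]  # 1 = healing-centered narrative, 0 = risk-prone content

    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        LogisticRegression(max_iter=1000),
    )
    model.fit(posts, labels)

    # Class probabilities for a new post (columns: risk-prone, healing-centered).
    print(model.predict_proba(["Sharing my story here made me feel less alone."]))

In practice, a model like this would need a much larger labeled corpus and review by the clinical collaborators before shaping any user-facing prompt or moderation decision.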

Real World Impacts

  • Integrated narrative recognition into a Reddit support bot prototype to flag healing-focused posts (a simplified bot sketch follows this list)
  • Partnered with a nonprofit grief helpline to evaluate the effectiveness of narrative scaffolding prompts
  • 90% of pilot participants said the letter-writing guidance helped them feel more supported and less isolated
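
As an illustration of how the narrative recognition step could be wired into a Reddit bot, the sketch below uses PRAW to stream new submissions and flag likely healing-focused posts. The credentials, subreddit name, and the is_healing_focused stand-in for the model are all placeholders; the prototype's actual flagging behavior is not documented here.

    # Sketch of a Reddit bot loop that flags healing-focused posts (pip install praw).
    # Credentials, subreddit name, and the classifier stub are placeholders.
    import praw

    def is_healing_focused(text: str) -> bool:
        # Stand-in for the narrative-recognition model sketched above.
        return "letter" in text.lower()

    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",
        client_secret="YOUR_CLIENT_SECRET",
        user_agent="grief-support-bot-prototype/0.1",
    )

    subreddit = reddit.subreddit("GriefSupport")
    for submission in subreddit.stream.submissions(skip_existing=True):
        text = f"{submission.title}\n{submission.selftext}"
        if is_healing_focused(text):
            # Flag internally (e.g., log for moderators) rather than replying publicly.
            print(f"healing-focused: {submission.id} {submission.title[:60]}")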