Human-AI Collaboration to Encourage Empathic Conversations

Improving mental health continues to be a need and focus for many. As many as one in five adults in the United States experience mental illness each year, and the demand for mental health practitioners has skyrocketed in recent years. According to the American Psychological Association, “the number of psychologists who reported receiving an increase in referrals doubled in 2021 compared to the previous year and nearly 7 in 10 psychologists who had a waitlist reported that it had grown longer since the start of the pandemic.”

To address the lack of providers to meet the increased need for support, a number of mental health organizations have turned to peer-to-peer support, connecting millions of people online. While this model helps scale efforts, an opportunity exists to enable more effective and successful conversations between users. Peer supporters typically have the best intentions and the desire to help others, but they may not be aware of tools and strategies that maximize effectiveness, such as empathy: the ability to understand and feel the emotions and experiences of others.

A team from the University of Washington (UW), led by Tim Althoff, assistant professor at the Paul G. Allen School of Computer Science & Engineering and director of the Behavioral Data Science Group, set out to explore ways to create more empathy in peer-to-peer support conversations on mental health platforms. The team included UW PhD students Ashish Sharma and Inna Lin, as well as clinical psychologists Adam Miner (Stanford) and Dave Atkins (UW). Through previous research, the group discovered that empathy is not self-learned over time; in fact, it can decrease.

The goal of this research was to examine how AI systems could collaborate with humans to facilitate empathy in online, text-based peer-to-peer support conversations. The premise was that peer supporters on online mental health platforms could be empowered through feedback and training. For example, machine-in-the-loop writing systems had the potential to help supporters express higher levels of empathy. In turn, this could improve the effectiveness of the interactions and of online mental health platforms.

Traditional methods of training empathy, such as in-person training for counselors, are difficult to scale to the millions of online users of a mental health support platform. Computational methods could facilitate more empathic responses from peer supporters at scale, meeting the needs of, and improving outcomes for, support seekers.

To explore this opportunity, the team at UW trained natural language processing models to identify and improve empathy in peer support conversations. They also designed interactive tools for providing model-based, real-time feedback to peer supporters. This included introducing PARTNER, a novel reinforcement learning agent that learns to make edits to text to increase empathy in a conversation. By leveraging PARTNER, the researchers developed and evaluated a human-AI collaborative approach to help people write more empathic responses.
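The general idea — scoring a draft response for expressed empathy, then suggesting small edits that raise that score — can be illustrated with a minimal sketch. Note that the real PARTNER system uses trained neural models; the keyword-based scorer, cue list, and candidate edits below are hypothetical stand-ins for illustration only.

```python
# Toy empathy "scorer": counts cue phrases loosely associated with
# empathic communication. A stand-in for a trained classifier.
EMPATHY_CUES = ["that's awful", "i understand", "that sounds", "how are you feeling"]

def empathy_score(response: str) -> int:
    """Return a crude empathy score: the number of cue phrases present."""
    text = response.lower()
    return sum(1 for cue in EMPATHY_CUES if cue in text)

# Candidate sentence-level edits the agent may prepend to a draft.
CANDIDATE_EDITS = ["That's awful.", "I understand how hard that is."]

def suggest_rewrite(draft: str) -> str:
    """Greedily apply candidate edits that increase the empathy score."""
    best = draft
    for edit in CANDIDATE_EDITS:
        candidate = f"{edit} {best}"
        if empathy_score(candidate) > empathy_score(best):
            best = candidate
    return best

draft = "What happened between you two?"
print(suggest_rewrite(draft))
```

Where this sketch uses a greedy loop over fixed edits, the actual agent learns an editing policy via reinforcement learning, so it can generate context-appropriate edits rather than choosing from a fixed list.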

Three mobile phones that display an example of a peer-to-peer support interaction. It includes a post from a support seeker, the proposed response from a peer supporter, and feedback generated by PARTNER for the peer supporter. The support seeker states, “I hope I didn’t lose my only two best friends. They are the only ones I could relate to,” and the peer supporter responds, “What happened between you two?” The AI prompts the question, “Would you like some help with your response?” Upon saying yes, the peer supporter receives suggestions to consider adding “That’s awful.” and “What caused you to break?”

Image description: The AI + human interaction model developed by UW 

A video overview and demo of the project can be found on YouTube.

This research has helped improve the understanding of the role of empathy. A randomized trial of 300 peer supporters on TalkLife, a mental health app that provides a global peer support network, showed that this model led to a 20 percent increase in expressed empathy overall and a 39 percent increase in empathy for participants who self-reported that they find it challenging to express empathy.

Analysis of the human-AI collaboration patterns showed that participants used the AI feedback both directly and indirectly. They adopted the AI-generated feedback without becoming overly reliant on it and reported improved self-efficacy after receiving it.

Empathy is critical to having successful, supportive conversations. It has strong associations with symptom improvement and is instrumental in building alliance and rapport. Specifically, on online peer-to-peer support platforms like TalkLife, the researchers found that compared to non-empathetic conversations, empathic conversations receive approximately 45 percent more “hearts” and “likes” and are approximately 80 percent more likely to result in the formation of relationships. “Rarely does mental health research have the practical application and potential for impact that we believe this research will have,” said Jamie Druitt, CEO of TalkLife. “This proposed research directly addresses real challenges and can be implemented to create measurable change on mental health platforms.” 

The research also received the Best Paper Award at The Web Conference in 2021. Kira Radinsky, Best Paper Award chair, shared: “We felt the work is going to have significant impact on the community based on the algorithmic approach, but also the very thorough experiments done in the paper itself, but also during all this year this is the direction we felt is going to bring the most value to the community.”