Chatbots as Mental Healthcare Proxies: Possibilities and Limitations
Presenter:
Catherine Stinson, PhD
Assistant Professor, Queen’s University
Abstract:
There is a great need for more affordable, accessible mental health treatment options, especially for labour-intensive talk therapy. Given the impressive ability of a new generation of chatbots like ChatGPT to mimic human conversational skills, there is hope that they might prove useful as proxies for human psychotherapists, and in particular that they might fill the gap for communities facing barriers to mental health care. We look in detail at the current generation of chatbots to understand what they do well and what their limitations are, with support from empirical work in natural language processing. These tools perform best on formulaic language tasks, in domains well covered by their training corpora, using standard English. Unfortunately, the under-served communities include migrant and minority groups who may not communicate in standard English and are not well represented in training corpora. For some psychotherapeutic interactions, particularly formulaic ones, the capacities of chatbots may be a good match. However, for interactions where an empathetic relationship is essential, the current generation of therapy chatbots is potentially dangerous. While there is some room for chatbots to act as proxies for human psychotherapists, we should not overestimate their abilities.
Disease:
- Mental Health
Other:
- Health Equity