Chatbots replace talk therapy
The accessibility and scalability of digital platforms can significantly lower barriers to mental health care and make it available to a broader population, said Nicholas Jacobson, who researches the use of tech to enhance the assessment and treatment of anxiety and depression at Dartmouth College.
Swept up by a wave of Generative AI, tech companies have been quick to capitalize. Scores of new apps, like the WHO's "digital health worker" Sarah, offer automated counseling, letting people engage with an AI chatbot in cognitive behavioral therapy, a psychotherapeutic treatment proven to help users identify and change negative thought patterns.
The arrival of AI, Jacobson adds, will enable adaptive interventions and allow healthcare providers to continuously monitor patients, anticipate when someone may need support, and deliver treatments to alleviate symptoms.
It’s not anecdotal either: a systematic review of mental health chatbots found that AI chatbots can dramatically reduce symptoms of depression and distress, at least in the short term. Another study used AI to analyze more than 20 million text conversations from real counseling sessions and successfully predicted both patient satisfaction and clinical outcomes. Other studies have detected early signs of major depressive disorder from unguarded facial expressions captured during routine phone unlocks, and from people’s typing patterns.
Most recently, Northwestern University researchers devised a way to identify suicidal behavior and thoughts without psychiatric records or neural measures. Their AI model correctly estimated the likelihood of self-harm in 92 out of 100 cases, using only simple questionnaire responses and behavioral signals, such as how 4,019 participants ranked a random sequence of pictures on a seven-point like-to-dislike scale.
Two of the study’s authors, Aggelos Katsaggelos and Shamal Lalvani, expect that once the model clears clinical trials, specialists will use it for support, such as scheduling patients according to perceived urgency, and that it will eventually roll out to the public for at-home use.
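For readers curious what such a prediction pipeline can look like, here is a minimal, purely illustrative sketch in Python. It assumes scikit-learn, uses random placeholder data in place of the study's real questionnaire and picture-rating responses, and substitutes a generic random-forest classifier for the team's unpublished model; every variable name and feature count here is hypothetical.

```python
# Illustrative sketch only: the study's actual model and features are not public here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: per participant, questionnaire answers plus
# like-to-dislike ratings (1-7) of a random sequence of pictures.
n_participants = 4019
questionnaire = rng.integers(0, 5, size=(n_participants, 10))    # 10 hypothetical items
picture_ratings = rng.integers(1, 8, size=(n_participants, 20))  # 20 hypothetical pictures
X = np.hstack([questionnaire, picture_ratings])
y = rng.integers(0, 2, size=n_participants)  # placeholder self-harm labels

# A generic classifier standing in for the study's (unspecified) model.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```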
But as was evident in Smith’s experience, experts urge caution against treating tech solutions as a panacea, since they lack the skill, training, and experience of human therapists. That is especially true of Generative AI, which can be unpredictable, make up information, and regurgitate biases.
Where artificial intelligence falls short
When Richard Lewis, a Bristol-based counselor and psychotherapist, tried Woebot (a popular script-based mental health chatbot that can only be accessed through a partner healthcare provider) to explore a topic he was also working through with his therapist, the bot failed to pick up on the issue’s nuances. It suggested he “stick to the facts,” removed all the emotional content from his replies, and advised him to reframe his negative thoughts as positive ones.
“As a therapist,” Lewis said, correcting or erasing emotions is the “last thing I would want a client to feel and the last thing I would ever suggest.”
“Our job is to form a relationship that can hold difficult emotions,” Lewis added, “and feelings for our clients to make it easier for them to explore, integrate, or find meaning in them and ultimately know themselves better.”