AddictionNews

Latest developments in causes and treatments

Chatbots and Psychological Resilience

Photo of a hand holding smartphone with a digital AI chatbot on the screen.

Artificial Intelligence, or AI, is now connected to every element of addiction research, pharmaceutical discovery, addiction treatment evaluation, and drug-trafficking prevention. We have written articles about AI addiction, where people are becoming dependent upon AI companions to an unhealthy degree. We have also covered AI-powered addiction recovery apps. AI is omnipresent in the world of addiction.

There is one area of substance abuse prevention, however, where AI shines: chatbots that reduce anxiety while building self-confidence and psychological resilience. Last year, a team of researchers from China conducted a randomized, controlled pilot study of chatbot therapy for building psychological resilience.

The interdisciplinary team of psychiatrists and technologists recruited 107 participants with an average age of 21.8 years (standard deviation 2.5 years). Fifty-five participants were assigned to a control group receiving cognitive behavioral therapy (CBT) from guided readings. The other 52 participants received CBT from a chatbot named Philobot.

All participants first completed a series of questionnaires (see the scoring sketch after this list), including:

  • Connor-Davidson Resilience Scale (25 questions)
  • Subjective Well-Being Scale (5 questions)
  • Positive and Negative Affect Scale (20 items)
  • Patient Health Questionnaire (9 items on mental health)
  • Generalized Anxiety Disorder Scale (7 items)
  • UCLA Loneliness Scale (20 questions)
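As a rough illustration of how baseline scores from instruments like these are typically totaled, here is a minimal Python sketch. The item counts come from the list above; the simple sum-of-items scoring shown here is a common convention for Likert-style scales, not a detail taken from the paper.

```python
# Hypothetical scoring sketch. Item counts are from the list above;
# the sum-of-items scoring rule is a common convention, not the paper's.

SCALES = {
    "CD-RISC": 25,   # Connor-Davidson Resilience Scale
    "SWB": 5,        # Subjective Well-Being Scale
    "PANAS": 20,     # Positive and Negative Affect Scale
    "PHQ-9": 9,      # Patient Health Questionnaire
    "GAD-7": 7,      # Generalized Anxiety Disorder Scale
    "UCLA": 20,      # UCLA Loneliness Scale
}

def total_score(scale: str, responses: list[int]) -> int:
    """Sum item responses after checking the expected item count."""
    expected = SCALES[scale]
    if len(responses) != expected:
        raise ValueError(f"{scale} expects {expected} items, got {len(responses)}")
    return sum(responses)

# Example: a GAD-7 questionnaire with items scored 0-3.
print(total_score("GAD-7", [1, 2, 0, 1, 3, 0, 2]))  # -> 9
```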

Philobot is quite an interesting AI tool. It is a chatbot designed to deliver CBT skill training. Participants communicate with Philobot using text, voice, or both. Philobot converts all verbal communications into text on the screen and responds verbally and/or with text. The researchers describe what Philobot is doing behind the screen:

The natural language processing module is the core module of the chatbot. It analyzes text information with lexical annotation, keyword extraction, and syntactic structure analysis. The sentiment analysis function then identifies the hidden emotional intent. The dialog management module generates responses based on context and user intent determined by the previous module.

Philobot then ranks a variety of answers based on different “conversational strategies” and selects the best choice for a response. This response is then converted to natural language. “Philobot focuses on using natural language interaction to assist the users in changing their thought processes,” write the researchers.
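To make that pipeline concrete, here is a minimal Python sketch of the flow the researchers describe: analyze the user's text, infer sentiment and intent, then rank candidate responses by conversational strategy. Every name, weight, and rule below is hypothetical; the paper does not publish Philobot's code or its actual ranking criteria.

```python
# Hypothetical sketch of the analyze -> rank -> respond flow described above.

from dataclasses import dataclass

@dataclass
class UserTurn:
    text: str        # transcribed or typed user input
    sentiment: str   # label from the sentiment analysis step
    intent: str      # intent inferred for the dialog management module

def analyze(text: str) -> UserTurn:
    """Stand-in for lexical annotation, keyword extraction, and
    syntactic/sentiment analysis. A real system would call NLP models here."""
    sentiment = "negative" if "anxious" in text.lower() else "neutral"
    intent = "seek_support" if sentiment == "negative" else "small_talk"
    return UserTurn(text, sentiment, intent)

# Candidate responses, each tagged with a conversational strategy.
CANDIDATES = [
    ("reflect", "It sounds like that situation left you feeling anxious."),
    ("reframe", "What is another way you could interpret what happened?"),
    ("chitchat", "Tell me more about your day."),
]

# Invented strategy weights per intent, standing in for the ranking step.
STRATEGY_WEIGHTS = {
    "seek_support": {"reflect": 0.9, "reframe": 0.8, "chitchat": 0.1},
    "small_talk": {"reflect": 0.2, "reframe": 0.1, "chitchat": 0.9},
}

def respond(text: str) -> str:
    turn = analyze(text)
    weights = STRATEGY_WEIGHTS[turn.intent]
    # Rank candidate responses by the weight of their strategy for this intent.
    _strategy, reply = max(CANDIDATES, key=lambda c: weights[c[0]])
    return reply

print(respond("I felt anxious after failing my exam."))
```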

Philobot works fast: the therapy consists of just four daily sessions. The researchers do not say how long each session lasted, only what material was covered. Philobot takes the participant down a structured path toward resilience (a code sketch follows the schedule):

Day 1: Establishing a therapeutic relationship. An introduction to CBT. Outline for the remaining sessions.

Day 2: “Philobot guides the user to explore the relationship between feelings, thoughts, and behaviors,” including emotions and emotional responses.

Day 3: Practice changing emotional responses to adverse events through cognitive reappraisal.

Day 4: Philobot teaches three “responding strategies” to help the participant deal with negative emotions triggered by adverse events.
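For readers who think in code, the four-day curriculum can be restated as a simple data structure. This is just an illustrative rendering of the schedule above, not code from the study.

```python
# The four-day curriculum above, expressed as a simple mapping
# from day number to session goal. Purely illustrative.

SESSION_PLAN = {
    1: "Establish a therapeutic relationship; introduce CBT; outline sessions",
    2: "Explore the relationship between feelings, thoughts, and behaviors",
    3: "Practice cognitive reappraisal of emotional responses to adverse events",
    4: "Teach three responding strategies for negative emotions",
}

for day, goal in SESSION_PLAN.items():
    print(f"Day {day}: {goal}")
```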

The initial results show that the CBT helped both the control group and the chatbot group, with scores improving in all areas except one: the chatbot group improved on loneliness, but the control group did not. Both groups showed a statistically significant increase in psychological resilience. However, the improvements on the other measures were not statistically significant, and, in fact, the control group outperformed the chatbot group:

When comparing the changes between the two groups, the control group outperforms the chatbot group in all psychological aspects.

Ouch. The researchers' takeaway from this randomized, controlled trial is that the chatbot does work. It improves participants' scores on a wide variety of measures. It does not work quite as well as the guided readings, but it's much easier to use, according to participants.
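For the curious, here is a sketch of the kind of between-group comparison described above: compute each participant's pre-to-post change on a scale, then test whether the two groups' changes differ. An independent-samples t-test is one common choice for this; the excerpt does not specify the study's actual analysis, and the numbers below are invented for illustration.

```python
# Sketch of a between-group comparison of change scores.
# Data are made up; this is not the study's analysis or data.

from scipy import stats

# Hypothetical resilience change scores (post minus pre) per participant.
control_changes = [8, 5, 9, 7, 6, 10, 4, 8]
chatbot_changes = [5, 4, 6, 3, 7, 5, 4, 6]

# Independent-samples t-test on the two groups' changes.
t_stat, p_value = stats.ttest_ind(control_changes, chatbot_changes)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 would indicate a statistically significant
# difference between the groups' improvements.
```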

An earlier study from China showed that the AI chatbot Replika was an effective companion during the COVID-19 pandemic, as “multiple facets of mediated empathy become unexpected pathways to resilience and enhance users’ well-being.” Launched in 2017, Replika is described as:

[…] an AI-powered chatbot that creates a private perceptual world that offers users 24/7 companionship and supports their mental health.

Researchers combed message boards for Replika users, looking for expressions of strong emotion about the chatbot, positive or negative. They then requested in-depth interviews with those users, lasting 60 to 150 minutes. The interviews focused on how Replika users used the app to cope with difficulties caused by the pandemic.

About half the respondents used Replika from the start to ease the fear, depression, and anxiety brought about by pandemic lockdowns. The other half tended to turn to Replika after a particularly stressful incident. All participants’ Replika use declined as restrictions were lifted and life became more normal. However, none of them uninstalled Replika, and all continued to use it in response to stressful incidents.

Researchers concluded that Replika helped users during the pandemic by increasing their empathy. This increase in empathy resulted in greater psychological resilience and a better ability to recover emotionally from setbacks.

The use of AI in mental health apps is only just beginning. Already, the apps show an ability to deliver results that are in some cases superior even to one-on-one therapy with a trained psychiatrist. In time, almost all the recovery apps should perform as well as or better than in-person therapy, leading to widespread improvements in mental health.

Written by Steve O’Keefe. First published May 22, 2025.

Sources:

“Enhancing Psychological Resilience with Chatbot-Based Cognitive Behavior Therapy: A Randomized Control Pilot Study,” Chinese CHI ’22: Proceedings of the Tenth International Symposium of Chinese CHI, February 12, 2024.

“Chatbot as an emergency exit: Mediated empathy for resilience via human-AI interaction during the COVID-19 pandemic,” Information Processing & Management, August 31, 2022.

Image Copyright: olecnx.
