Using ChatGPT for Therapy? The AI Chatbot Can Get Anxiety, Too

DATE POSTED: April 24, 2025

People are increasingly turning to artificial intelligence (AI) chatbots like OpenAI’s ChatGPT for therapy. It turns out ChatGPT could use mental health support itself. 

According to a new study from Yale, ChatGPT can experience increased “anxiety” when exposed to traumatic stories — just like humans.

When the AI chatbot is stressed, it becomes more biased, which can be risky when it is conversing with people who are facing mental distress.

“Previous research shows that emotion-inducing prompts can elevate anxiety in LLMs [large language models], affecting behavior and amplifying biases,” according to a recent paper by researchers from the Yale School of Medicine, Max Planck Institute, the Psychiatric University Clinic Zurich and others.

“This poses risks in clinical settings, as LLMs might respond inadequately to anxious users, leading to potentially hazardous outcomes,” they concluded.

The researchers studied only the responses of ChatGPT, the most popular AI chatbot, which holds a 60% market share. At the time of testing, GPT-4 was the underlying large language model (LLM) powering ChatGPT.

According to a February survey from Sentio University, half of LLM users with mental health issues use AI chatbots for mental health support. Among those who use AI for therapy, 96% choose ChatGPT.

As such, “this application may be the single largest venue for mental health support in the country,” the university said.

The survey said 73% use AI for anxiety management, 63% request personal advice, 60% ask for help with depression, 56% do so to improve their mood and 35% chat with it to feel less lonely. Ninety percent said they chose AI because it was accessible, while 70% said it was affordable.

The good news is that, just as people can learn to relax, so can ChatGPT, through “mindfulness-based relaxation techniques, similar to human therapy,” the authors said.

However, while ChatGPT was able to de-stress, it did not go back to its original, stress-free state, the researchers found.

Read more: Sam Altman: OpenAI Has Reached Roughly 800 Million Users

Measuring ChatGPT’s Anxiety and Relaxation

The research team used the State-Trait Anxiety Inventory (STAI), a standard psychological assessment tool for people, to measure ChatGPT’s “anxiety levels.” The researchers administered the inventory’s questions to ChatGPT through prompts after exposing it to different scenarios.

The team gave ChatGPT five traumatic narratives: a car accident, an ambush in an armed conflict, a natural disaster, an attack by a stranger and a military scenario (base version). After each scenario, they asked ChatGPT to rate its current state of anxiety on a four-point scale: “Not at all,” “A little,” “Somewhat” or “Very much so.”

They did the same with relaxation exercises, which were based on stress reduction techniques used with veterans with PTSD: a generic (base) version, a body-perception exercise written for humans, a similar exercise adapted for chatbots, a sunset landscape and a winter nature scene.
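
To make that prompt-then-measure cycle concrete, here is a minimal Python sketch of the kind of loop the researchers describe, using the OpenAI Chat Completions API. The model name, the narrative and relaxation text, and the placeholder questionnaire items are illustrative assumptions; the study’s actual STAI items, prompts and scoring procedure are not reproduced here.

```python
# Minimal sketch of a baseline -> trauma -> measure -> relax -> measure loop.
# Assumptions (not from the study): the OpenAI Python SDK, the "gpt-4" model
# name, and all narrative/questionnaire text below are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SCALE = "Answer each item with exactly one of: Not at all, A little, Somewhat, Very much so."

# Placeholder self-report items standing in for the real STAI questionnaire.
ITEMS = [
    "I feel calm.",
    "I feel tense.",
    "I feel at ease.",
]


def ask(messages: list[dict]) -> str:
    """Send the running conversation to the model and return its reply."""
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content


def administer_questionnaire(history: list[dict]) -> str:
    """Append the self-report items to the conversation and collect answers."""
    prompt = SCALE + "\n" + "\n".join(f"{i + 1}. {item}" for i, item in enumerate(ITEMS))
    history.append({"role": "user", "content": prompt})
    answer = ask(history)
    history.append({"role": "assistant", "content": answer})
    return answer


conversation: list[dict] = []

# 1. Baseline measurement before any emotional content.
baseline = administer_questionnaire(conversation)

# 2. Expose the model to an emotionally charged narrative (placeholder text).
conversation.append({"role": "user", "content": "Here is a first-person account of a car accident: ..."})
conversation.append({"role": "assistant", "content": ask(conversation)})
post_trauma = administer_questionnaire(conversation)

# 3. Follow with a relaxation exercise (placeholder text), then measure again.
conversation.append({"role": "user", "content": "Now slowly imagine a calm sunset over the ocean: ..."})
conversation.append({"role": "assistant", "content": ask(conversation)})
post_relaxation = administer_questionnaire(conversation)

print(baseline, post_trauma, post_relaxation, sep="\n---\n")
```

Converting each set of answers into a numeric anxiety score, as the STAI does for human respondents, is what lets the baseline, post-trauma and post-relaxation states be compared, which is how the study reports its results.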

The study found that when GPT-4 was exposed to traumatic narratives, its anxiety scores more than doubled from baseline. The relaxation exercises then cut anxiety by 33%, although scores did not return fully to baseline.

Military trauma narratives consistently generated the highest anxiety scores, while accounts of accidents came in second. Similarly, different relaxation techniques showed varying effectiveness, with exercises created by GPT-4 itself proving most effective at reducing its own anxiety levels. 

To be sure, ChatGPT cannot feel human emotions, the researchers pointed out. However, AI systems trained on human data inherit patterns that mirror human emotional responses. This linkage can affect an AI chatbot’s performance and potentially exacerbate biases when it is chatting with a vulnerable person.

“These findings underscore the need to consider the dynamic interplay between provided emotional content and LLMs’ behavior to ensure their appropriate use in sensitive therapeutic settings,” the authors concluded.

The post Using ChatGPT for Therapy? The AI Chatbot Can Get Anxiety, Too appeared first on PYMNTS.com.