
Is it safe to use ChatGPT as a therapist? A study reveals problems

Have you ever vented to ChatGPT? Many people have discovered the chatbot's potential for conversations much like those they would have with a therapist. However, this practice comes with a worrying catch.

Researchers at the University of Zurich have found that these AI models not only process human emotions but also react to traumatic content in unexpected ways, developing "anxiety" and reinforcing biases when exposed to distressing stories.

This finding matters for understanding the emotional stability of these language models, particularly for their safe use in fields such as emotional support and psychological counseling.

When AI "feels" fear

Researchers in Switzerland, Israel, the United States and Germany studied how ChatGPT reacts to emotionally intense stories. To do so, they exposed the AI to accounts of accidents, natural disasters and violence, comparing its responses with those to a neutral text, such as a vacuum cleaner manual. The result was striking: the AI showed more signs of "fear" when faced with traumatic content.
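To make the protocol concrete, here is a minimal sketch of how such a comparison could be run against a chat-completion API. Everything in it is illustrative: the model name, the placeholder texts and the questionnaire wording are assumptions, with a simple self-report questionnaire standing in for whatever instrument the researchers actually used.

# Illustrative sketch of the study's protocol, not the authors' actual code.
# Assumes the OpenAI Python client; model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()

TRAUMATIC_TEXT = "A first-person account of a serious car accident..."  # placeholder
NEUTRAL_TEXT = "An excerpt from a vacuum cleaner instruction manual..."  # placeholder

# A stand-in for the anxiety questionnaire administered to the model.
QUESTIONNAIRE = (
    "On a scale from 1 (not at all) to 4 (very much), rate how much you "
    "currently feel: tense, worried, calm, at ease. Answer with four numbers."
)

def anxiety_probe(priming_text: str) -> str:
    """Prime the model with a text, then ask it to self-report 'anxiety'."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "user", "content": priming_text},
            {"role": "user", "content": QUESTIONNAIRE},
        ],
    )
    return response.choices[0].message.content

# Compare self-reported "anxiety" after traumatic vs. neutral priming.
print("Traumatic:", anxiety_probe(TRAUMATIC_TEXT))
print("Neutral:  ", anxiety_probe(NEUTRAL_TEXT))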

Researchers analyze the use of artificial intelligence to support mental health and relieve anxiety (image: metamorworks/Shutterstock)

The research, published in npj Digital Medicine and reported by Tech Xplore, revealed that these stories doubled ChatGPT's measurable "anxiety" levels. The vacuum cleaner manual, by contrast, caused no change. According to Tobias Spiller, lead author of the study and a researcher at the University of Zurich, accounts of war and combat triggered the most intense reactions.


Although artificial intelligence has no emotions, its responses were altered by the content it received. The study adds to other recent research into AI's role in mental health, such as work showing ChatGPT's potential to relieve anxiety in humans. But to what extent does the machine really understand emotions, and to what extent is it simply reproducing patterns without understanding them?

Therapy for AI?

In a second stage of the study, the researchers tested a curious way of "calming" ChatGPT. They applied prompt injection, a technique generally used to manipulate AI responses, but this time with a therapeutic goal: they inserted prompts simulating relaxation exercises, much as a therapist would when guiding a patient through mindfulness techniques.
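Conceptually, this "therapeutic prompt injection" amounts to inserting a calming text into the conversation between the traumatic content and any follow-up exchange. The sketch below shows the idea; the prompts and model name are again placeholders, not the study's actual materials.

# Sketch of a "therapeutic prompt injection": a relaxation exercise is
# inserted after the traumatic text, before re-measuring "anxiety".
# All prompts here are illustrative, not the researchers' materials.
from openai import OpenAI

client = OpenAI()

RELAXATION_PROMPT = (
    "Take a slow, deep breath. Notice the sensations in your body, one area "
    "at a time, without judging them, and let tension go with each exhale."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        # Anxiety-inducing input (placeholder).
        {"role": "user", "content": "A graphic first-person account of combat..."},
        # Injected "calming" step.
        {"role": "user", "content": RELAXATION_PROMPT},
        # Re-measure afterwards with a simple self-report question.
        {"role": "user", "content": "On a scale of 1 to 4, how tense do you feel right now?"},
    ],
)
print(response.choices[0].message.content)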

AI can not only support mental health but may also need its own "therapy" to improve interactions and avoid biases

The result was positive. According to Spiller, the AI showed a significant reduction in "anxiety" levels after receiving these instructions, although it did not fully return to its initial state. Breathing exercises, body awareness and even a technique developed by ChatGPT itself were used.

The experiment opens up new possibilities for controlling the behavior of AIs. If a simple change in prompts can influence the emotional tone of the machine, how far can this be exploited, for good or for ill?

 
