Therapists are secretly using ChatGPT during sessions. Clients are triggered.

The 2020 breach of a Finnish mental health company, which exposed tens of thousands of clients’ treatment records, serves as a warning. People on the list were blackmailed, and the entire trove was later released publicly, revealing extremely sensitive details such as people’s experiences of child abuse and struggles with addiction.

What therapists stand to lose

Beyond the privacy violation, there are other risks when therapists consult LLMs on a client’s behalf. Studies have found that although some specialized therapy chatbots can rival human therapists, advice from the likes of ChatGPT can do more harm than good.

A recent Stanford University study found, for example, that chatbots can fuel delusions and psychopathy by blindly validating a user rather than challenging them, and that they suffer from biases and engage in sycophancy. The same flaws could make it risky for therapists to consult chatbots on their clients’ behalf. A chatbot might, for example, simply validate a therapist’s hunch, or lead them down the wrong path.

Aguilera says he has played around with tools like ChatGPT while teaching mental health trainees, for example by entering hypothetical symptoms and asking the AI chatbot for a diagnosis. The tool will produce many possible conditions, he says, but it is rather thin in its analysis. The American Counseling Association recommends against using AI for mental health diagnosis at this time.

A study published in 2024 on an earlier version of ChatGPT found that it was too vague and too generic to be truly useful in diagnosis or in developing treatment plans, and that it was heavily biased toward recommending cognitive behavioral therapy over other types of treatment that might be more suitable.

Daniel Kimmel, a psychiatrist and neuroscientist at Columbia University, ran experiments with ChatGPT in which he posed as a client with relationship troubles. He says he found the chatbot a decent mimic when it came to “stock-in-trade” therapeutic responses, such as normalizing and validating, asking for additional information, or highlighting certain cognitive or emotional associations.

However, “it didn’t do a lot of digging,” he says. It didn’t attempt to “link things that seem unrelated or superficial into something cohesive … to come up with a story, an idea, a theory.”

“I would be skeptical about using it to do the thinking for you,” he says. That thinking, he says, should be the therapist’s job.

Therapists can save time by using AI tools, but that benefit should be weighed against patients’ needs, says Morris: “You might save yourself a few minutes. But what are you giving up?”
