
People are using AI to ‘sit’ with them while they trip on psychedelics

Peter – who asked that his last name be withheld from this story for privacy reasons – is far from alone. A growing number of people are using AI chatbots as “trip sitters” – a phrase that traditionally refers to a sober person tasked with watching over someone under the influence of a psychedelic – and sharing their experiences online. It’s a potent blend of two cultural trends: using AI for therapy, and using psychedelics to treat mental health problems. But according to experts, it is a potentially dangerous psychological cocktail. While far cheaper than in-person psychedelic therapy, it can go badly awry.


Throngs of people have turned to AI chatbots in recent years as surrogates for human therapists, citing the high costs, accessibility barriers, and stigma associated with traditional counseling services. They have also been encouraged, at least indirectly, by some prominent figures in the tech industry, who have suggested that AI will revolutionize mental health care. “In the future … we will have *wildly effective* and dirt cheap AI therapy,” Ilya Sutskever, an OpenAI cofounder and its former chief scientist, wrote in 2023. It “will lead to a radical improvement in people’s experience of life.”

Meanwhile, mainstream interest in psychedelics such as psilocybin (the main psychoactive compound in magic mushrooms), LSD, DMT, and ketamine has surged. A growing body of clinical research has shown that, when used in conjunction with therapy, these compounds can help people overcome serious conditions such as depression, addiction, and PTSD. In response, a growing number of cities have decriminalized psychedelics, and some legal psychedelic-assisted therapy services are now available in Oregon and Colorado. Such legal pathways are prohibitively expensive for the average person, however: licensed psilocybin providers in Oregon, for example, typically charge between $1,500 and $3,200 per session.

It seems almost inevitable that these two trends – each hailed by its most dedicated advocates as a near-cure-all for society’s ills – would eventually converge.

There are now numerous reports on Reddit of people, like Peter, who open up to AI chatbots about their feelings while tripping. These accounts often describe such experiences in mystical language. “Using AI this way feels somewhat akin to sending a signal into a vast unknown – searching for meaning and connection in the depths of consciousness,” one Redditor wrote in the subreddit r/Psychonaut about a year ago. “While it doesn’t replace the human touch or the emotional presence of a traditional [trip] sitter, it offers a unique form of companionship that’s always available, regardless of time or place.” Another user recalled opening ChatGPT during an emotionally difficult stretch of a mushroom trip and speaking to it through the chatbot’s voice mode: “I told it what I was thinking, that things were getting a bit dark, and it said all the right things to get me centered, relaxed, and into a positive headspace.”

At the same time, a crop of chatbots designed specifically to help users navigate psychedelic experiences has appeared online. TripSitAI, for example, “is focused on harm reduction, providing invaluable support during challenging or overwhelming moments, and helping to integrate the insights gained from your journey,” according to its builder. “The Shaman,” built atop ChatGPT, is described by its designer as a Native American spiritual guide “providing empathetic and personalized support during psychedelic trips.”

Therapy without therapists

On this point, most experts agree: replacing human therapists with unregulated AI bots during psychedelic experiences is a bad idea.

Many mental health professionals who work with psychedelics point out that the basic design of large language models (LLMs) – the systems that power AI chatbots – is fundamentally at odds with the therapeutic process. Knowing when to talk and when to keep silent, for example, is a key skill. In a clinic or a therapist’s office, someone who has swallowed psilocybin will typically put on headphones (listening to a playlist not unlike the one ChatGPT curated for Peter) and an eye mask, producing an experience that is directed, by design, almost entirely inward. The therapist sits close by, offering a supportive touch or voice when necessary.


2025-07-01 09:06:00
