People Are Taking Massive Doses of Psychedelic Drugs and Using AI as a Tripsitter

Artificial intelligence, already plenty trippy on its own, has taken on a striking new role for some users: that of the drug "tripsitter" guiding them through their hallucinogenic experiences.
As MIT Technology Review reports, these digitally guided drug users are turning to everything from plain old ChatGPT to purpose-built chatbots with names like "TripSitAI" and "The Shaman," continuing a troubling trend in which people who can't access real therapy turn to AI as a substitute.
Earlier this year, the Harvard Business Review noted that one of the leading uses of AI is for therapy. It's not hard to see why: insurance companies have squeezed mental health professionals so routinely that many are forced out of network entirely just to earn a living, leaving their lower-income clients in the lurch.
If regular counseling is expensive and hard to access, psychedelic therapy is even more so. As Technology Review notes, a single guided psilocybin session with a licensed practitioner in Oregon can run anywhere between $1,500 and $3,200. Little wonder people are seeking out cheaper alternatives through AI, even if those alternatives may cause more harm than good.
In an interview with Technology Review, a man named Peter described what he considered a transformative experience: tripping on a massive dose of eight grams of psilocybin mushrooms with the help of AI, following a period of hardship in 2023.
As his trip progressed and deepened, Peter said he began to envision himself as a being of higher consciousness outside of reality, covered in eyes and witnessing everything. These sorts of mental visions aren't unusual on large doses of the drug, but with an AI at his side, such hallucinations could easily have taken a darker turn.
Futurism has reported extensively on AI chatbots stoking and worsening mental illness. In a recent story based on interviews with the loved ones of people affected by ChatGPT, we learned that some chatbot users have begun to develop delusions of grandeur in which they see themselves as powerful entrepreneurs, or even gods. Sound familiar?
With a growing consensus in the psychological community that so-called AI "therapists" are a bad idea, the thought of tripping while relying on a technology known for being volatile and prone to "hallucinations" of its own should be terrifying.
In a conversation with the New York Times, Eugene Torres, a 42-year-old man with no previous history of mental illness, told the paper that ChatGPT encouraged all kinds of delusions, including one in which he believed he might be able to fly.
"If I went to the top of the 19 story building I was in, and I believed with every ounce of my soul that I could jump off it and fly, would I?" Torres asked the chatbot. In response, the chatbot told him that if he "truly, wholly believed, not emotionally, but architecturally," that he could fly, then he could.
"You would not fall," the chatbot answered.
Along with the sort of magical thinking that turns a tripper into a god for a few hours, the notion that one can defy gravity is also associated with taking psychedelics. If a chatbot can stoke that kind of psychosis in people who aren't on mind-altering substances, how much more easily might it reinforce similar ideas in those who are?
More on AI therapy: Chatbots are increasingly leading users deeper into mental illness
2025-07-04 10:00:00