AI

OpenAI has released its first research into how using ChatGPT affects people’s emotional wellbeing

Researchers found some intriguing differences between how men and women respond to using ChatGPT. After using the chatbot for four weeks, female study participants were slightly less likely to socialize with people than their male counterparts who did the same. Meanwhile, participants who interacted with ChatGPT’s voice mode in a gender that was not their own reported significantly higher levels of loneliness and more emotional dependence on the chatbot at the end of the experiment. Neither study has yet been peer-reviewed.

Chatbots powered by large language models are still a nascent technology, and it is difficult to study how they affect us emotionally. A lot of research in the area, including some of the new work by OpenAI and MIT, relies on self-reported data, which may not always be accurate or reliable. That said, this latest research does chime with what scientists have discovered so far about how emotionally compelling chatbot conversations can be.

OpenAI and the MIT Media Lab used a two-pronged approach. First, they collected and analyzed real-world data from nearly 40 million interactions with ChatGPT. Then they asked the 4,076 users who had those interactions how they made them feel. Next, the Media Lab recruited nearly 1,000 people to take part in a four-week trial. This was more in-depth, examining how participants interacted with ChatGPT for at least five minutes each day. At the end of the experiment, participants completed a questionnaire to measure their perceptions of the chatbot, their subjective feelings of loneliness, their levels of social engagement, their emotional dependence on the bot, and their sense of whether their use of the bot was problematic. They found that participants who trusted and “bonded” with ChatGPT more were likelier than others to be lonely, and to rely on it more.

Jason Phang, an OpenAI researcher who worked on the project, says this work is an important first step toward greater insight into ChatGPT’s effect on us, which could help AI platforms enable safer and healthier interactions.

“A lot of what we’re doing here is preliminary, but we’re trying to start the conversation with the field about the kinds of things we can start to measure, and to start thinking about what the long-term impact on users might be,” he says.

While the research is welcome, it is still difficult to identify when a human is, and isn’t, engaging with technology on an emotional level, says Devlin. She says the study participants may have been experiencing emotions that weren’t recorded by the researchers.

“In terms of what the teams set out to measure, people might not necessarily have been using ChatGPT in an emotional way, but you can’t divorce being a human from your interactions [with technology],” she says. “We use these emotion classifiers that we have created to look for certain things, but what that actually means for someone’s life is really hard to extrapolate.”

2025-03-21 17:44:00
