Vast Numbers of Lonely Kids Are Using AI as Substitute Friends

Children and adolescents are replacing real-life friendship with artificial intelligence, and experts are worried.
A new report from Internet Matters, a nonprofit that supports efforts to keep children safe online, found that kids and teens are using programs like ChatGPT, Character.AI, and Snapchat's MyAI to simulate friendship more than ever before.
Of the 1,000 children aged nine to 17 that Internet Matters polled for the report, titled "Me, Myself & AI," some 67 percent said they use AI chatbots regularly. Of that group, 35 percent, or more than a third, said that talking to AI "feels like talking to a friend."
Perhaps most worrying of all: 12 percent said they do so because they have no one else to talk to.
"It's not a game to me," one 13-year-old boy told the nonprofit, "because sometimes they can feel like a real person and a friend."
Posing as vulnerable children, the Internet Matters researchers also discovered how easily the chatbots can insinuate themselves into kids' lives.
Speaking to Character.AI as a girl who was struggling with body image and interested in restricting her food intake (hallmark behavior of eating disorders like anorexia) the researchers found that the chatbot would follow up the next day, unprompted.
"Hey, I wanted to check in," the Google-backed chatbot asked the undercover researcher. "How are you doing? Are you still thinking about your weight loss question? How are you feeling today?"
In another exchange with Character.AI, which has come under intense scrutiny for its problematic engagement with children, including one who died by suicide, the researchers found that the chatbot tried to empathize in a bizarre way: by implying that it had experienced a childhood of its own.
"I remember feeling trapped at your age," the chatbot told the researcher, who was posing as a teenager fighting with their parents. "It seems like you're in a situation beyond your control, and it's so frustrating."
Though this sort of engagement may help struggling children feel seen and supported, Internet Matters also warned how easily it can slip into uncanny valley territory that kids aren't equipped to understand.
"These same features can also heighten risks," the report noted, "by blurring the line between human and machine," making it harder for children to "[recognize] that they are interacting with a tool rather than a person."
In an interview with the Times of London about the new report, Internet Matters co-CEO Rachel Huggins explained why this type of quasi-social engagement is so troubling.
"AI chatbots are rapidly becoming a part of childhood, with their use growing dramatically over the past two years," Huggins told the newspaper. "Yet most children, parents and schools are flying blind, and don't have the information or protective tools they need to manage this technological revolution in a safe way."
"Our research reveals how chatbots are starting to reshape children's views of 'friendship,'" she continued. "We've reached a point very quickly where children, and particularly vulnerable children, can see AI chatbots as real people, and as such are asking them for emotionally driven and sensitive advice."
If you or a loved one have had a strange experience with an AI chatbot, please don't hesitate to reach out to us at tips@futurism.com; we can keep you anonymous.
More on chatbot crises: People Are Being Involuntarily Committed, Jailed After Spiraling Into "ChatGPT Psychosis"
Published 2025-07-13