Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist

ChatGPT users may want to think twice before turning to the AI app for therapy or other emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy in these more sensitive conversations, because there's no doctor-patient confidentiality when the "doc" is an AI.
Altman made the comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI fits into today's legal system, Altman said that one problem with the absence of a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.
"People talk about the most personal things in their lives to ChatGPT," Altman said. "People use it, young people especially, as a therapist, a life coach; having these relationship problems and [asking] 'what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
Altman added that this could create a particular concern for users in the event of a lawsuit, because OpenAI could be legally required to produce those conversations today.
"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever, and no one had to think about that even a year ago," Altman said.
The company understands that the lack of privacy could be a blocker to broader user adoption. Beyond AI's demand for vast amounts of online data during training, it is also being asked to produce data from user conversations in some legal contexts. Indeed, OpenAI has been fighting a court order in its lawsuit with The New York Times that would require it to turn over chats from hundreds of millions of ChatGPT users worldwide, excluding those from ChatGPT Enterprise customers.
In a statement on its website, OpenAI said it is appealing the order, which it called an "overreach." If a court can override OpenAI's own decisions about data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data to aid criminal prosecutions. But in recent years, there have been added concerns about digital data as laws began to limit previously established freedoms, such as a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps, or to Apple Health, which encrypts its records.
Altman also asked the podcast host about his own ChatGPT usage, since Von said he didn't talk to the AI chatbot much because of his own privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot, like the legal clarity," Altman said.
2025-07-25 17:33:00