Sam Altman warns there's no legal confidentiality when using ChatGPT as a therapist
ChatGPT users may want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there's no doctor-patient confidentiality when your doc is an AI.
The exec made these comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI works with today's legal system, Altman said one of the problems of not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.
"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] 'what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
This could create a privacy concern for users in the case of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.
"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago," Altman said.
The company understands that the lack of privacy could be a blocker to broader user adoption. In addition to AI's demand for so much online data during the training period, it's being asked to produce data from users' chats in some legal contexts. Already, OpenAI has been fighting a court order in its lawsuit with The New York Times, which would require it to save the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.
In a statement on its website, OpenAI said it's appealing this order, which it called "an overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data in order to aid in criminal prosecutions. But in more recent years, there have been additional concerns about digital data as laws began limiting access to previously established freedoms, like a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.
Altman asked the podcast host about his own ChatGPT usage as well, given that Von said he didn't talk to the AI chatbot much due to his own privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity," Altman said.