Sam Altman, CEO of OpenAI, the company behind ChatGPT, recently made a compelling case for running a language model on your own PC instead of relying on cloud-based AI chatbots. During an appearance on Theo Von’s podcast, Altman highlighted a significant issue: OpenAI retains all user interactions, which can range from simple small talk to deeply personal exchanges.
This raises the question of whether it is wise to share intimate details with ChatGPT at all. OpenAI says it protects user privacy, but there are no legal safeguards requiring that these chats be anonymized or shielded from disclosure.
That means a court could compel OpenAI to hand over your conversations. Altman illustrated the concern with a hypothetical divorce proceeding in which one spouse had asked ChatGPT for advice about infidelity.
“People talk about the most personal stuff in their lives to ChatGPT,” he remarked, underscoring that conversations with an AI carry none of the legal protections that apply to conversations with therapists, lawyers, or doctors: there is currently no established confidentiality for discussions with a chatbot.
If you share sensitive information and a lawsuit later arises, those conversations could be disclosed. Altman stressed that this matters especially for younger users who turn to ChatGPT for guidance through personal challenges.
Running a local language model on your own computer, by contrast, offers far stronger privacy. Because the model runs entirely on your hardware, nothing is sent to a third-party server, and you can keep or delete chat logs at will, so sensitive discussions stay on your machine.
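To make the contrast concrete, here is a minimal sketch of what a fully local setup can look like, assuming Python with the Hugging Face transformers library and a small open model; the library, model, and file names are illustrative choices, not a specific tool Altman endorsed.

```python
# A minimal sketch of a local chat session, assuming the Hugging Face
# "transformers" library is installed; the model (GPT-2 here) is an
# illustrative choice, downloaded once and then run on your own hardware.
from transformers import pipeline

# Generation happens entirely on this machine; no prompt or reply
# is sent to a remote server.
generator = pipeline("text-generation", model="gpt2")

prompt = "I need some private advice about a difficult situation."
reply = generator(prompt, max_new_tokens=60, do_sample=True)[0]["generated_text"]
print(reply)

# Nothing is persisted unless you choose to save it yourself, e.g.:
# with open("chat_log.txt", "w") as f:
#     f.write(reply)
```

The point of the sketch is simply that the conversation never leaves your computer, and any record of it exists only if you decide to write one.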
However, running locally is not a legal shield: if your device becomes subject to legal examination, saved chats can still be demanded as evidence, deleted files may still be recoverable, and deliberately destroying information to conceal it can lead to legal consequences of its own. Using a local AI chatbot is perfectly legal and lets you speak freely.
Still, for serious personal struggles, consulting a licensed therapist remains the most beneficial approach.