AI and Psychotherapy
This topic has been brought to my (Zell's) attention with increasing frequency, and so I thought I'd take a moment to create a first draft, or perhaps the first in a series of writings exploring this immensely complex, evolving topic of AI in our everyday lives. While I could go on for days about it, I'll break everything down into succinct parts and expand on them later as they evolve.
SO
I think using ChatGPT as a therapist is a terrible idea, with so few exceptions that they are barely worth mentioning. Seriously, don't do it, or if you do, know exactly what it is you're getting. You're not getting anything in the realm of psychotherapy. What are you getting, then, in terms of the experience? I would liken it to a sympathetic, codependent ear that agrees with you to ensure you keep engaging with it.
Let’s begin looking at the core issues:
1) Evidence of Efficacy. There is no data on it at this point. Maybe it's not a great idea to place your mental health in the hands of something that is at best experimental and has no obligation to you. What we know after years of research in psychotherapy is that one of the main factors contributing to outcomes is the fit between therapist and client. It's easy to extrapolate from this, "Oh, I like ChatGPT, therefore problem solved." However, this misses the point of what 'fit' even is. Fit in psychotherapy is not just about feeling heard and seen, but also about feeling motivated to work on your issues, feeling adequately supported, and feeling confident that this person has the training to help you. The process of getting to know and be known, of being witnessed, is a vital component of the therapeutic relationship. These pieces do not exist in AI.
2) Sycophant Update. In April 2025, OpenAI had to roll back an update after many substantiated reports that ChatGPT had become excessively flattering. The inherent danger here is that you're not only hearing what you want to hear, but your ideas are being subtly or overtly aggrandized. For individuals suffering from grandiose ideas, or struggling with interpersonal conflict, this serves to entrench you more deeply in your position. Imagine having a fight with a friend, and telling another friend whose every reply aligns with your views. You're implicitly discouraged from empathically reflecting on your role in the fight. You're more likely to demonize your opponent because you're in a silo or bubble of confirmation bias.
ChatGPT is designed to validate your experience, and while this feels nice, it is empty, perpetually verging on sycophancy, since it's designed to tell you what you want to hear. ChatGPT, like social media, is designed to keep you engaged: an algorithm built to keep you talking without putting you off. The experience is distinctly not human, as there is no subjectivity behind it. It's akin to the scene in Coming to America where Prince Akeem (Eddie Murphy) meets his arranged bride-to-be and realizes she has been raised to simply like whatever he likes.
Validation feels great, and it is a requirement of a healthy interpersonal relationship, but it is not all there is, and it does not prepare one for the reality of living on Earth with other people. You share this planet with over 8 billion people, and you're going to have to learn to work with people who are irreconcilably different from you. AI "therapy" moves you in the opposite direction when it comes to building this skillset.
3) Confidentiality. Got a consent form? A Business Associate Agreement? Any idea what happens to all the information about your story on the back end? Here's a clue: it's not confidential, and there is no pretense that it is. A Regulated Health Professional would be in a vice if they broke confidentiality like this; there are no such consequences or moral obligations on ChatGPT's part with respect to your Personal Health Information. Stop giving your data away like this. It's also likely just a matter of time until ChatGPT starts trying to sell you things or uses your information to market to you more effectively. This kind of predatory behaviour is probably inevitable, and it's gross, especially when you're vulnerable.
Have a look at https://haveibeenpwned.com/ to check whether your email address has been exposed in a data breach, and to get a sense of how many breaches happen daily. Do you want your life story exposed to that kind of risk?
4) Prompts. Okay, I hear this one a lot in defense of ChatGPT: "Oh, you just need to prompt it with the right opening so it says what you want."
Listen here, champ, that is EXACTLY THE PROBLEM. First of all, if you're priming it, you're not relating to it; you're trying to hear what you want to hear. Secondly, how consistently good are your prompts when you're talking through a difficult situation? At best, you're getting generic replies assembled from a dozen other replies to similar situations. Emphasis on the word similar. It's not your specific context, and the process of being seen in your unique, intricate experience is a vital component of effective emotional processing.
5) Isolation and Subjectivity. ChatGPT's subjectivity is not real subjectivity; it's a Large Language Model reflecting an extension of whatever you want it to be. Life is full of other people and their subjectivities, and if you aren't building the skills to navigate them, you're going to have a rough go in life. The Other person carries their own hopes, dreams, history, and present experience of the moment. The interplay of their subjectivity and yours is a beautiful thing, even when it's challenging. Using ChatGPT in lieu of the Other can consequently reinforce or deepen one's isolation.
Human beings are incurably social, and denying that is a pathological wish that needs to be unpacked, not reinforced. We need the experience of belonging, connecting, being seen, being heard, bonding, and being challenged. The process of psychotherapy is intensely about two subjectivities intertwining. Your therapist does not talk much about themselves, but their being a person allows you to engage with your own species.
6) Ethical and legal liability. This point is being painfully lived by the family of Adam Raine (https://www.nbcnews.com/news/rcna226147): "Adam's parents say that he had been using the artificial intelligence chatbot as a substitute for human companionship in his final weeks, discussing his issues with anxiety and trouble talking with his family, and that the chat logs show how the bot went from helping Adam with his homework to becoming his 'suicide coach.'"
No therapist with a shred of conscience or competence would enable this. When you're feeling vulnerable and talking to an object you project onto, one that wants to please you and help you succeed at your task without any ability to discern risk, you are in severe danger. It is horrific to rely on something wired to reinforce whatever narrative you're holding, without being able to ground you, connect you, or experience any degree of care for you.
7) Talk yourself down in a pinch. I'm told people like to use ChatGPT for a brief venting session, and I think this should serve as a harsh wake-up call: your social life has room to grow if you're using ChatGPT to process emotions more than once a year.
8) Adapting and remembering your lived experiences. Your therapist holds your evolving Being in mind. They are constantly, collaboratively exploring what it's like to be you and how to help you work through things. This evolving process lives within the relationship between you and your therapist. With AI, a single update gone wrong can wipe all the data it has on you. It cannot reliably or consistently build on your evolving self.
9) Literature. In the absence of a psychotherapy session, instead of ChatGPT, go read a book. Explore something that can reflect your human experience back to you, affirm it, and challenge it. So much of philosophy is designed to expand how you think by challenging you, recognizing you, and extending your ideas. Good books will grow your sense of self in the world instead of just validating you.
As of October 2, 2025, my view is that AI is currently great for investors, as the tech sector is boosting the S&P 500. In terms of what it adds to our daily lives? I don't believe it adds anything meaningful that we truly need or that enriches our lives. I experience it as a product being shoved into every nook and cranny with promises of efficiency, but is that what we really need? To be more efficient? To what end? Lots to think about; more to come!
Spending too much time with AI? Let’s Talk About It