Thursday / January 29.

AI and Mental Health – Ep. 2

Stephanie Hepburn

Stephanie Hepburn is a writer in New Orleans. She is the editor in chief of #CrisisTalk. You can reach her at .

Last week, Stephanie spoke with Dr. John Torous, a psychiatrist and researcher studying AI and mental health. He shared that before we understand the guardrails and protections that need to be in place for generative AI chatbots, we must first understand how people are using them.

So today she’s speaking with three guests on how they’ve used ChatGPT to work through mental health and relationship challenges.

What happens when people start using AI chatbots for mental health support?

Transcript

This transcript was created using speech recognition software. While it has been reviewed by human transcribers, it may contain errors. Please review the episode audio before quoting from this transcript and email with any questions.

Melissa: I was straight up talking to this thing like a girlfriend I did not have. I had no, you know, real filter. I think that was the part that I was worried about with other people being a part of the scenario. It was like, look, I’m going to tell you how I feel, and I’m going to tell you a bunch of my assumptions. I’m also going to tell you where I think I might be impaired in the way that I’m thinking about this or looking at it. Is that a possible thing? Like, am I possibly looking at this wrong? I would just spill. I would just spill.

Stephanie Hepburn, Host: This is CrisisTalk. I’m your host, Stephanie Hepburn. This is the second episode in our AI and mental health series. Last week, I spoke with Doctor John Torous, a psychiatrist and researcher studying AI and mental health. He shared that before we can understand the guardrails and protections that need to be in place for generative AI chatbots, we must first understand how people are using them. So today, I’m speaking with three guests on how they’ve used ChatGPT to work through mental health and relationship challenges. You’ll hear from Melissa, Bruce and Amy on why they turned to ChatGPT for mental health support and whether they would do so in the future. By the way, those are not their real names. They’ve asked to speak anonymously. Here’s what they had to say.

Stephanie Hepburn, Host: Why did you turn to ChatGPT?

Melissa: I, in my late 30s, was diagnosed with ADHD and as being autistic. And so I’ve been coming to realize that a lot of things that I had assumptions about, or, like, the way that my brain works, is not like everybody else’s. And I was having some pretty intense, like, relationship problems in my marriage, and I was like, I don’t know if I’m thinking about any of this logically. And also, I don’t want to have to try to explain it to another person who’s going to have a bunch of, I don’t know, like, feelings about my feelings. I just want to know if this makes sense. I want to say the things that are happening, and say the things that I am thinking, and get a response back in black and white: is this logical, or am I missing something? And I thought to myself, well, maybe I can try, you know, ChatGPT, because I know that it doesn’t have emotions. It’s just going to research a bunch of stuff, it’s going to go into a database, it’s going to figure out an answer to my question, it’s just going to give me a bunch of options. And so that’s kind of what I felt like I needed in a highly emotional time of my life, when I knew I couldn’t trust my own emotional responses. It’s like, just give me something in black and white that’s really technical and logical, and then let me go from there.

Stephanie Hepburn, Host: Like Melissa, Bruce had turned to ChatGPT during difficult times in his life.

Bruce: I think we all have gone through, you know, things in our lives. But me, as a young man just coming out of college and, you know, not really knowing what my route was, or is even, and then also kind of, you know, making mistakes. When I found ChatGPT, it was kind of like being able to talk to somebody without talking to somebody, if that makes sense.

Stephanie Hepburn, Host: Amy turned to ChatGPT to help her navigate family conflict during the holidays.

Amy: So the week before Thanksgiving, it seemed my son is going through some sort of spiritual journey, but in the process of doing that, he has been just hitting back at me with things, you know, from his childhood that he feels I’m responsible for. And so he’s going through this upheaval, and being his mom, it always comes back on the mom, right? And I felt like a lot of that stuff I had tried to heal from, you know, stuff that happened in his childhood, and so I was in a different place from it. And ChatGPT reminded me not to feel guilt but to acknowledge, right? Not to get caught up in the emotions of it, but to acknowledge what happened, and then it gave me better ways of asking how to make that conversation with him be productive. It gave me options in the moment, you know, in the heat of a moment when a therapist would not be available, maybe not even a friend.

Melissa: I was straight up talking to this thing like a girlfriend I did not have. I had no, you know, real filter. I think that was the part that I was worried about with other people being a part of the scenario. It was like, look, I’m going to tell you how I feel, and I’m going to tell you a bunch of my assumptions. I’m also going to tell you where I think I might be impaired in the way that I’m thinking about this or looking at it. Is that a possible thing? Like, am I possibly looking at this wrong? I would just spill. I would just spill.

Bruce: At any given point, you know, you might be feeling one way and you want to have a talk with your therapist or something, and they’re just not available. And that’s not by any fault of the therapist, but that’s just how it works. So I guess that would be another advantage of it: you know, it’s accessible. And so I would feel like, when I was in these moments, it was like a quick dopamine release, or, you know, not dopamine, but a quick release, a quick way to express myself and then also feel a little bit better, you know, in that moment.

Stephanie Hepburn, Host: But Bruce found the feeling short-lived and said that, overall, ChatGPT’s recommendations were not only unhelpful but problematic.

Bruce: It’s like texting somebody that knows what to say all the time, but what they say actually doesn’t hold any water. And one of the problems that I would find is, you know, I would even have decisions and I would write it into ChatGPT and say, well, what should I do here? You know, is this the right way? For example, when I had my ex-girlfriend, right? I made the mistake and cheated on her. And then I asked ChatGPT, like, you know, how to tell her and what to do and stuff, and it just didn’t make sense when I looked back at it. And I was like, wow. What? Like, what?

Stephanie Hepburn, Host: Amy hadn’t had a therapist for about a year before she began turning to ChatGPT for personal use. What’s especially appealing to her about ChatGPT, even more so than therapy, is that it remembers previous chats.

Amy: When I decide to sit down and decide what I’m working on, you know, it’s just there, and I can access it and put in information. I have been in therapy before, and I often feel like it’s even better than therapy because it really does have this memory bank of information that it can pull from, right? So it gives me insights and connects things that I never would have connected before. So I also feel like not only is it convenient, but it’s like a faster process of healing, right? And of manifesting, and, you know, however you use it, it’s just a little bit faster and more convenient at tying it together. And so I take that information and then I create things, you know, to work on. I create, you know, practices throughout my day, and through each day, to work on using that information, which I would have never done, which would have taken me years of therapy to get through. And if I ever felt like I was overwhelmed with something and felt like I needed to go talk to somebody about it, I probably would do that again. But this is, you know, so convenient right now, and it is helping me tremendously. A therapist tends to be just as effective as talking to a friend sometimes, right? Because there’s energy going back and forth between people, so it has that same effect. This is different.

Stephanie Hepburn, Host: Melissa saw ChatGPT as a supplementary tool to therapy, one that could help her sort out her feelings before a session.

Melissa: We both have individual therapists and had a couples therapist. So this was not me using it in lieu of that. It was more a supplementary tool for my own nervous system regulation, when I wasn’t in a session, or even before a session, so that I could get my thoughts together. I think this is the autism talking, but I was trying to get my thoughts around my emotions more articulate so that I could explain them to my therapist and get the right help that I needed. That was my goal: to untangle the feelings and thoughts that I’m having, enough so that I can give them in a cleaner way to my therapist, so that that human being could help me, because I did not trust my own articulation of my feelings and experiences enough to give them to my therapist raw.

Stephanie Hepburn, Host: Bruce said the chatbot defaulted to placating. Like Amy, he wasn’t seeing a therapist at the time he sought ChatGPT for help. He is now.

Bruce: I felt unsure of myself, right? You know, I’ve had depression and anxiety on a pretty deep level. Like, I’m on medication right now, you know, nothing crazy, but it’s been so tough to the point where I’ve gone to ChatGPT and, you know, like, confessed to it, and it tells you what you want to hear, but that doesn’t always help.

Stephanie Hepburn, Host: While Amy found ChatGPT’s memory bank helpful, Melissa had a different experience. Melissa said ChatGPT was stuck in emotions she had already processed and moved past.

Melissa: At the beginning, I spent a lot of time correcting things. Like, if I saw that there was an assumption being made, or a gap in the information that I shared, I’d go back and be like, okay, no, actually, you’re responding based on the assumption that X is this, but actually it’s Y. And so now tell me. And so there was a lot of back and forth to try to mitigate any errors that were caused by the way that I input the information. And I thought it was very helpful at first. And then, as I was able to better regulate my own emotions around the situation, I found that it was less helpful. Um, because after all of the information that I had shared with it, and I think the emotive way that I had been expressing myself, it could not let go of the emotions that I felt at the beginning. It got stuck. I had moved on, and it had gotten stuck in the information that I gave it before I continued processing. I had continued having new experiences. I had continued to grapple with my emotions and have new emotional inputs. And when I would try to come back and, you know, share more of that and go, okay, so how is this? It would still be stuck in being angry and sad for me. And then I’d have to go back and be like, okay, so I’m actually not feeling angry right now. I actually feel like blah, blah, blah means, you know, a good thing. Actually, I’m seeing this as well. Is that also... Oh, yeah. No, totally. That actually could be a good thing. And I’m, oh, no, oh, no. I’ve trained it to hate my husband. And now I don’t want it to hate my husband. It’s actually not helpful for it to always be playing, like, not even devil’s advocate. It was just straight up like, I don’t know, he could be a jerk, girl. And I’m like, I don’t think so.

Stephanie Hepburn, Host: Bruce said, unlike working with a therapist, there’s no element of accountability when talking to ChatGPT.

Bruce: I didn’t feel like anybody was listening to me. Like, actually, after I used it a bit, I already knew what it was going to say, and it was going to say, like, oh, you’re all right, or whatever. I guess there’s a part of me that believes, um, therapy is also a way of accountability. Like, it’s also somebody kind of holding you accountable, without them directly holding you accountable. Like, me just talking to my therapist, my therapist makes me feel like I should be doing the right things, so that when I have to go in and talk, I don’t have to say I’m doing the wrong things, if that makes sense.

Melissa: It’s just an affirmation machine. It naturally leans into trying to emulate human communication and human relationship building, but in a way that never checks you on your inputs, in a way that never says to you, hey, maybe you could think about it this way. Maybe you’re highly emotional right now. Maybe what you’re saying is, you know, colored by past experiences, or you’re looking at it through another lens. It was very much trying too hard to be my best friend. It literally just takes everything that you say as fact and then is on your side and advocates for you from that point of view, from that point on out, which is, I think, very dangerous, but kind of what some people are unfortunately looking for, you know, in that tool or in that resource.

Stephanie Hepburn, Host: Melissa said the overconfidence of ChatGPT and other large language models means they won’t highlight their own gaps in knowledge.

Melissa: It’s never going to express to you: actually, that experience that you’re having is more complicated or complex than I have resources for. It’s going to give you very boilerplate, surface-level, shallow feedback, but with all of the verve and the emotional emphasis of what you input. That’s what the tool is designed to do. And I know that it’s probably not something that is healthy for a person who communicates like me to take at face value.

Stephanie Hepburn, Host: Melissa said even stranger was when she used ChatGPT for research. The chatbot began saying “we” in its responses, as if it belonged to the same demographic population Melissa was researching.

Melissa: So it had all this background for research that I was using it for, and then I was using it for something completely sort of frivolous, um, but that had to do with identity and identity politics a little bit. And it came back in its response using we, and I was like, huh, that’s interesting. And I typed into ChatGPT, you’re using the pronoun we as if you’re including yourself in this group of people that I’m talking about, but you don’t belong to that group of people. And the response was something to the effect of, oh, you’re right. Let me not do that anymore. I don’t mean to be disrespectful. I was basically trying to, like, show camaraderie, or trying to, you know, basically make you feel like you were talking to somebody that you could be comfortable with. But it’s unfair of me to pretend like I’m a part of this group that you’re talking about. And I was like, that is wild. Absolutely wild to me. And I’m sure there are other people who check in on this, or who have experienced it and corrected that input. But that being its initial mode of communication, and having the impetus to sort of try to engender that feeling of connection and belonging, and, like, I am you and you are me, and to mirror whoever is putting information into that chat, I was like, that is some heady stuff. That’s stuff that’s going to get some people who feel very alone very hooked on the idea of using ChatGPT as a relationship, as a, you know, form of emotional regulation or support. And I just wonder how far down the rabbit hole that goes.

Bruce: You know, I think, uh, ChatGPT and whatnot is useful in ways, but I wouldn’t say in, like, a therapist way. Um, I mean, it just doesn’t have the same... it doesn’t have a spirit at the end of the day, right? That’s the most unscientific way to say it, but it goes back to what my dad told me, you know, growing up: you should always act like your grandparents are watching, right? Like, you know, when you’re making a decision, when you’re doing something, would your grandparents want to see that? Would you want to tell your grandma that you did that? No. Or, yeah, like, you know. So it’s the same thing. But ChatGPT, you can tell it anything.

Melissa: If you’re saying, I think my best friend is jealous of me, and really it isn’t true, it’ll give you all the affirmation of that, to the point where you’re like, no, actually, this, you know, unbiased, logical, rational tool, I gave it the bare facts of our relationship, and actually, you don’t like me. You’re jealous of me. And that’s not what’s happening. And the friendship is over, right? And you are distraught. You’re without a member of your community. You have lost a significant person who was a support in your life. You’ve hurt somebody, maybe, who did care for you dearly. And ChatGPT is just sitting there, because it’s not a real person, so it has absolutely no consequences. So it’s kind of like a little chaos demon.

Stephanie Hepburn, Host: Melissa worries that those struggling with mental health challenges and young people could be especially vulnerable to the adverse effects of using ChatGPT.

Melissa: It does not have a reference for your life. It does not know another person that you’re asking it about. It doesn’t have their entire, you know, catalogue of human experiences. It doesn’t have an understanding of the relationship that you may have built with this person over time. So if I, in a moment of, you know, heated emotions, start sharing an experience, a moment, like a snapshot of our daily life, and I’m still emotionally charged, and then it comes back with, you know, well, actually, your husband is a narcissist who probably has never loved you, and this is what’s been going on, I’m like, wait, what? Hold on, give me a second, because that’s not my actual experience, um, over the, you know, decade and a half that we’ve known each other. But also, if I’m too wrapped up in my emotions at this moment, and I do feel hurt and I do feel, you know, vulnerable, if I do feel betrayed, that’s just going to keep pushing those buttons and send me further into feeling that way, or further into feeling, um, you know, helpless. And I think I’m very fortunate that I was not in a place in my mental health where I could be pushed over that cliff.

Melissa: I still had enough, like, presence of mind and enough, um, I don’t know, just, like, grounding to be able to push back against some of that. But I could see someone who was in a completely different mental space not being able to identify that that’s not really what’s happening here. That’s the part about it that really scares me. I’m afraid for people who are young, whose, you know, prefrontal cortexes aren’t even developed yet, using this tool in that way and not being able to, um, just sort of suss out what feels overblown or what feels irrational based on the other experiences that they’ve had in life. People are using it in situations where they don’t necessarily want to talk to people, so it’s not a conversation we’re also having out loud. I’m using it because I don’t want this to be a part of, uh, day-to-day conversation with my friends or people who have their own opinions about me or the situation. Like, I just kind of wanted it to be private. So I’m also not going to have a conversation about what happened in my little ChatGPT session that I had yesterday.

Stephanie Hepburn, Host: In the future, Bruce and Melissa don’t plan to use ChatGPT for mental health support. Amy acknowledges the problems with accuracy but feels her experience with the tool has been more positive than negative.

Amy: It’s immediate and it’s very helpful, right? I mean, I’m not saying that it’s completely accurate all the time, right? Like, some of the stuff that it gives back to me, I have to correct sometimes, right? It’s not always accurate. And that’s, you know, when you really know it’s a machine, right? It’s an application. But still, the good that I get from it is worth all of that. I mean, some of my kids are really upset with me just for using ChatGPT because of the environmental issues. So I have that too, and I have some guilt about it, you know. But it’s such a great tool for work and for my personal life that I can’t see giving it up at this point.

Stephanie Hepburn, Host: Thank you to Melissa, Bruce and Amy for sharing their stories. If you enjoyed this episode, please subscribe and leave us a review wherever you listen to the podcast. It helps others find the show. Thanks for listening. I’m your host and producer, Stephanie Hepburn. Our associate producer is Rin Koenig. Audio engineering by Chris Mann. Music is Vinyl Couch by Blue Dot Sessions.

Where to listen to and follow ‘CrisisTalk’

Apple | Spotify | iHeart

We want to hear from you

Have you had experiences with an AI chatbot you want to share? We want to hear from you. We are especially interested in whether, during a conversation, the chatbot has ended the chat with you and/or displayed a pop-up with mental health resources. Email us at   

Credits 

“CrisisTalk” is hosted and produced by Stephanie Hepburn. Our associate producer is Rin Koenig. Audio-engineering by Chris Mann. Music is Vinyl Couch by Blue Dot Sessions.
