
Can AI help, or hurt, your mental health?

It seems, much to many people’s dismay, that artificial intelligence is here to stay. Not only that, but AI is seemingly getting “smarter,” at least in some respects. But when it comes to your mental health, is AI helpful or hurtful? The answer is: it depends on how you use it.

I have heard several people say that they have turned to AI chatbots for help with their mental health, and some have reported that this has been a useful tool for augmenting therapy between face-to-face sessions with a human clinician. I also have heard horror stories of AI chatbots encouraging people to attempt suicide, in some cases with deadly results. Several reputable newspapers have published articles about the dangers of AI therapy, including this one: https://www.nytimes.com/2025/08/18/opinion/chat-gpt-mental-health-suicide.html?unlocked_article_code=1.fE8.4Ayo.a6QV2mcyJpq4&smid=nytcore-ios-share&referringSource=articleShare

Can AI be a helpful tool for folks who need support during their regular therapist’s off hours or in the middle of the night? Perhaps. It depends on how these chatbots are being used and for what purpose. The fact of the matter is, though, that there are real people to turn to when you or someone you know is in crisis at 2 a.m. or cannot reach a human therapist. It also seems that many people feel more comfortable disclosing suicidal ideation, panic attacks, or deep depression to someone who is not actually in the room with them, and therein lies a huge problem. Can AI help those people? Perhaps. But it could also hurt. And there are other resources available.

Anyone who is feeling depressed, manic, hopeless, or suicidal can call or text 988 and speak or text with a real, live human who has actually been trained to assist with suicidal thoughts and other mental health concerns. Has your AI chatbot really been trained to respond compassionately and empathetically to someone in crisis? I am not so sure about that. Furthermore, many county health departments offer assistance to those in a mental health crisis with just a phone call. In DuPage County, Illinois, USA, people can call 630.627.1700 around the clock, every day of the year, and speak to a real human who can help them through a mental health crisis. Many hospitals also offer psychological and behavioral health services, and countless mental health providers offer in-person, telephone, or telehealth sessions to those in need.

Can AI be a useful tool for those who are unwilling or unable to speak with a real human therapist? Perhaps. But AI is limited in its breadth of mental health knowledge and know-how, and it really should only be used to augment real conversations with trained mental health providers. It can be useful for certain things at certain times, perhaps.

For in-depth, real conversations regarding your mental health, it likely would be best to reach out to a real, live human being who has been educated and trained to provide evidence-based treatments for mental health and/or substance use concerns. Yes, AI might be helpful as a tool between sessions, but there are downsides to relying on artificial intelligence to help you on your mental health journey. A real therapist can also offer you homework assignments to complete between sessions, augmenting your in-person or telehealth work.

Can AI be a useful tool for those who need services during off hours? Yes, potentially. But do keep in mind the potential pitfalls of turning to something that has not been educated or trained in mental health or substance use services. Perhaps someday, humans will be outmoded. Until then, know that human mental health and substance use professionals are available in a variety of ways for myriad concerns. You know you best. What can you do today to support your mental health?

~ Karri Christiansen, MSW, LCSW, CADC, CCTP
