Verified by Psychology Today

Artificial Intelligence

Would You See an AI Therapist?

The transition to an AI therapist may be easier than you think.

Key points

  • AI therapists may offer unique and substantial conveniences.
  • The transition to AI may happen not out of a rejection of human therapy but simply because of convenience.
  • Research suggests that AI can sound more empathic than human therapists.

Picture this:

Life is stressing you out, and you’ve been irritable with your family. Even walking the dog, an activity that you typically enjoy, is frustrating you – how many flowers do you have to smell? You know it would be smart to see a therapist, but who has the time to even find one – let alone schedule an appointment and show up? You must comb through the therapists covered by your insurance, see who has availability, and determine if your schedule is compatible with theirs. Plus you’ve been down this road before – it’s not necessarily easy to find a therapist you resonate with. Doing nothing is easier, cheaper, and requires less effort…

Source: Zainagatdinov Marat/Shutterstock

And now picture this:

Life is stressing you out, and you’ve been irritable with your family. Even walking the dog, an activity that you typically enjoy, is frustrating you – how many flowers do you have to smell? You know it would be smart to see a therapist. You recall an ad for these super inexpensive and research-backed AI counselors. Apparently, lots of people rated them highly. Why not give one a try – you can always seek out the help of a real therapist if it doesn’t work out. And furthermore, you’ve talked to real therapists before, and it’s not always comfortable or even helpful. So, you download the app, and within five minutes you’ve been asked a series of questions designed to create your perfect counselor – your choice of gender, age, and appearance. You select their “specialty” based on what you plan to talk about. You even decide their personality – chatty, maternal? Whatever floats your boat. Seconds later you are asked if you are in a private location, and then a very real-looking avatar shows up on your computer screen. “How can I help?” she says with a warm smile, as she pushes strands of long hair behind her ear. You feel yourself smile in response, and you silently applaud yourself for taking a much-needed step toward self-care.

And that’s how I envision it – the increased popularity of AI therapists will be subtle, but not necessarily slow to unfold. AI therapists won’t become popular because we reject human versions, or because we initially consider them more qualified. It’s just that they will be so simple to use. And ultimately, they will prove themselves to be quite beneficial.

Research supports this transition.

Just as it’s hard to imagine that AI will replace all human therapists, it’s equally hard for me to imagine that AI therapists will be unlikeable and unhelpful. Particularly for younger generations, who show more willingness to engage AI in all aspects of their lives, the transition to AI mental health won’t be as dramatic as it may sound at this moment. In addition, AI won’t have bad days, brain fog, or take weekends and evenings off. In fact, it will be available 24 hours a day, so when you can’t sleep, your therapist will be right there with you. Your AI therapist won’t be distracted by hunger pangs or a sick child. It will remember everything you tell it, and it will be able to integrate everything you have ever said to it in seconds. It may not even come across as less compassionate than a human therapist – one research study demonstrated that AI can already be perceived as more empathic and helpful than a human therapist (Vowels, 2024). Furthermore, AI can sometimes look more human (Tucciarelli et al., 2022) and trustworthy (Nightingale & Farid, 2022) than people. Research also shows that some folks actually prefer talking to an AI therapist precisely because of what it lacks – the potential negative judgment of others that is inherent in humanity (Pickard et al., 2016). Perhaps most importantly, recent research suggests that people experience similar emotional benefits whether they believe they are self-disclosing personal information to a human or to a chatbot (Ho et al., 2018).

Of course, some people will always prefer a human therapist. Further, human therapists will probably always be necessary in certain contexts, such as inpatient psychiatric services. And there are risks with AI, like data breaches, that will create very real challenges. But I expect that certain therapists will be hit harder by the AI transition than others – for example, younger, less experienced therapists may be more easily replaced by AI models. Further, therapists who charge exceptionally high prices may be priced out of the market by inexpensive and easily accessible AI. And this AI transition isn’t unique to therapists – we could have a similar dialogue about physicians, lawyers, financial advisors, and teachers. All professions will feel the shift to AI in the coming years.

Technology is already changing what it means to be human. For many of us, this transition gives us pause, as changes often do. I am no exception – while I feel this evolution is inevitable, it makes me sad personally and professionally. Being a psychologist has added so much to my life – it’s a pivotal part of my identity, giving my life purpose and satisfaction. At the same time, we must ask ourselves whether there is harm inherent in this transition. As much as change can be unsettling, we are poised to benefit greatly from the many changes already unfolding before us. It’s an astounding time to be alive, witnessing the unfolding of the future of intimacy.

References

Ho, A., Hancock, J., & Miner, A. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712–733. https://doi.org/10.1093/joc/jqy026

Nightingale, S., & Farid, H. (2022). AI-synthesized faces are indistinguishable from real faces and more trustworthy. PNAS, 119(8). https://doi.org/10.1073/pnas.2120481119

Pickard, M., Roster, C., & Chen, Y. (2016). Revealing sensitive information in personal interviews: Is self-disclosure easier with humans or avatars and under what conditions? Computers in Human Behavior, 65, 23–30. https://doi.org/10.1016/j.chb.2016.08.004

Tucciarelli, R., Vehar, N., Chandaria, S., & Tsakiris, M. (2022). On the realness of people who do not exist: The social processing of artificial faces. iScience, 25, 105441. https://doi.org/10.1016/j.isci.2022.105441

Vowels, L. (2024). Are chatbots the new relationship experts? Insights from three studies. Computers in Human Behavior: Artificial Humans, 2, 100077. https://doi.org/10.1016/j.chbah.2024.100077

More from Marianne Brandon Ph.D.