St. Louis Argus

Thomas Plante: The Mental Healthcare Crisis

By Corey S. Powell
August 2, 2024
in The Narrative Matters

Learn about Thomas Plante’s efforts to tackle the mental healthcare crisis and end the stigma surrounding mental illness, while improving access to treatment options.

Plante, a psychologist and ethicist, weighs the pros and cons of chatbots. Can AI fill in for a shortage of human therapists?


When you can’t get out of the house and feel strapped for cash, AI therapy could be a useful tool. Credit: DALL-E


Therapy has changed a lot since the days of Sigmund Freud, from the advent of cognitive behavioral therapy techniques to the use of psychotropic drugs. Now the field is facing a huge shift in who, or rather what, is interacting with patients. Over the past decade, tech companies have begun rolling out chatbots (interactive AI programs) that use algorithms to dispense mental health advice. In a 2021 survey, 22 percent of Americans said they had already experimented with one of these computerized therapists.

Thomas Plante, a psychology professor at Santa Clara University and adjunct professor of psychiatry and behavioral sciences at Stanford University School of Medicine, has watched the rise of mental health machines with a mix of excitement and concern. Plante, who teaches at the Markkula Center for Applied Ethics, wonders about the ethical implications of therapy without the therapist. He also runs a private clinical practice, where he sees up close what works and what doesn’t for his patients. Here, he talks with Corey S. Powell, OpenMind’s co-founder, about the pros and cons of AI therapy. (This conversation has been edited for length and clarity.)




Statistics show that a record number of Americans rely on therapy and other forms of mental healthcare. I also hear reports of a chronic shortage of therapists, so many people who need help aren’t getting it. Is there a mental healthcare crisis in the United States?

Absolutely. The Surgeon General at the end of 2021 issued an unprecedented advisory about this mental health crisis. The evidence suggests that a lot of people have significant concerns about mental health across the age span, particularly among youth. We call it a mental health tsunami: anxiety, depression, substance abuse, suicidality—they are all off the charts. There are not enough mental health professionals out there. And even if there were, it’s a hassle. It’s expensive, and a lot of things aren’t covered by insurance. It makes sense that people would be turning to web-based approaches for their mental health issues and concerns.

Do you see an important role for AI therapists? Is this technology good enough to fill in some of the shortage of human therapists?

This is all brand new. In the health world, we always say that before you offer treatment to the public for anything, it has to go through randomized clinical trials to demonstrate that it works. You just don’t want a willy-nilly rush into something that may not only not work but could hurt people. I live in Silicon Valley, where one of the mottos is “move fast and break things.” Well, you don’t want to do that when it comes to mental health treatment. You want to do those studies so that, at the end of the day, you’re giving people evidence-based best practices. But some people don’t want to spend the time. They create these things and just launch them. There are issues with that.

You run a clinical practice. Do your patients use any of these online therapy tools and, if so, do they find the tools effective?

I’ve had patients who use some of the popular apps like Calm (a meditation app), which has been around for a while. I think that’s fine. A lot depends on the person’s diagnosis. When people are looking for some stress relief, maybe they have mild to moderate anxiety, these kinds of things make good sense. But if they are experiencing major psychopathology—let’s say, schizophrenia or major depression that includes active suicidality—then I would worry about overreliance. 


I’ve noticed that people have a strong tendency to project consciousness and intent onto these AI systems. Could that impulse help make therapy bots seem more believable—or maybe too believable, depending on your perspective?

That’s a great and complicated question. People project their desires and fantasies onto other human beings, and they do it onto devices too. It can help with the placebo effect: If you’re told that this is a great app and it helps lots of people, you’re going to expect it will help you too. But then you might expect perfection from it because you can’t see the flaws. When you’re working with a human being, even if you think they’re very helpful, you can see the flaws. Maybe your therapist has a messy office, or they were late for your session, or they spilled their coffee on their notes. You can see that they’re far from perfect. You don’t see that with the computer-generated chatbot.

The other thing that’s important to mention is that the helpfulness of therapy for most people goes beyond technique. Part of the helpfulness of therapy is having that human being who is on your side, journeying with you as you face difficult things. Something as simple as listening without judgment and with empathy goes a long way. A chatbot is not going to do that.

I can see the limitations of therapy chatbots, but could they be actively harmful? You alluded to the risks of not going through clinical evaluation. What are those risks?

A lot of people—not just the average people on the street but also the average person working in the industry—seem to think that you can’t hurt people with these apps. You know, “They give you some less-than-perfect advice, but it is not exactly like they’re doing brain surgery.” And yet we know that people can be harmed by licensed mental health professionals, never mind these apps. For instance, anorexia nervosa has a fairly high mortality rate. The last thing you want to tell a teenage girl who’s suffering from anorexia is to congratulate her for eating less. (This happened last year with the Tessa chatbot created by the National Eating Disorders Association.)

These are the challenges of companies not going through the kind of approval process they would have to do if they were offering pharmaceuticals. We also don’t know a lot yet about how much they can help. You have to do longitudinal research, and it takes a while for that research to happen.

The reality is that millions of people are already using therapy bots, and that number is surely going to increase. Where do you see this field heading?

More and more in our culture, we are looking toward computer-related ways to solve our problems. You want to buy a book? You go to Amazon, when you used to go to the bookstore. A lot of people want services through their computer, because there are a lot of advantages: The convenience and cost are very compelling. For people who have mild to moderate issues, it’s possible that AI is going to become the go-to approach, although it’s just one tool of a variety of tools that could be used to help people.

The fear is that the technology is oversold, and people might not get effective treatment from a licensed therapist (when they need it) because they’re overly reliant on this convenient app. If you want to get the most out of your life, you need to use all the tools that are available to you. There’s no one-size-fits-all solution.


This Q&A is part of a series of OpenMind essays, podcasts and videos supported by a generous grant from the Pulitzer Center’s Truth Decay initiative.

#mentalhealthawareness #mentalwellness #endthestigma

Tags: mental illness, stigma, therapy

© 2025 JNews - Premium WordPress news & magazine theme by Jegtheme.
