Seeking help for mental health concerns is smart. But can AI do the job online, filling in for the shortage of human providers? Here’s what you should know.
But in the United States, affordable help is especially hard to find. Canada, Switzerland, and Australia have about twice as many mental health workers per person as the U.S., which also relies more heavily on social workers and nurses, with fewer psychologists and psychiatrists than elsewhere.
Enter artificial intelligence (AI). You may one day opt to interact with software in the form of an avatar that responds via your smartphone or laptop. You may even like it.
Cognitive behavioral therapy, designed to change thinking patterns, is a science-backed remedy for depression and insomnia. A program called Woebot is already available on smartphones. The company reports that “it has been shown to establish a lasting working alliance with users akin to the bond formed between humans.”
According to Eric Topol, MD, director of the Scripps Research Translational Institute, “people are more comfortable sharing their innermost secrets with an avatar than with a human being.”
Proponents say that AI will:
- Provide quick, accurate diagnoses
- Screen for people at risk of self-harm or suicide
- Pinpoint the therapy, drug, or drug class most likely to help a given patient

How a virtual therapist may understand what’s wrong
Nearly 125 million people in the United States use a digital voice assistant, asking Siri, Alexa, or some other program to perform a task or answer a question. Voice recognition and machine-learning code equip the assistant to respond. Therapist avatars will make and process observations of human behavior, from speech to sighs to heart rate.
The next step is “deep learning” — a subset of machine learning that allows machines to draw conclusions on their own.
AI-assisted imaging and other tests are already alerting doctors to diseases and other potential problems that humans are unlikely to notice.
Some data important for mental health emerges from medical records. For example, Veterans Affairs and other health systems flag patients and offer help based on suicide risk algorithms, which tackle a notoriously difficult problem: therapists have had little success predicting imminent danger. A pain prescription or a recent injury, for example, might trigger further investigation.
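The flagging step described above can be pictured as a scoring rule over medical-record features. The sketch below is a toy illustration only, not the VA's actual algorithm: the feature names, weights, and threshold are all invented for the example (real systems are trained on large clinical datasets).

```python
# Toy sketch of record-based risk flagging.
# Feature names, weights, and the threshold are invented for illustration.

def risk_score(record: dict) -> float:
    """Combine a few hypothetical medical-record features into a score."""
    weights = {
        "recent_pain_prescription": 0.4,
        "recent_injury": 0.3,
        "missed_appointments": 0.2,
        "prior_mental_health_visit": 0.1,
    }
    # Each feature present in the record adds its weight to the score.
    return sum(w for feature, w in weights.items() if record.get(feature))

def flag_for_follow_up(record: dict, threshold: float = 0.5) -> bool:
    """Flag a patient for outreach when the score crosses the threshold."""
    return risk_score(record) >= threshold

# A pain prescription plus a recent injury crosses the (invented) threshold.
patient = {"recent_pain_prescription": True, "recent_injury": True}
print(flag_for_follow_up(patient))  # prints True
```

In a deployed system the weights would be learned from outcomes data rather than hand-set, but the shape of the decision, score the record and flag above a cutoff, is the same.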
Another potential source is social media. Researchers are fine-tuning ever-more-complex models to predict risk based on Facebook postings. The most effective “predictions did not rely on explicit suicide-related themes, but on a range of text features,” the authors of one study said.
Large employers have already signed up to offer employees free and immediate mental healthcare through companies like Spring Health, which use machine-learning models to target care. After answering questions about personal problems and behaviors, an employee is directed to an in-network provider, who has already heard from Spring Health about the treatments most likely to help.
A virtual therapist could be programmed to pick up non-verbal revelations like changes in the way you touch a keyboard or frequent sighs. Some people might not realize they are depressed, Topol notes. “The intonation of all the aspects of our speech, relative to our baseline, can tell if a person is depressed, better than they know themselves, subjectively. Then you add on things like your breathing pattern — there are so many ways we can objectively determine and track continuously the state of mind.”
Scientists are working to optimize the inputs, such as which signs are the most important or necessary for a diagnosis.
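One way to track signals "relative to our baseline," as Topol puts it, is to compare each new reading against a person's own history. The sketch below is a minimal illustration under assumptions: the signal (sighs per hour), the baseline values, and the two-standard-deviation cutoff are all invented for the example.

```python
# Toy sketch: compare a behavioral signal against a personal baseline.
# The signal, readings, and cutoff are assumptions for illustration.
from statistics import mean, stdev

def deviation_from_baseline(baseline: list[float], current: float) -> float:
    """Standard (z-) score of today's reading against this person's history."""
    return (current - mean(baseline)) / stdev(baseline)

# Hypothetical week of readings: sighs per hour.
baseline_sighs = [4.0, 5.0, 4.5, 5.5, 4.0, 5.0, 4.5]
today_sighs = 9.0

z = deviation_from_baseline(baseline_sighs, today_sighs)
if z > 2.0:  # well above this person's own normal range
    print("signal worth a closer look")
```

The same comparison could run continuously over many signals at once (speech intonation, breathing pattern, typing rhythm), which is what makes an always-on avatar different from an occasional office visit.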
Experienced psychiatrists and psychotherapists draw on their training and rapport with a patient. But computer-based systems can take in a different range of data, might be less subject to bias, and might be more flexible and better able to catch and adapt to changes in science or a patient.
Albert “Skip” Rizzo, PhD, director of medical virtual reality at the University of Southern California’s Institute for Creative Technologies, and his research team are developing virtual humans to help real humans with mental health issues.
Rizzo insists he and his colleagues are not creating virtual therapists to replace human psychiatrists and psychologists. Instead, he says his work can help alleviate the shortage of mental health providers, with avatars that never tire, are always available, and can build a huge database of individual patients’ concerns, symptoms, and needs.
The avatars can also help people who hesitate to reach out to doctors. Instead, they can privately consult a virtual therapist and see whether it helps.
Although the whole idea may seem cold and impersonal, more people might benefit, says Topol, author of “Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again.”
“We have a grossly insufficient amount of (human) professionals to help people with depression and other mental health issues. It’s critical we get this straightened out,” he says.
October 24, 2023
Janet O'Dell, RN