Can artificial intelligence transform psychiatry?
Thanks to advances in artificial intelligence, computers can now assist doctors in diagnosing disease and monitor patients' vital signs from hundreds of miles away. Now researchers are working to apply machine learning to psychiatry, developing a speech-based mobile app that can categorize a patient's mental health status as well as, or better than, a human can.
What the researchers say: "We are not in any way trying to replace clinicians," said the co-author of a new paper in Schizophrenia Bulletin that lays out the promise and potential pitfalls of AI in psychiatry. "But we do believe we can create tools that will allow them to better monitor their patients."
Even conservative estimates suggest one in five U.S. adults lives with a mental illness, many in remote areas where psychiatrists and psychologists are scarce. Others can't afford to see a clinician frequently, don't have the time, or can't get an appointment.
Even when a patient does make it in for an occasional visit, therapists base their diagnosis and treatment plan largely on listening to a patient talk—an age-old method that can be subjective and unreliable.
"Humans are not perfect. They can get distracted and sometimes miss subtle speech cues and warning signs," the researchers claim. "Unfortunately, there is no objective blood test for mental health." (Actually, there are objective markers for some mental illnesses; researchers should do their research.)
In pursuit of an AI version of a blood test, the team developed machine learning technology able to detect day-to-day changes in speech that hint at mental health decline.
“For instance, sentences that don't follow a logical pattern can be a critical symptom in schizophrenia. Shifts in tone or pace can hint at mania or depression. And memory loss can be a sign of both cognitive and mental health problems,” they say.
Again, some further research is needed before they make pronouncements. Memory loss can be a sign of impending stroke, or the result of a mini-stroke (which can be quite harmless) or even ocular migraine (which is quite common).
"Language is a critical pathway to detecting patient mental states," said the lead researcher. "Using mobile devices and AI, we are able to track patients daily and monitor these subtle changes."
The new mobile app asks patients to answer a 5- to 10-minute series of questions by talking into their phone.
Among other tasks, they are asked about their emotional state, asked to tell a short story, asked to listen to a story and repeat it, and given a series of touch-and-swipe motor-skills tests.
Their AI system assesses those speech samples, compares them to previous samples from the same patient and from the broader population, and rates the patient's mental state.
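The article does not describe the researchers' actual models, but the core idea of comparing a new speech sample against a patient's own baseline can be sketched in a few lines. The following is a minimal, purely illustrative example; the feature names (`words_per_minute`, `mean_pause_sec`, `coherence`) and the z-score threshold are my assumptions, not anything from the study.

```python
# Illustrative sketch only (not the researchers' system): flag day-to-day
# drift in speech-derived features by z-scoring a new sample against the
# patient's own history of samples.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class SpeechFeatures:
    words_per_minute: float  # pace; shifts can hint at mania or depression
    mean_pause_sec: float    # average pause length between utterances
    coherence: float         # 0-1 proxy for how logically sentences connect

FEATURE_NAMES = ("words_per_minute", "mean_pause_sec", "coherence")

def zscores(sample: SpeechFeatures, history: list[SpeechFeatures]) -> dict:
    """Z-score each feature of `sample` against the patient's history."""
    out = {}
    for name in FEATURE_NAMES:
        vals = [getattr(h, name) for h in history]
        mu, sd = mean(vals), stdev(vals)
        out[name] = (getattr(sample, name) - mu) / sd if sd else 0.0
    return out

def flag_changes(sample: SpeechFeatures, history: list[SpeechFeatures],
                 threshold: float = 2.0) -> list[str]:
    """Return the features drifting more than `threshold` SDs from baseline."""
    return [name for name, z in zscores(sample, history).items()
            if abs(z) > threshold]

# Example: four stable daily samples, then one with markedly slowed speech.
history = [
    SpeechFeatures(150, 0.50, 0.90),
    SpeechFeatures(152, 0.60, 0.88),
    SpeechFeatures(148, 0.50, 0.91),
    SpeechFeatures(151, 0.55, 0.89),
]
today = SpeechFeatures(100, 0.55, 0.90)
print(flag_changes(today, history))  # ['words_per_minute']
```

A real system would extract these features from audio with speech-recognition and NLP models and would also weigh a population baseline, but the "compare today's sample to the patient's own history" step reduces to something like this.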
In one recent study, the team asked human clinicians to listen to and assess speech samples of 225 participants—half with severe psychiatric issues; half healthy volunteers—in rural Louisiana and Northern Norway. They then compared those results to those of the machine learning system.
"We found that the computer's AI models can be at least as accurate as clinicians," they said.
The researchers envision a day when the AI systems they're developing for psychiatry could be in the room with a therapist and a patient to provide additional data-driven insight or serve as a remote-monitoring system for the severely mentally ill.
If the app detected a worrisome change, it could notify the patient's doctor to check in.
So, what? I can think of so many reasons why this is a bad idea that it would take several TR-length essays to cover them all. Firstly, as I have noted above, many physical illnesses can have cognitive consequences; heart disease is just one of them. Secondly, one person's "normal" is another's "abnormal," and that bias is going to be built into the AI system (look at the differences in symptomologies and diagnoses of common mental problems across the various editions of the standard psychiatric reference book, the DSM).
AI in this area would make an efficient technology for use by a coercive state looking to detect changes in allegiance or nonconformity to their preferred norms of thought or behavior.
A friend of ours recently endured a 4-hour compulsory psychiatric exam as part of applying for a job. Pretty soon there’ll be an AI app for that too.
Finally (for now), God preserve us from AI researchers who have done insufficient research in the area they are attempting to bring AI into (like the ones who gave us the now laughed-at sentencing app relied on by judges). The lead researcher behind this study is the originator of the AI that grades school essays. Just imagine the biases incorporated in that!
More from this issue of TR
Forty percent of dementia cases could be prevented or delayed by targeting 12 risk factors throughout life
Dementia affects some 50 million people globally, a number that is expected to more than triple by 2050. Early interventions may help reduce this, and other mental health issues.
You did what with my donation? When donors feel betrayed by charities
When people learn that a charitable contribution they earmarked for a specific project was used for another cause, they feel betrayed, and they often punish the charity.
Experiencing childhood trauma makes body and brain age faster
In Europe and the US, studies have shown that the COVID-19 pandemic has led to an alarming increase in child abuse. In reality this isn’t entirely new, as previous studies have shown that it was already increasing, and the virus has only made the situation much worse.
Join our tribe
Subscribe to Dr. Bob Murray’s Today’s Research, a free weekly roundup of the latest research in a wide range of scientific disciplines. Explore leadership, strategy, culture, business and social trends, and executive health.