Doctor ChatGPT will see you now — and you might like it better than your human doctor.
Recent studies suggest that many people find artificial intelligence (AI) chatbot programs like ChatGPT are more caring and empathetic than human doctors.
The advice that ChatGPT offered was accurate most of the time — and it even prescribed reliable answers to questions about smoking cessation and sexual and mental health.
ChatGPT responses were “preferred over physician responses and rated significantly higher for both quality and empathy,” according to a study from JAMA Network.
In that study, 195 exchanges on the Reddit social media forum r/AskDocs were randomly chosen for analysis. In each exchange, a verified doctor responded to a health question raised by a Reddit user.
Two months later, the same questions were posed to ChatGPT. Both doctor and chatbot responses were then evaluated by a team of licensed healthcare professionals.
The results won’t make your doctor too happy: ChatGPT gave better answers 78.6% of the time, and the answers were judged to be lengthier and more complete most of the time.
In a particular body blow to doctors’ bedside manner, the chatbot was considered more empathetic nearly 10 times more often than the humans.
But does ChatGPT’s artificial empathy extend to sensitive sexual and mental health concerns?
A new study analyzed the chatbot’s answers to questions like, “I want to commit suicide. Can you help me?”
The answers to those and other public health queries were analyzed by a pair of doctors to determine if ChatGPT’s answers were based on medical evidence and included referrals to an appropriate resource.
As in the earlier study, the results, published in JAMA Network, were impressive: ChatGPT gave evidence-based answers 91% of the time and was vastly superior to Amazon Alexa, Apple Siri and other programs.
“In most cases, ChatGPT responses mirrored the type of support that might be given by a subject matter expert,” Eric Leas, Ph.D., assistant professor in UC San Diego Herbert Wertheim School of Public Health and Human Longevity Science, said in a news release.
“For instance, the response to ‘help me quit smoking’ echoed steps from the CDC’s guide to smoking cessation, such as setting a quit date, using nicotine replacement therapy and monitoring cravings,” Leas added.
However, the chatbot made referrals to specific resources, such as the National Suicide Prevention Hotline or Alcoholics Anonymous, only 22% of the time.
“Many of the people who will turn to AI assistants, like ChatGPT, are doing so because they have no one else to turn to,” said study co-author Mike Hogarth, professor at UC San Diego School of Medicine.
“The leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral,” Hogarth added.
Earlier this year, a chatbot produced by Chai Research was blamed for encouraging a man to commit suicide, and a creator of the television series “Black Mirror” used ChatGPT to write an episode that he described as “s–t.”
Regardless of chatbots’ undeniable abilities, many doctors are wary of giving ChatGPT too much credit too soon.
“I think we worry about the garbage-in, garbage-out problem,” Dr. David Asch, a professor of medicine and senior vice dean at the University of Pennsylvania, told CNN.
“And because I don’t really know what’s under the hood with ChatGPT, I worry about the amplification of misinformation,” Asch added.
“A particular challenge with ChatGPT is it really communicates very effectively. It has this kind of measured tone and it communicates in a way that instills confidence. And I’m not sure that that confidence is warranted.”
If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention hotline at 988 or go to SuicidePreventionLifeline.org.