
ChatGPT gives ‘more empathetic’ responses to patient queries than doctors, new study suggests

ChatGPT may produce more empathetic responses than doctors to questions from patients, a new study has claimed.

A team of healthcare professionals read 195 responses to patient questions from social media forum Reddit and rated the answers from the artificial intelligence chatbot more highly than those provided by doctors.

The study’s authors from the University of California concluded in JAMA Internal Medicine that further research should be done on whether AI assistants can help doctors in workflow and drafting responses, which may help reduce clinician burnout.

The healthcare professionals comparing the responses – without being told which was real and which generated by ChatGPT – were asked to look at both the quality of information and bedside manner.

They preferred the ChatGPT responses in 79% of cases and overall rated them as of significantly higher quality.

The prevalence of chatbot responses graded as ‘empathetic’ or ‘very empathetic’ was almost 10 times higher than that of the doctor-written responses, the researchers reported.

The authors said: ‘While this cross-sectional study has demonstrated promising results in the use of AI assistants for patient questions, it is crucial to note that further research is necessary before any definitive conclusions can be made regarding their potential effect in clinical settings.’

‘Studying the addition of AI assistants to patient messaging workflows holds promise with the potential to improve both clinician and patient outcomes,’ they added.

Responses from ChatGPT were on average four times longer, which suggests that in practice those evaluating them would have been able to guess which was which, experts said.

It was also not a level comparison, as the clinicians were responding on a public forum, which may have affected how empathetic their responses were, pointed out Professor James Davenport, professor of information technology at the University of Bath.

But he said it does raise legitimate questions about whether and how ChatGPT could ‘assist physicians in response generation’.

Professor Mirella Lapata, professor of natural language processing at the University of Edinburgh, said it was not surprising that healthcare professionals preferred ChatGPT to physician responses.

She said: ‘ChatGPT is more empathetic and overall chattier. Without controlling for the length of the response, we cannot know for sure whether the raters judged for style (eg, verbose and flowery discourse) rather than content.’

Professor Anthony Cohn, professor of automated reasoning at the University of Leeds, said: ‘Given ChatGPT’s well known abilities to “write in the style of”, it is not surprising that a chatbot is able to write text that is generally regarded as empathetic.’

He added that the authors are careful to note that a chatbot should only be used as a tool to draft a response to a patient query; given the propensity of this type of technology to invent ‘facts’ and hallucinate, it would be dangerous to rely on any factual information it gives.

‘It is essential that any responses are carefully checked by a medical professional. However, humans have been shown to overly trust machine responses, particularly when they are often right, and a human may not always be sufficiently vigilant to properly check a chatbot’s response; this would need guarding against.’



READERS' COMMENTS [7]

Please note, only GPs are permitted to add comments to articles

Mr Marvellous 4 May, 2023 10:04 am

A reason why an AI might be more likely to give empathetic answers than a human GP is that human GPs often face time and resource pressures that can limit their ability to provide empathetic care. Human GPs have a limited amount of time to spend with each patient, and they often have to deal with a large number of patients with complex medical issues.

Under such conditions, human GPs may find it challenging to provide personalized care that fully addresses a patient’s emotional needs. They may also have to make difficult decisions about allocating their limited resources, which could mean that some patients do not receive the level of care and attention they need.


And yes, the answer above is written by ChatGPT.

Turn out The Lights 4 May, 2023 10:19 am

Good, they’re going to need it. Can it prescribe pragmatically as well?

Steve McOne 4 May, 2023 11:44 am

Great! All my chatty and personality disorder patients, here’s your Messiah.

David Church 4 May, 2023 12:38 pm

So, AI is better at ‘spin’ than GPs, but who is better at stopping in the middle of an answer to provide CPR?

Perhaps, if we have our Receptionist ask the first question: ‘Do you have a smartphone or computer?’ If the answer is no, there will be plenty of free appointments to slot them into, because if the answer is ‘yes’, the Receptionist can direct them to the AI-bot instead of a GP! Sorted, Yay!

Carpe Vinum 4 May, 2023 4:17 pm

There is no reason why AI shouldn’t turn out to be an excellent counselling tool for low-level mental health where CBT and Mindfulness approaches predominate. CBT/Mindfulness/DBT are predominantly rules-based interventions guiding someone to look at their automatic responses in a different light, for which AI should be perfectly adept.
In addition, AI doesn’t get burnt out, disenfranchised and exhausted…
Plus, most low-level mental health is NOT a medical problem and should never be seen by GPs in the first place, so hopefully a win-win scenario!

Jonathan Heatley 5 May, 2023 8:44 am

I can see ChatGPT and its medical progeny taking over a lot of our work. If patients are able to ask it online what their symptoms mean and which treatments are best, it will provide vastly more detailed answers than we possibly could. Our only advantage would be to examine a patient and, for example, look in their ear.
AI could change the primary care landscape hugely. Imagine, for instance, that secondary care decides AI-diagnosed cancer is more accurate than us; they may then start accepting referrals directly.
Perhaps there won’t be a shortage of GPs in five years’ time? Perhaps public demand will fall dramatically?

Keith Greenish 5 May, 2023 11:43 am

And will ChatGPT reassure patients that they do not require numerous blood tests and scans, or will they be left to pester us?