
One fifth of GPs already use AI in clinical practice, survey suggests

One fifth of GPs are using artificial intelligence (AI) in clinical practice, with ChatGPT the most popular tool, a new study has indicated.

Earlier this year, researchers surveyed just over 1,000 UK GPs, asking whether they had ever used 'large language model' AI chatbots during their clinical work.

Among the 20% of GPs who reported using tools such as ChatGPT or Bing AI, the most common purpose was to generate documents after patient appointments.

Other functions included using AI to ‘suggest a differential diagnosis’ or to suggest treatment options. 

The authors of the paper, published in BMJ Health and Care Informatics, included the caveat that this survey, conducted via the medical forum Doctors.net.uk, may not be representative of all UK GPs.

But they claimed the findings show that ‘GPs may derive value’ from AI tools and have ‘readily incorporated’ them into their clinical practice. 

They warned that generative AI has ‘limitations since they can embed subtle errors and biases’, and may also risk patient privacy. 

The survey found that of 160 GPs who use AI:

  • 29% use it to generate documentation;
  • 28% use it to suggest a differential diagnosis;
  • 25% use it to suggest treatment options;
  • 20% use it for patient summarisation from prior documents.

According to the authors, the ‘medical community’ must find ways to ‘educate physicians and trainees’ about the benefits and potential risks of AI.

They said: ‘These findings signal that GPs may derive value from these tools, particularly with administrative tasks and to support clinical reasoning. However, we caution that these tools have limitations since they can embed subtle errors and biases.

‘They may also risk harm and undermine patient privacy since it is not clear how the internet companies behind generative AI use the information they gather. 

‘While these chatbots are increasingly the target of regulatory efforts, it remains unclear how the legislation will intersect in a practical way with these tools in clinical practice.’

At the Pulse LIVE conference earlier this year, a digital health expert warned GPs using AI to be aware of liability risks and maintain their ability to ‘critically appraise’ the technology. 

Dr Annabelle Painter, who is also a GP registrar, said that doctors could be ‘the ones that hold the can’ when AI gets decisions wrong.

Local medical leaders have also voiced their concerns regarding the developing use of AI in general practice, with LMCs voting in favour of a motion which said that ‘only a doctor with full training and appropriate levels of experience will be effective to challenge an AI’.

This month, a medical defence organisation warned GPs not to use AI for complaint responses due to the risk of ‘inaccuracy’ or ‘insincerity’.

READERS' COMMENTS [8]

So the bird flew away 18 September, 2024 11:17 am

Rubbish.

So the bird flew away 18 September, 2024 11:29 am

AI is the latest growth area for the elite 1% capital accumulators. It’s never been necessary to my job. What has been necessary is properly funded analogue support and not being starved of capital, and income for staff. Patients still want “my GP” as the bedrock of quality.
Chasing “quantity” has been miserable.
Zen and the art of motorcycle maintenance..

Michael Mullineux 18 September, 2024 11:48 am

Spot on STBFA

Ghost of Victor Meldrew 18 September, 2024 2:27 pm

Turkeys voting for Christmas. What is the point of worrying about being replaced by less qualified non-doctor staff, then relying on non-human facilities? The government appear to believe that primary care is simple enough to be managed by algorithm, so will be happy for us to be replaced by computers. The most concerning aspect of this to me is that by relying on AI our colleagues are depriving themselves of the experience that distinguishes them from mere computers. Yes we are overworked, but the only way we will convince the world that we are irreplaceable is by continuing to practise experience-based medicine.

A B 18 September, 2024 3:58 pm

I’d suggest anyone interested enough to have an opinion on AI (its various incarnations and platforms like ChatGPT) actually take a look at it, what it can do, what tasks it can perform, before a knee-jerk ‘it’s bad’ reaction. If you use Google you are using AI. There’s AI and there’s AI. Personally I think Drs need to get their head around it all. It isn’t going away. Last time I looked, ignorance isn’t strength, and not advisable for someone working in medicine.

So the bird flew away 18 September, 2024 6:58 pm

AB, I agree with you that ignorance isn’t strength. Can I recommend Madhumita Murgia’s Code Dependent and also recent articles by professor Gary Marcus about the dangers of AI? But there’s also a political and economic stance. That AI is a new virtual territory which capital wants to open up for extracting value in order to accumulate more wealth. Do GPs really want to aid large language models to learn in order to replace GPs themselves? Turkeys indeed. There may be some superficial material benefit but at the expense of jobs and, even worse, the loss of the human approach. No wonder even most tech experts are cautious and are actively discussing these threats via published literature. So don’t be a turkey, nor a sheep – it takes strength of character to be neither – and hopefully there will still be GPs around for the years to come.

David Church 18 September, 2024 10:51 pm

So, that is ‘Clainaical Praictaice’ now then, is it ?
One H*** of an accaint, if you aisk mai !

A B 19 September, 2024 9:38 am

So to clarify, I’m not arguing for the use of AI in medical decision making. Nor am I looking to sound clever referencing obscure services in the hope of signalling an impressive breadth of personal experience. My point is simply the fact AI is not going away. It is already part of your life. ChatGPT is a tool that can do things like producing computer script in Python (a computer language) to perform a task for a Dr who doesn’t know how to write Python. It can produce a graph of a large spreadsheet of data in seconds, for someone (a Dr maybe) who doesn’t have a Microsoft certificate in advanced Excel. It can do much much more than that, none of which encroaches on medical decision making. Personally I don’t really use it, but knowing about it is useful. I’m afraid AI is bigger than medicine. It’s a societal issue and we need to remain educated. The only point I’m making.
