AI is not the write stuff

Copperfield on the demise of his relationship with AI transcription tools in general practice

I recently had an epiphany. Someone showed me an AI transcription tool which records the consultation and, at the touch of a button, converts that chaos into clear and sensible written notes in the patient’s record.

Wowzers! This was like watching my brain download onto a computer: a gamechanger. No more laborious typing up! Less staring at a screen!! More time, more headspace!!! I immediately decided to turn revelation into revolution.

In fact, I was so stunned by this software wizardry that I agreed to undertake a Data Protection Impact Assessment, which is what you’re supposed to do rather than just plug it in. This involved long online meetings, endless spreadsheets, and phrases like ‘data subject’, ‘purpose drift detection’ and ‘default-deny architecture’ – which was all way less fun than it sounds. But it didn’t matter, because a bright new, cutting-edge, drudgery-free future beckoned.

So I’ve been using this consultation transcription tool for about three months now. And there certainly was a honeymoon period. It analysed and recorded consultations brilliantly. It gave me a choice of format and length. It would even generate referral letters automatically. OK, occasionally it hallucinated, but what’s that amongst friends?

I was besotted. Until, that is, I started having my first review appointments with patients for whom I’d used AI transcription in the previous consultation. Then something weird happened. The notes suddenly meant nothing to me. They recorded the facts, but I couldn’t recall the consultation at all. The patient might as well have seen a locum: I simply didn’t recognise my input in the record.

So transcription seems to do a terrible thing: it drains the notes of my sense of ownership. All those nuances, insights and coded messages I routinely incorporate are lost in translation. I’m getting the thousand words but I’m not getting the picture.

I’m sure this is fixable. Just as ChatGPT can be directed to adopt a chosen style, no doubt transcription software could do the same. In my case, for example: ‘Please adopt a highly cynical tone, with the starting assumption that there is nothing wrong with the patient, but leaving the door open to a brilliant and obscure diagnosis’. Are you listening, Big Tech? (Of course you are; your LLMs are gobbling this up quicker than I can type.)

In the meantime, I may well switch off the software. And for those of you wondering if there’s a metaphor buried in here somewhere, well, apparently there is. Because I asked ChatGPT, and it said that replacing the art and wisdom of traditional GP note-keeping with AI transcription is a bit like swapping frontline GPs for noctors. Two differences, I guess. One is that they’re not artificial, and the other you can work out for yourself.

Dr Tony Copperfield is a GP in Essex.

READERS' COMMENTS

David Church, 23 April 2025, 6:30pm

I am intrigued that on landing on the webpage for this, it takes only two seconds for Pulse to analyse me and determine that I am indeed human. It can do this even with a Post-it note covering the camera on the laptop.
Why, then, do I still need to log in? Can the ‘human-detection’ software not be tweaked to detect that I am a doctor, and indeed, that I am ME, rather than Damien the cockerel or Victoria the chicken sat on my desk tapping at the mouse-pad?