GP practices ‘may still be liable’ for clinical negligence claims arising from the use of artificial intelligence (AI), NHS England has said in new guidance.
The commissioner has published guidance to assist practices adopting ‘ambient scribing products’ that feature generative AI, as health secretary Wes Streeting encouraged their use.
These products, also referred to as ‘ambient scribes’ or ‘AI scribes’, include advanced ambient voice technologies (AVTs) used for clinical or patient documentation and workflow support.
The guidance is intended for NHS settings, including GP practices, that are looking to implement a specific product, and is the first in a series of documents to be published over the next six months.
The Government said that the guidance will encourage AI use across the health service ‘while protecting patient data and privacy’.
It said that clinicians in GP surgeries are ‘forced’ to spend much of their consultations ‘recording information into a computer’ instead of ‘focusing on the patient in front of them’.
By converting a conversation with a patient into a clinical note, the ambient scribing product is ‘freeing up time for a range of staff including GPs’, the Government added.
The guidance said: ‘By integrating with existing EPRs and being adaptable to diverse environments, ambient scribing products support a wide range of use cases, from primary, community, and care home settings to specialised hospital settings, and can intelligently automate workflow, promoting scalability and interoperability.’
However, it said that AI-enabled products have ‘high potential for bias’ due to limitations in the data and processes used for training AI models, and that technical risks may include ‘output errors, system unavailability, integration failures or data loss’.
‘Clinical hazards may include missing critical information, incorrect information or context, or delayed outputs,’ it added.
GP practices should develop a safety case, hazard log and monitoring framework for the products they want to use, and are encouraged to seek help from their ICB to do this.
They should also provide ‘appropriate training’ to staff, including guidance on dictation style, voice and choice of words, and reinforce practitioners’ ‘ongoing responsibility’ to review and revise the outputs.
On liability risks, the guidance added: ‘NHS organisations may still be liable for any claims arising out of the use of AI products particularly if it concerns a non-delegable duty of care between the practitioner and the patient.
‘This financial exposure can be mitigated by clear and comprehensive contracting arrangements with suppliers setting out their roles, responsibilities and liability appropriately.’
It added that liability for any claims associated with the use of AI-enabled products in NHS settings ‘remains complex and largely uncharted’, with limited case law to provide clarity.
It added: ‘As such, engage your legal teams before procuring ambient scribing and AVT products, particularly to assess specific functionalities, intended use, and levels of human oversight.
‘Legal consultations can help mitigate liability risks by addressing these factors as part of comprehensive risk management.
‘Within NHS organisations, liability (particularly liability for clinical negligence claims) can ultimately lead to a non-delegable duty on the part of the Trust or primary care provider if a specific liable party cannot be established.’
When promoting the use of AI in the NHS, the Government cited the findings of an AVT trial funded by NHS England across London, led by Great Ormond Street Hospital for Children, which evaluated AVT capabilities across settings including primary care.
It said that the evaluation, which involved over 7,000 patients, demonstrated ‘widespread benefits’ including an ‘increase in direct care’. Pulse has asked the Department of Health and Social Care for more details of the trial’s findings relating to primary care and general practice.
Health secretary Wes Streeting said: ‘AI is the catalyst that will revolutionise healthcare and drive efficiencies across the NHS, as we deliver our Plan for Change and shift care from analogue to digital.
‘I am determined we embrace this kind of technology, so clinicians don’t have to spend so much time pushing pens and can focus on their patients.
‘This Government made the difficult but necessary decision at the Budget to put a record £26bn into our NHS and social care including cash to roll out more pioneering tech.’
NHS England director of transformation Dr Vin Diwakar said: ‘This exciting technology can reduce the burden of administration, allowing patients more quality time with their clinician, and our new guidance shows the NHS’s ability to rapidly and safely harness the very latest innovations to transform healthcare and bring benefits for our hardworking staff and our patients.’
Earlier this year, GP practices in one area were warned against using AI without seeking approval from their ICB first.
Recently NHS England enlisted the services of an AI company to help identify patients at high risk of attending A&E or of being referred to hospital by their GP.
However, GP leaders have voiced their concerns regarding the developing use of AI in general practice, with LMCs voting in favour of a motion last year which said that ‘only a doctor with full training and appropriate levels of experience will be effective to challenge an AI’.
And a medical defence organisation warned GPs not to use AI for complaint responses due to the risk of ‘inaccuracy’ or ‘insincerity’.