GPs warned using AI to record patient notes can lead to dangerous inaccuracies

GPs have been warned to look out for ‘inaccurate or fabricated’ information when using AI to write their medical notes.

Family doctors are increasingly using tools that listen to their consultations with patients and automatically add summaries to their records.

But the Royal College of GPs has warned that AI can misinterpret the nuance of conversations, with potentially dangerous consequences.

The Medicines and Healthcare products Regulatory Agency (MHRA) also says there is a ‘risk of hallucination which users should be aware of, and manufacturers should actively seek to minimise and mitigate the potential harms of their occurrence’.

The safety watchdog is now urging GPs to report issues with AI scribes through its Yellow Card Scheme, which is typically used to report adverse reactions to medicines.

This should include ‘suspected inaccuracies’, trade publication GP Online reports.

The British Medical Association’s GP Committee said earlier this year that ‘the adoption of passive scribes in general practice has gathered significant pace’, with practices using standalone systems or tools rolled out with other common software.

Dr Phil Whitaker, a UK GP who recently moved to Canada, wrote in the New Statesman that an AI tool he used was ‘not to be trusted’.


He said it misinterpreted conversations with patients who asked him about his move from the UK – and recorded notes suggesting patients had recently moved to Canada instead.

He added: ‘I’ve caught it recording findings of examinations I haven’t performed and detailing advice I haven’t given.

‘The company that makes it advises users to check its output carefully.

‘For me, the time spent reading and editing outweighs any productivity gains.’

And an article published by Fortune last month outlined a case in which ‘a patient in London was mistakenly invited to a diabetic screening after an AI-generated medical record falsely claimed he had diabetes and suspected heart disease’.

However, despite this growing use of AI and the recognition of potential problems, the MHRA said a search of its database revealed ‘no adverse incident reports related to the use of AI scribes’.

The government’s 10-Year Health Plan says it intends to ‘accelerate the adoption and spread of AI technology, such as AI scribes, by streamlining AI regulation’.

A new national procurement platform will be set up next year to support GP practices and NHS trusts to adopt new technology safely.

Professor Kamila Hawthorne, chair of the RCGP, said: ‘AI has enormous potential for transforming the future of our health and patient care.

SICK BRITONS ‘UNCOMFORTABLE’ DIAGNOSING THEIR OWN ILLNESS WITH NEW ‘AI GP’ IN NHS APP 

Fewer than one in three Britons are comfortable with the prospect of using new AI features in the NHS App to diagnose their issues, a poll reveals.

Health secretary Wes Streeting announced plans to revamp the app as part of Labour’s 10-Year Health Plan so every patient could have a ‘doctor in their pocket’.

But a new survey found 44 per cent of the public are ‘uncomfortable’ with trusting the diagnosis and management of their conditions to artificial intelligence, with this figure rising to 60 per cent among pensioners.


Only 31 per cent of the 2,030 respondents to the Savanta poll, for the Liberal Democrats, said they are ‘comfortable’ with the idea.

Helen Morgan, the Liberal Democrats’ health spokesperson, praised Labour for tackling bureaucracy but added: ‘Making the NHS more efficient is of course welcome but it cannot come at the cost of leaving people behind as they try to grapple with digitised services rather than a real-life doctor.

‘Ministers need to allay these fears by offering support to those who are not digitally literate and older people to ensure that these sweeping changes benefit everyone.’

Speaking at the Plan’s launch last month, Mr Streeting said: ‘The NHS App will become a doctor in your pocket, bringing our health service into the 21st century.’

It will use patients’ medical records and artificial intelligence to provide instant answers to users’ questions and direct them to the best place for care.

Dennis Reed, director of Silver Voices, which campaigns for elderly Britons, said at the time: ‘Elderly people will be sceptical about whether the plan will be delivered and concerned that greater reliance on the app could exclude them from accessing timely care.

‘For some, the doctor in their pocket will be padlocked.’


‘However, its use is not without risks and so its implementation in general practice must be closely regulated to guarantee patient safety and the security of their data.

‘GPs are always open to introducing new technologies that can improve the experience of patients and help cut the administrative burden, and an increasing number of GP practices are now using AI scribing tools to improve the quality and efficiency of their consultations.

‘While these tools can offer real benefits, particularly at a time of significant GP workforce pressures, there are some important concerns – particularly around data security of sensitive patient records, data controllership and the risk of inaccuracies.

‘We are aware that AI scribes can produce inaccurate or fabricated details, and that they can also misinterpret the nuance of conversations.

‘It is important that clinicians review AI-generated documentation for accuracy before adding it to the patient record.’

The MHRA said: ‘The MHRA is aware of this potential issue in AI enabled tools generally and this includes AI scribe tools.

‘We recommend that GPs and healthcare professionals only use tools which are registered medical devices which have been determined to meet the required standards of performance and safety.

‘Recently published MHRA guidance clarifies how these technologies qualify as medical devices and while this is specific to digital mental health, the principles apply across digital health applications. 


‘While not published by the MHRA, NHS England guidance encourages the use only of registered medical devices when used in a clinical context.

‘We strongly encourage that all suspected adverse incidents, including suspected inaccuracies, are reported to the MHRA via the Yellow Card Scheme.’

The watchdog said the yellow card scheme website had been updated to include ‘a standalone page for software and AI as medical device’.

Earlier this year, the BMA advised practices to pause use of AI scribes until they had carried out data protection and safety checks and sought assurances that the products meet NHS standards.
