One in five family doctors is relying on AI to diagnose patients despite the risk of errors, data has shown.
A new study has revealed that 20% of GPs use ChatGPT, Bing AI, Google’s Bard or another AI system to help them with clinical practice, such as diagnosing patients and writing notes.
With no official guidance published on how to use AI programmes, experts are urging GPs to reconsider using them, as these systems could misdiagnose patients.
In addition, patient data could be at risk of being compromised, according to senior health officials.
During the study, the team of researchers surveyed thousands of GPs on how often they use AI programmes.
One in five GPs used generative AI tools in clinical practice, and 29% of family doctors used these tools to create patient medical notes following appointments, the survey reported.
According to the survey results, a quarter of respondents had used AI programmes to suggest treatment options for patients.
The authors said: “While AI can be useful for assisting with documentation, it is prone to creating erroneous information.
“We caution that these tools have limitations since they can embed subtle errors and biases. They may also risk harm and undermine patient privacy since it is not clear how the internet companies behind generative AI use the information they gather.”
They added: “While chatbots are increasingly the target of regulatory efforts, it remains unclear how the legislation will intersect in a practical way with these tools in clinical practice.
“Doctors and medical trainees need to be fully informed about the pros and cons of AI, especially because of the inherent risks it poses.”
Professor Kamila Hawthorne, Chair of the Royal College of GPs, said: “Technology will always need to work alongside and complement the work of doctors and other healthcare professionals, and it can never be seen as a replacement for the expertise of a qualified medical professional.
“Clearly there is potential for the use of generative AI in general practice but it’s vital that it is carefully implemented and closely regulated in the interest of patient safety.”
The findings have been published in the BMJ.