A survey led by researchers at Uppsala University in Sweden reveals that a significant proportion of UK general practitioners (GPs) are integrating generative AI tools, such as ChatGPT, into their clinical workflows. The results highlight the rapidly growing role of artificial intelligence in healthcare – a development that has the potential to revolutionise patient care but also raises significant ethical and safety concerns.
"While there is much talk about the hype of AI, our study suggests that the use of AI in healthcare is not just on the horizon – it’s happening now. Doctors are deriving value from these tools. The medical community must act swiftly to address the ethical and practical challenges for patients that generative AI brings," says lead researcher Dr Charlotte Blease, Associate Professor at Uppsala University.
The study reveals that 20 per cent of GPs reported using generative AI tools in their practice, with ChatGPT being the most frequently used. Conducted with collaborators at Harvard Medical School in Boston, USA, and the University of Basel, Switzerland, it is the most comprehensive examination of generative AI in clinical practice since the launch of ChatGPT in November 2022.
The study was conducted in February 2024 as part of a monthly omnibus survey and was designed to include GPs from across different regions of the UK. Researchers surveyed 1,006 GPs registered with Doctors.net.uk, the largest professional network for UK doctors.
The aim of the study was to measure the adoption of AI-powered chatbots by GPs across the UK and to understand how these tools are being used in clinical settings. With the advent of large language models (LLMs), there has been substantial interest in their potential to support medical professionals in tasks ranging from documentation to differential diagnosis.
Beyond revealing that 20 per cent of GPs used generative AI tools in their practice, the study also shows how those tools were applied: among users, 29 per cent used them to generate documentation after patient appointments, while 28 per cent used them to assist with differential diagnosis.
These findings suggest that AI chatbots are becoming valuable assets in medical practice, particularly in reducing administrative burdens and supporting clinical decision-making. However, the use of these tools is not without risks. The potential for AI to introduce errors ("hallucinations"), exacerbate biases, and compromise patient privacy is significant. As these tools continue to evolve, there is an urgent need for the healthcare industry to establish robust guidelines and training programmes to ensure their safe and effective use.
“This study underscores the growing reliance on AI tools by UK GPs, despite the lack of formal training and guidance and the potential risks involved. As the healthcare sector and regulatory authorities continue to grapple with these challenges, the need to train doctors to be 21st century physicians is more pressing than ever,” Blease concludes.
Journal
BMJ Health & Care Informatics
Method of Research
Survey
Subject of Research
People
Article Title
Generative Artificial Intelligence in Primary Care: An online survey of UK General Practitioners
Article Publication Date
13-Sep-2024
COI Statement
C. Blease, C. Locher, J. Gaab, M. Hägglund, K. D. Mandl. Generative Artificial Intelligence in Primary Care: An online survey of UK General Practitioners. BMJ Health & Care Informatics. August 27, 2024. https://doi.org/10.1136/bmjhci-2024-101102