As chatbot doctors and AI scribes gain traction, the RACGP is calling for guardrails and for real GPs to stay in the driver’s seat.
Artificial intelligence is muscling into general practice, promising to ease paperwork and speed up patient care – but the RACGP has hit the brakes, warning that unchecked AI risks could do more harm than good.
AI-powered tools have been broadly hailed as the future of healthcare, through solutions like automated admin, scribes and even chatting with patients. In firm guidance on conversational AI released this week, the RACGP made its views clear.
“There is no doubt that conversational AI could revolutionise parts of healthcare delivery. GPs should be extremely careful, however, in using conversational AI in their practice at this time. Many questions remain about patient safety, patient privacy, data security, and impacts for clinical outcomes,” the guidance states.
“The ethical and privacy issues inherent to using sensitive health data, and the regulation of medical products, have meant that the use of AI is not as widespread in medicine as it is in other domains. There are still many clinical, privacy, and workflow issues to be resolved before conversational AI can be used safely and to its fullest potential in clinical settings.”
The RACGP has a separate resource on AI scribes, which convert a conversation with a patient into clinical notes.
Conversational AI is a different type of technology, one that enables machines to understand, process, and respond to human language in a way that feels natural and human-like.
It combines natural language processing (NLP), machine learning, and speech recognition to allow computers to engage in real-time conversations through text or voice.
This technology powers virtual assistants as well as chatbots used in customer service, healthcare, and other industries. The goal is to create smooth, intuitive interactions where users can communicate with machines just as they would with another person.
Dr Rob Hosking, chair of the RACGP’s Practice and Technology Management Expert Committee, told The Medical Republic that while technology offered potential benefits for general practice, it was important for GPs to think about the potential risks and disadvantages.
“The message we’re giving to our members is you really have to check it,” he said.
“It [AI] may help, but it also does make mistakes, and you have to be careful to be aware of those mistakes. That applies to the scribes, and it’s the same when using them for diagnoses, therapeutic investigations and interventions. You’ve really got to be careful.”
Dr Hosking said GPs should involve patients in the decision to use AI and obtain informed consent when using patient-facing tools.
Before conversational AI is brought into practice workflows, the RACGP guidance recommends that GPs be trained on how to use it safely, including knowledge of the risks and limitations of the tool, and of how and where data is stored.
The guidance states, “it is also worth considering that conversational AI tools designed specifically by, and for use by, medical practitioners are likely to provide more accurate and reliable information than that of general, open-use tools”.
“These tools should be TGA-registered as medical devices if they make diagnostic or treatment recommendations,” it said.
Last month the UK’s NHS issued a national priority notification which effectively classifies all AI scribes as medical devices.
The notice went on to specify that any AVT solutions that generate summaries require MHRA Class 1 medical device status, a process involving detailed governance and compliance requirements; the Australian equivalent would be formal classification as a medical device by the TGA.
In Australia, the same sort of AI scribes captured by the NHS decision in the UK are currently largely exempt from the TGA’s medical device regulatory process, being considered “simple record keeping tools” rather than devices. The caveat is that users of the tools must always read over summaries after the fact, and that liability for any issues rests with the supervising clinician.
Dr Hosking told TMR GPs needed to be aware of the technology they were using.
“The current situation with the TGA is the registration of software as a medical device and at the moment AI scribes are not in that group, because they don’t make diagnoses or recommendations for treatment,” he said.
“As soon as you start to use a tool that does those things, then it may need to be registered with the TGA as a medical device, because it’s providing interpretation and diagnosis and then also suggested investigations and treatments.”
Dr Janice Tan, deputy chair of the RACGP’s Digital Health and Innovation specific interest group and a member of its Expert Committee – Practice Technology and Management, said there was growing interest in AI use in general practice, and that GPs were conscious of the need for appropriate safety and privacy for themselves and their patients.
“Our new guidance gives GPs and their practice teams practical advice on how to incorporate new conversational AI tools into their practice safely,” she said.
“Those include patient consent and shared decision making, explaining and understanding the limitations of these systems, and knowing data is stored securely.
“For patients, these tools could improve their experience in a range of ways. It’s early days and this guidance is intended to provide principles that will be useful as technology continues to develop.”
Dr Tan said both a GP and patient might benefit from a list of differential diagnoses or supporting diagnoses, for example, to get a second opinion or to help a GP to ensure they’ve considered the range of possible conditions.
“Some tools help to prepare medical documentation, like a first draft of a referral that a GP can review and edit as appropriate. It could be as simple as translation or taking calls to organise basic tasks like making or changing an appointment, leaving complex issues to practice team members,” she said.
“There are a wide range of potential uses, and this guidance will help GPs to assess if future tools are suitable for their own practice and patient population. This is a complex area, and as with everything, patient safety is the most important consideration.”