Two local GPs have developed an OpenAI-based app that aims to significantly simplify note-taking and referrals.
While the federal government scrambles to establish some road rules around the use of generative AI in Australia, local GPs Dr Chris Irwin and Dr Umair Masood – of GP lobby group Australian Society for General Practice – have ploughed ahead with the development and launch of a consultation app.
Consultnote.ai uses a localised instance of OpenAI's large language model, fed with Australian-specific expert medical data, to simplify documenting consultations, writing referral letters and even building care plans.
The app records a consultation, turns speech to text and organises the text into relevant medical notes automatically.
According to the developers, the system can automatically generate detailed and easy-to-understand reports post-consultation including referral letters, differential diagnosis and advice on relevant investigations and preventative health measures.
The developers maintain that, on a conservative estimate, the app cuts at least three minutes of admin from an average consult.
“By providing doctors with a comprehensive summary and potential insight into medical history, underlying conditions, treatment options and anticipating patient concerns, ConsultNote.ai fosters a more interactive and engaging consultation process,” Dr Irwin told The Medical Republic.
“We record a transcript (patient and doctor), we then use AI to interact with this transcript to provide all necessary information,” he said.
The process is as follows:
- A doctor records a consult with a patient; the recording is transcribed to text and fed to an AI model which, having been trained on relevant expert data, deletes irrelevant material and rearranges the consult text into a cohesive set of consulting notes.
- The data generated is temporarily exchanged with AI servers (in this case overseas servers, but within the bounds of Australian privacy and data legislation, according to the developers' Ts and Cs), which return the data to the individual doctor’s device or server once the AI has been applied. The original data does temporarily leave the environment of the doctor’s devices, but it is not retained or stored by OpenAI or Consultnote.ai once it has been interpreted into the final notes form. The instance is also largely insulated from the sort of problems being experienced with public large language models like ChatGPT, because it is trained only on expert-selected and vetted content, as opposed to the wild west of information ChatGPT trains on over the internet.
- If a doctor wants to then write a referral letter they add some simple instructions and the AI generates a letter from the consulting notes.
- If a doctor wants to apply the notes to a care plan, a similar process is followed.
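For readers curious about the mechanics, the steps above can be sketched in code. This is a minimal illustration, assuming the developers call OpenAI's chat completions API; the model name, prompts and function names here are illustrative, not Consultnote.ai's actual implementation.

```python
# Illustrative sketch of the transcript-to-notes pipeline described above.
# All prompts, function names and the model choice are assumptions for
# demonstration only.

SYSTEM_PROMPT = (
    "You are a clinical documentation assistant for Australian GPs. "
    "Remove small talk and irrelevant material from the transcript and "
    "rearrange the rest into a cohesive set of consulting notes."
)

def build_notes_request(transcript: str, model: str = "gpt-4") -> dict:
    """Package a consult transcript into a chat-completion request."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": transcript},
        ],
    }

def build_referral_request(notes: str, instructions: str,
                           model: str = "gpt-4") -> dict:
    """Turn finished notes plus a doctor's brief instructions into a
    referral-letter request, mirroring the 'simple instructions' step."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Write a specialist referral letter from these consulting notes."},
            {"role": "user",
             "content": f"Notes:\n{notes}\n\nInstructions: {instructions}"},
        ],
    }

# Sending a request would look roughly like:
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(**build_notes_request(transcript))
#   notes = resp.choices[0].message.content
# Per the developers' description, the transcript leaves the device only
# for this round trip and is not stored once the notes come back.
```

The key design point, as the developers describe it, is that only this request/response exchange crosses to the external servers; the notes themselves live on the doctor's own device or system.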
The app does poll external OpenAI servers to run the consult text through a trained OpenAI instance which sits overseas, but the data only temporarily resides outside the doctor’s device or system, according to Dr Irwin. It is not retained by OpenAI other than to work its AI magic on the consulting text and send it back to the doctor, and it is never sent to Consultnote.ai.
He says this means the worries associated with some new generative AI large language models such as ChatGPT don’t come into play, because the app is a closed ecosystem.
None of the data is used externally to train the app AI either. All the trained AI in the app has been developed and is constantly being iterated for improvement by Dr Irwin and his colleagues by feeding the OpenAI instance certain sets of proprietary expert information and instructions that they deem appropriate.
So far the app has been tested – successfully, according to Dr Irwin – with around 200 GPs.
To get a feel for how the app works and see its ease of use you can watch a series of demonstration consultations prepared by the developers.
In a live demonstration with an actual patient, done for The Age newspaper over the weekend, the app quickly compiles medical notes, omitting irrelevant material such as a quick chat about the Ashes test, and with a few subsequent instructions writes a referral letter for the patient automatically.
As well as the notes and letter, the AI makes a few suggestions in terms of longer-term care and diagnosis for the doctor to think about.
Dr Masood told The Age that while he knew his patient’s diagnosis, “the technology gave me some other things to think about”.
The developers have sought legal advice and maintain that the app is exempt from new TGA regulations controlling “software as a medical device”.
According to the site’s T&Cs, because the app “is only intended for the purpose of providing or supporting a recommendation to a health professional about prevention, diagnosis, curing or alleviating a disease, ailment, defect or injury; and it is not intended to replace the clinical judgement of a healthcare professional to make clinical diagnosis or treatment decisions regarding an individual patient”, it is considered an exempt medical device.
Notwithstanding the developers’ assurances that Consultnote.ai addresses the key concerns being raised about new generative AI technologies, the app is launching into an effectively unregulated generative AI environment, where debate between detractors and supporters of the technology is already well under way.
Professor Farah Magrabi of the Australian Institute of Health Innovation, at Macquarie University – one of the leading academic institutes researching medical AI in Australia – is largely supportive of such new technology, but only when very particular conditions of use and user training have been met.
Speaking generally and not about Consultnote.ai in particular, she told TMR that medical AI had the potential to vastly increase efficiency and accuracy in delivering healthcare outcomes in Australia, including significantly increasing clinician efficiency, but that it was vital that doctors and any other parts of the medical workforce using AI were educated in how to use the technology safely and ethically.
She said it was important that doctors use AI for assistance only, and that any documentation or conclusions generated by the technology be checked and signed off by a relevantly trained human.