Are we following the NHS? The TGA believes developers are dodging ‘accountability, transparency and responsibility’. It also has its eye on digital mental health tools.
The TGA has flagged that it wishes to conduct a review of AI scribes in healthcare settings to determine whether they are medical devices and, if so, whether they comply with existing regulatory requirements.
In its report Clarifying and strengthening the regulation of Medical Device Software including Artificial Intelligence, released without fanfare this month and only surfacing on LinkedIn in the past few days, the TGA had some strong words for the developers of AI scribes.
“Developers of digital scribes claim they are not a medical device as their intended purpose is to summarise clinical practice notes,” the TGA said.
“Users report digital scribes frequently propose diagnosis or treatment options for patients beyond the stated diagnosis or treatment a clinician has identified during consultations.
“This functionality indicates digital scribes meet the definition of a medical device and require pre-market approval in the Australian Register of Therapeutic Goods and are potentially being supplied in breach of the Act.”
Earlier this month the NHS in the UK issued a national priority notification which effectively classifies AI scribes as medical devices. Now it seems the TGA is heading down a similar path.
The TGA regulates software and AI models and systems when they meet the definition of a medical device under the Therapeutic Goods Act 1989. In 2021 the TGA clarified the classification levels of software to “account for potential and emerging risks of harm”.
At the same time, it introduced a number of “carve-outs” for very low risk products or products that had oversight from other regulators.
This latest report came from a 2024 review of the current legislation, regulations and guidance to see if they were appropriate to meet “the challenges associated with an increasing use of medical software and AI across the healthcare sector”.
In general, the report found the existing legislative framework to be "largely appropriate", although the TGA said it would continue to engage with regulators in other jurisdictions and relevant stakeholders.
Finding 4 of the report made it clear that the TGA was not entirely happy with the way developers were approaching the regulation of AI scribes.
“A review of digital scribes is needed to determine whether they are medical devices and, if so, whether they comply with existing regulatory requirements,” it said.
“The TGA has observed, and stakeholders report, an unwillingness from some developers to provide the accountability, transparency and responsibility necessary for engagement with existing regulation.
“Developers have expressed views that healthcare providers should take responsibility for validating and verifying the outputs of deployed systems they choose to use, while simultaneously limiting access to information about the datasets used to train their product or to test the model used to operate it.”
The TGA acknowledged that the time and costs associated with regulatory requirements were “disproportionate” when compared to the time and costs associated with the development of a software product.
“A further cultural issue is the pervading belief among some developers that software products don’t present a meaningful risk to consumers and users, particularly when they are integrated with the provision of healthcare, where a human is in the loop, or where outputs are information only,” said the TGA report.
“Stakeholders, including clinicians and consumers who use these kinds of products, have identified that the absence of humans, lack of transparency and failure to engage with existing regulatory requirements represent a combination of circumstances that may lead to patient harm.
“In many instances, users are not aware that AI or machine learning has been used in the development of software, or is used operationally within the clinical workflow.”
The TGA also targeted digital mental health tools, with finding 8 of the report saying:
“The digital mental health tools exclusion is no longer appropriate and urgent review is needed, in collaboration with the Australian Commission on Safety and Quality in Health Care.”