How did AI scribes slip through the TGA loophole?

4 minute read


The TGA has admitted it’s reviewing all 200 AI scribes in the market. A key question is whether the scribes replace a clinical function.


How did ambient scribes slip through a loophole to become unregulated? That was the question put to the TGA’s senior regulatory policy advisor Rebecca Bateson at the Informa AI in Health Regulation, Policy and Standards conference last week.

Although AI scribes describe themselves as transcribers, they also decide what is important to write down in clinical notes. “And that is replacing a clinical function,” the questioner pointed out.

Ms Bateson had previously clarified how the TGA determined what was within its remit – devices and software used for a therapeutic purpose, including diagnosis, prevention, monitoring, prediction, prognosis of a medical condition.

“A really good way to think about the definition of what is going to be a medical device and what is not is: is it replacing a clinical function?” Ms Bateson explained.

Whether AI scribes replace that clinical function sits at the core of the debate over why they’ve avoided regulation thus far.

This article originally ran on TMR’s sister site, Health Services Daily.

“The TGA is currently in the process of reviewing all those 200 scribes that are available in the market,” she confirmed.

“A lot of them have not been included in the Australian Register of Therapeutic Goods as a medical device, because what they say is that all they’re there to do is to transcribe or summarise a consultation between the health professional and the patient, record patient’s diagnosis and treatment as stated by the health professional, draft referrals, medical stickers and other administrative documents.

“If it starts to record a diagnosis or recommend a treatment that wasn’t mentioned during the consultation, or if it’s suggesting something that wasn’t discussed at all, that starts to be a medical device,” she explained.

She confirmed that’s what the TGA is looking at during the ambient scribe review.

“Scribes are specifically being advertised for use in clinical healthcare settings. They’re not being advertised as a discussion summary device, and there’s a reason for that,” she said.

She said she has a few concerns. One is the way AI scribes are used by clinicians.

“The thing I heard while I was doing this review was, ‘I looked at it really, really thoroughly the first day I used it, and it was so good that I stopped reviewing it thoroughly’.”

As a strategic policy person, she also has bigger concerns, including where that record ends up.

“Did you get your patient to review it before it went to My Health Record?” she asked.

She gave the example of a child whose father, and the father’s mother, both have a particular health condition. The scribe, however, recorded that condition as both paternal and maternal history.

“If that goes into My Health Record and years from now, we’re making a decision using AI about who gets the ventilator in ICU – maybe this person doesn’t, because they’ve got a paternal and a maternal history of this life-shortening condition.

“Those are the things that I worry about when it comes to the integrity of that data, not just that interaction on the day, but where that’s going in the future,” she said.

The TGA is also reviewing mental health tools.

“We did have a conditional exclusion, and it came out as one of those areas that needed to be immediately reviewed and refined,” Ms Bateson said.

Looking ahead, another focus for the TGA will be clarifying definitions in the Therapeutic Goods Act, “because people who are manufacturing software don’t identify as manufacturers, they identify as software engineers”, she explained.

The TGA is collaborating with other health regulators and Australian government departments on AI approaches.

“This is a big challenge for all of us, because what you’re seeing increasingly is tools being made available for people to make their own products, data sets being made available, governance of platform providers.”

There is also work to do on global challenges, including the validation of open-source data sets and large language models.

Ms Bateson said existing laws already apply to the use of generative AI in non-validated, unverified clinical applications; however, testing those laws against generative AI will take some time.

The TGA is also working together with international agencies.

“Because it’s moving so rapidly, we have to move rapidly too, in order to make sure that we are appropriately regulating two to three million individual products in a safe way,” she concluded.
