AI vendors need to lift regulation game

Artificial intelligence is so new to the regulatory space that vendors are making approval of their own products harder than it needs to be.

Artificial intelligence vendors are shooting themselves in the foot by not paying closer attention to ticking the regulatory boxes come approval time.

Tracey Duffy, First Assistant Secretary for Medical Devices and Product Quality at the TGA, told a room full of software vendors earlier this week how they could make their approval process shorter and smoother.

“Many developers who have AI as part of their device are new to regulation,” said Ms Duffy, speaking at a Medical Software Industry Association event in Brisbane.

“So we’re seeing a range of different developers pop out of lots of different landscapes, both internationally and here in Australia and who are wanting to bring their product to market.

“So, their understanding of the regulatory framework, which is quite complex and convoluted, [is limited], and they find it difficult and they struggle.”

The TGA regulates AI if it is intended to perform a therapeutic or medical purpose, including diagnosis, monitoring, prevention, treatment, and alleviation of disease, injury or disability.

“The AI in a product is not always visible in the design,” said Ms Duffy.

That leads to barriers to approval, including the need to engage extensively with the supplier to elicit details of “all architectural components”.

Other barriers include training or testing data that does not reflect the intended use-case population, or that comes from too small a group to be valid. The AI model is also not always based on strong clinical evidence.

“So for those companies who are wishing to seek approval, it’s really important that you present that information upfront. [That reduces] the amount of times you have to go backwards or forward and have ongoing conversations about the back end or the other blackbox components that support your software,” said Ms Duffy.

“What we often see is that vendors are not providing basic software lifecycle artefacts, the evidence is often provided for other software components but not for AI, the relevant clinical studies are using very small datasets, and not all the data is provided.

“These problems pose safety concerns, but they also slow down the processing of applications.”

Transparency was critical, she said.

“It should be clear what happens to patient data and how the product complies with Australian privacy and consumer law,” said Ms Duffy.
