Who knows how DoHDA is using AI? Not the RACGP



The RACGP says it knows DoHDA is using artificial intelligence for compliance and fraud detection, but it doesn’t know what that use looks like.


The Royal Australian College of GPs says it “does not have visibility” over how the Department of Health, Disability and Ageing is using AI for compliance and fears another Robodebt scenario.

The Australian National Audit Office is currently investigating DoHDA’s use of AI to manage health provider non-compliance.

More specifically, it is examining whether DoHDA has appropriate governance supporting its adoption of AI, whether it is effectively monitoring and reporting the impact of its use of AI, and whether it has fit-for-purpose assurance arrangements over the adoption of AI by health providers.

The department’s own transparency statement on AI says it uses generative and narrow models to analyse data, automate activities, identify patterns and “support decision making by helping staff summarise, analyse or synthesise information” in certain areas.

One of these areas is compliance and fraud detection.

The statement explicitly says the department does not use AI to automate decisions and that human officials remain accountable for the advice and recommendations they provide.

“The concern is the lack of clarity around the AI tools [used by the department] and what settings it will have and how they will apply,” RACGP digital health and innovation chair Dr Sean Stevens told The Medical Republic.

“We’ve seen with Robodebt, but also with things like the rolling out in some states of AI-initiated seatbelt fines, that [the AI] doesn’t always have the full context.

“And if all you rely on is the AI, then there’s going to be a lot of innocent people caught up.”

“… The vast majority of GPs want to do the right thing and believe they’re doing the right thing – so to get a letter from the department, you know, even if it’s just a ‘please explain what’s going on’ letter, it creates a lot of work.”

In its submission to the Australian National Audit Office, which was published this week, the RACGP noted that DoHDA does not actually outline the specific ways in which it is using AI for compliance.

“As this technology becomes more widely adopted in the coming months and years, we support transparent reporting arrangements to better understand its use in the compliance context,” the college said.

“This is particularly important given DoHDA does not provide a breakdown of the percentiles for MBS item number usage (eg the number of services that would place a provider in the top percentile), meaning there is already a degree of obscurity around the process.

“The RACGP is keen to avoid a situation where providers are unnecessarily targeted because they have been identified as being potentially non-compliant by AI tools.”

The use of AI tools to monitor compliance should be avoided wherever possible, the college said, citing the risks associated with AI “hallucinations”, the lack of rigour in assessing the safety of AI and a general lack of trust in the technology.

“With regard to Medicare compliance, the use of AI without clear understanding and sharing of information such as model training risks worsening transparency and belief in DoHDA’s assessment and decision-making processes,” the RACGP submission reads.

“Retaining human oversight of Medicare claiming is essential to avoid the tragic consequences of another Robodebt scheme.”

While AI can identify billing patterns, it does not have insight into the drivers of those patterns.

“For example, an ageing population with an increased chronic disease burden could result in practitioners billing more longer consultations than they have historically,” the submission reads.

“DoHDA ought to be guided by the medical profession to understand the meaning behind the statistics.”
