Healthcare leaders warn the rise of consumer AI is reshaping consultations, exposing data gaps, and forcing clinicians to adapt to an ‘undeclared third party’ in the room.
“Dr Google is dead,” Richard Taggart, CEO of eHealth NSW, told the audience at Australian Healthcare Week.
Instead of searching symptoms online, patients are now turning to ChatGPT and other generative AI tools and bringing the results into the consulting room.
Some arrive with neatly summarised chatbot advice. Others don’t mention it at all.
The question for the health system is no longer whether this is happening, but how to respond to it.
“How do we build upon that? How do we acknowledge it, and how do we make sure that the clinicians that you then interact with as you come into the health system are ready for it?” Mr Taggart asked.
Those questions surfaced repeatedly throughout the Healthcare 2040 Expo in Sydney this week, where leaders debated what consumer AI use means for safety, trust and the future of care.
The horse has bolted and it’s changing how care is accessed
Bettina McMahon, CEO of Healthdirect, told delegates that although there is plenty of debate about whether the models are getting the health advice right or wrong, that wasn’t actually the point.
“The horse has bolted; it’s being used. What’s the responsibility of organisations like ours, of leaders like ours and people in our network to look at?” she said.
The Australian Digital Health Agency’s chief clinical advisor (medicine), Amandeep Hansra, agreed that it was already being used. But what was more interesting, she said, was who is using it and why.
“This is Australian data: the people that use AI for healthcare queries are more likely to be of a non-English-speaking background. They are more likely to have not been born in Australia, and they’re more likely to have low levels of health literacy, and they’re also more likely to ask these questions out of hours.
“So if you think about all of this, the group that is benefiting the most from using AI tools are actually some of our groups that are finding it hard to access healthcare,” she said.
That raises a new challenge: how to make these tools safer without removing the access they provide.
Optimising health information for the AI era
Ms McMahon explained that Healthdirect was looking at how it could help improve the accuracy of these systems.
“What could improve consumers’ awareness about how they could use it for their helpful things, and what to look out for?
“Are there some guardrails or constraints that perhaps we can put in place to improve the safety and in a sensible and responsible way, hand into the Australian health system?” she asked.
Healthdirect is working on the assumption that AI will be part of the ecosystem, and is optimising its content for agentic use.
“We want to do that because we know our content is high quality, clinically governed and Australian centric. It’s relevant to Australians,” Ms McMahon said.
The organisation is also considering how its pathways and data stores could be used by consumers to make it easier for them to get personalised advice.
But making that possible depends on something the health system still struggles with: usable data.
Patients want their data but the system can’t always give it
Several speakers pointed to the growing consumer expectation of access to their own health information, particularly as My Health Record expands under Sharing by Default.
One panel asked what would happen if a patient arrived with a USB stick and asked for all their hospital records.
The consensus: it would not be easy.
As Telstra Health’s CTO Farhoud Salmi said, there isn’t a button you can press to get access to data.
“The data has been in silos, and still is in silos, and will be for a long while until we solve some of these problems.”
However, it’s not just getting the data out that’s the problem. It’s the accuracy of it.
“Inaccurate data into these models will actually create bigger problems for the patients and also the clinicians. So getting the right quality of data into the model is probably the challenge that we have,” he said.
The undeclared third party in the consultation
Beyond infrastructure, speakers warned of a more subtle risk that can happen inside the consultation itself.
Aspen Medical’s CMO, Dr Katrina Sanders, described a case where a patient used ChatGPT before seeing their doctor.
The AI suggested pancreatitis as a possible cause of mild gastrointestinal symptoms. The patient arrived worried about that diagnosis but never mentioned the chatbot.
“The challenge is, there is an undeclared third party in the consultation,” Dr Sanders highlighted.
It means the patient is more informed, but they then start to question why the doctor hasn’t asked about pancreatitis.
“It starts to erode trust. And trust is the foundational element of the therapeutic relationship between the doctor and the patient. And when the therapeutic relationship breaks down, the impact is with the patient,” she said.
Some solutions: ask, declare, document
Despite the concerns, speakers said the answer is not to resist consumer AI use, but to bring it into the open.
From Dr Sanders’ perspective, the first step is transparency.
“The first thing we’re saying is declare it. Be really open and transparent as a patient or a practitioner or as a leader, about your use of AI.
“Declare it because it’s the undeclared use that is particularly worrying,” Dr Sanders said.
Clinicians may need to routinely ask whether patients have used AI tools, document what they have been told, and correct misinformation early in the consult.
Tanya Kelly, deputy director-general of eHealth Queensland, said the system needs to work with consumers, not against them.
“There are groups of providers that are really looking to work with those tools and with consumers with those tools to help them, work with them, rather than against them.
“So I think that’s where the consumer place for technology is all in that arena of trust,” she said.
For Mr Taggart, the rise of AI-assisted patients was not something to fear.
“I think it’s exciting that consumers have tools that help them understand what’s going on in their own lives,” he said.
“It balances the scales in healthcare. It makes self-management a real thing.”
But as many at the conference agreed, that future depends on the system learning to adapt just as fast as its patients have.