Australia’s privacy laws not fit for purpose in a digital health world



Why do we happily give our wellness data to a smartphone app, yet distrust governments when it comes to sharing our health data?


Trying to fit 21st century digital health interventions into 20th century privacy laws is leading to “sloppy workflows, uninformed patients” and a technical system “too hard for people to actually use”, says an industry expert.

Danielle Bancroft, former chief product officer for practice management system software developer Best Practice, told delegates at the AIDH’s HIC2025 conference in Melbourne recently that Australia’s current privacy laws were no longer fit for purpose.

“Our privacy laws were written for static databases and paper workflows,” she said.

“They weren’t actually created for continuous data streams, AI algorithms or the use cases we’re trying to use now, like interoperability and sharing of data.

“The problem with that is that we’re trying to take 21st century tech and put it through 20th century rules. It’s like putting a rotary dial on an iPhone. It just doesn’t fit.

“We keep trying to rebuild systems and transform different industries, but we’re copying the paper workflow instead of taking a step back and thinking about, yes, that fit that system, but maybe we can change and do better when we look at a technical workflow.

“Regulations are extremely important. There should absolutely be some form of mandatory cybersecurity and privacy impact assessments. They’re not done enough in health.”

Ms Bancroft was speaking on a panel about the value of privacy and data protection to patients, providers and clinicians.

“Do we actually need to review the privacy laws in line with our digital tech and where we’re headed? Absolutely,” she said.

“I don’t think they’re fit for purpose, for what we’re actually trying to build in health today, and that’s where we keep tripping over ourselves.

“Consent is broken because we’re trying to retrofit paper workflows and what we think informed consent is at the time, but we end up with sloppy workflows, uninformed patients, and a technical system that makes it too hard for people to actually use.”

Ms Bancroft was asked if the added cost of privacy impact assessments was achievable in an environment where the regulatory burden for providers was already high.

“It’s feasible, but we need to be able to put in those systems and dedicate the investment,” she said.

“As a vendor, it’s a two-way system. There’s got to be a push from the government.

“At some point, the government does have to step in and provide regulatory frameworks that force vendors to actually invest in these systems, because if we don’t do it today, the cost down the line is far greater.

“Yes, it’s expensive to uplift, but if you’re given enough notice and you have time to work with governments around what those guardrails look like, you actually can build incremental roadmaps that are in line with your existing systems and development to be able to improve incrementally on the product as you go.

“It’s not ‘big bang’. The industry is not sitting here going, we don’t want to uplift. We don’t want to do it.

“We all know it’s worthwhile. It’s being able to have some maturity around how we’re going to get there and incrementally working together to get there.”

Peter Worthington-Eyre, formerly the chief data officer for the South Australian government, told the conference the question was even more fundamental than the technicalities of privacy assessments.

“I don’t think it’s one ‘law versus technology’ discussion,” he said.

“The whole ecosystem [needs to] go back to basics and say, what is the healthcare system for? Is it delivering that? And then how do we design the laws, policies and frameworks around that?”

Also part of the panel discussion was the apparent cognitive dissonance in the way Australians will or won’t agree to sharing their health data, particularly with governments.

“On the one hand, we trust health tech companies with almost all of our data and our most intimate of thoughts, and yet we struggle with the thought that we might give the data to government,” said Tim Blake, the managing director of Semantic Consulting, and the moderator of the panel.

“We give data to the people who are very explicit that they will sell it for profit, and we don’t want to give it to the people who are very explicit that they won’t necessarily even use it for secondary use.”

Ms Bancroft suggested the immediacy of the result might have something to do with that cognitive dissonance.

“Wellness data is not necessarily seen as being as sensitive as what we classify as health data, even though it can be just as personal,” she said.

“When we think about health systems, it can take several weeks for paperwork to be delivered. But when I’m tracking my steps or my heart rate, it’s right there – I have instant access.

“It can be a real perception disconnection for patients in understanding what their data is actually being used for, and how their data is going to be used in the long term, and that can often increase trust or decrease trust at the same time.”

Mr Worthington-Eyre suggested governments could increase public trust by being careful about the language they used.

“You get what you talk about,” he said.

“The private sector is talking about value, and governments talk about risk, and we need to talk about both.

“Part of it is that the consumer expectation is the value, and you see that playing out in the AI space – some countries are talking about the opportunity and the excitement, and others are talking about the guardrails and the risk.

“That’s changing some of the public narrative.

“How we ask things then becomes really critical. ‘Are you happy for us to share all of your private health information?’ ‘No.’ ‘Are you happy for me to share deidentified information, or, even more simply, information that won’t come back to you, to help other people in the same situation?’

“Nearly everyone says yes [to that].

“What we then need to do is provide some guarantees that that happens, and some assurance, but normalise that actually what we need to increase the body of knowledge is safe, deidentified sharing.”

Marina Yastreboff, president of the Australasian Society for Computers + Law, agreed that governments weren’t doing themselves any favours in the way they communicated with the public about privacy and consent.

“There’s always an assumption inherent in that question, that somehow our not giving information over to the government is because it’s unwarranted or unjustified,” she told the delegates.

“We have a history where perhaps government hasn’t worked in its own best interest.”

My Health Record was a case in point, she said.

“[Look at] the backlash over the swap from an opt-in system to an opt-out system. That opt-out mechanism wasn’t particularly clear, and it was time-bound. So, assuming you knew about it, did you actually exercise your so-called right before the time was up?

“There’s a lack of transparency around third-party access, which tapped into insurers having a look at that data, as well as law enforcement under certain provisions.

“In a nutshell, government can learn a lot from the private sector,” she said.

HIC2025 was held in Melbourne on 18, 19 and 20 August 2025.
