Did you say that or did AI say that?



The AMA has called on the government to take serious action against the use of AI to impersonate well-respected medical professionals.


Concerns have been raised over a lack of action on the use of AI in producing ‘deepfake’ videos to push misinformation. 

Deepfake technology has reportedly been used to impersonate well-respected medical figures and peddle unproven, potentially harmful treatments for serious diseases. 

Many clinicians who have been impersonated in these deepfake videos have questioned the government's inaction, highlighting how the practice defames their professional reputations. 

“We are now living in an age where any video that appears online has to be questioned – is it real, or is it a deepfake?” AMA president Dr Danielle McMullen said. 

“Deepfake videos are becoming more and more convincing, and this technology is being exploited by dodgy companies peddling snake oil to vulnerable people who are dealing with serious health issues.” 

The issue has prompted the AMA to write to the federal communications minister, urging the government to enact clear legislation controlling the use of AI in health-related advertising. 

Such laws would further regulate medical-related content on social media and digital platforms to combat online misinformation. 

The AMA has proposed a regulatory framework that would include mandatory identification of the individuals or companies responsible for any online promotional material for medical goods or services. 

Further safeguards for online users have also been suggested, such as a portal for reporting such material and stronger unsubscribe mechanisms for medical content. 

Beyond defamation, the primary concern is how this content could damage patients' relationships with general practice and create unnecessary distrust. 

“These are harmful to people’s health, disruptive to the healthcare system, and undermine the credibility of the healthcare profession,” Dr McMullen told The Medical Republic. 

“It’s really important that we see stronger measures to prevent these being uploaded in the first place, make it easier for them to be taken down, and strengthen the penalties against those scammers out there who are putting up fake content. 

“We’ve written to the communications minister, but we need governments to take urgent action to make it harder to put these videos up. 

“Also to consider things like credentialing healthcare videos online, to make it easier to take these fake videos down, and to strengthen the penalties for those who are putting them up. 

“We’ve also had examples from Dr Norman Swan and Dr Kerryn Phelps, who’s also a GP, with increasingly convincing videos of them purportedly telling people to stop taking their usual medicines. 

“It’s deceptive behaviour and risks not only patients’ pockets, but also their health if they’re stepping away from their usual treatments.”
