Give AI access to medical records? What could possibly go wrong?
Your Back Page scribbler is of an age where he takes most predictions about how technological advances will make our lives better with a grain of salt.
That is to say, as a child we firmly believed we’d all have flying cars like in The Jetsons by now, and we have never really got over the disappointment.
So when it comes to the extravagant and self-serving prognostications made around artificial intelligence, we apply the salt in industrial quantities.
How can we not when we learn that Google’s state-of-the-art AI Overviews feature can’t even understand how calendars work?
When it was recently asked: “Is 2027 next year?” it confidently replied: “No, 2027 is not next year; 2027 is two years away from the current year (2026), meaning next year is 2028, and the year after that is 2027, which is a common year starting on a Friday.”
WT actual F!?
You must forgive us then as we raise a sceptical eyebrow when told by techno-boffins that it would be a wonderful idea to load our medical records into OpenAI’s brilliant new feature called ChatGPT Health on the promise that doing so will “generate responses more relevant and useful to you”.
Despite a plethora of evidence showing that AI chatbots, in their current iterations, are prone to dispensing nonsensical, surreal and downright dangerous medical advice, the geniuses at OpenAI remain undaunted.
And it’s not just OpenAI who are plunging headlong into uncharted waters.
Last year, everyone’s favourite gazillionaire Elon Musk asked folks to upload their medical data to his AI model, pertinently named Grok, leading to a flood of confusion as users received hallucinated diagnoses after sharing their X-rays and PET scans.
To be fair to OpenAI, the company does include some heavy-duty caveats for its product, including tellingly saying that the tool is “not intended for diagnosis or treatment”.
Instead, the company says on its website that: “ChatGPT Health helps people take a more active role in understanding and managing their health and wellness – while supporting, not replacing, care from clinicians.”
Frankly, if you’re prepared to swallow that load of egregious bulldust, we’ve got a harbour bridge to sell you. Diagnosis and treatment are EXACTLY what people are going to use this feature for!
Far be it from us to try to protect folks from their own stupidity, but surely those we elect to make decisions on our behalf for the greater good need to take a long, hard look at what’s going on here.
A laissez-faire approach to technology regulation is all well and good, right up until someone loses an eye.
Not to mention the privacy and data-protection issues raised by uploading one’s medical information to private, profit-driven enterprises with underwhelming track records in the data-security stakes.
As always, we are not holding our breath in the hope that someone reins in this bolting horse. Rather, we are strapping ourselves in for what is promising to be a wild ride.
Don’t wait until it’s 2028, send your story tips now to Holly@medicalrepublic.com.au.
