Bias against women in AI-generated healthcare. Who knew?

3 minute read


Nobody should be surprised by this news.


Here’s a shocker – artificial intelligence can be biased.

Sorry, are you okay? I should have given you a bit more of a soft entry into that one. Do you need to sit down? Chocolate? Cuppa?

In Florida, machine learning algorithms designed to diagnose a common infection that affects women showed a diagnostic bias among ethnic groups.

Published last week in the Nature journal npj Digital Medicine, the research claims to be the “first to evaluate fairness among these tools in connection to a women’s health issue”.

Maybe that’s because it only takes common sense to know two things – women get discriminated against in healthcare, and machine learning algorithms are programmed by humans, predominantly blokes.

Seriously. I continue to be astounded by the things academics are given money to study.

AI is “A” because it’s not naturally “I”, right? Someone has to program it. Garbage in, garbage out.

The researchers evaluated the fairness of machine learning in diagnosing bacterial vaginosis, or BV, a common condition affecting women of reproductive age, which has clear diagnostic differences among ethnic groups.

The University of Florida researchers pulled data from 400 women, comprising 100 from each of the ethnic groups represented — white, Black, Asian, and Hispanic. 

In investigating the ability of four machine learning models to predict BV in women with no symptoms, the researchers found the accuracy varied among ethnicities. Hispanic women had the most false-positive diagnoses, and Asian women received the most false-negative ones.

“The models performed highest for white women and lowest for Asian women,” said the authors. “This tells us machine learning methods are not treating ethnic groups equally well.”
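For anyone wondering what “not treating ethnic groups equally well” actually means in numbers, it boils down to tallying errors separately for each group. Here’s a minimal Python sketch of that idea – nothing to do with the study’s actual code, and the column names and toy figures are entirely made up:

```python
# Illustrative only: toy data, invented numbers, not the study's dataset or code.
import pandas as pd

# Hypothetical columns: ethnicity, true BV status (1 = infected), model prediction
df = pd.DataFrame({
    "ethnicity": ["White", "White", "Black", "Asian", "Hispanic", "Asian"],
    "bv_true":   [1, 0, 1, 1, 0, 0],
    "bv_pred":   [1, 0, 1, 0, 1, 0],
})

# Count hits and misses separately for each ethnic group
for group, sub in df.groupby("ethnicity"):
    tp = ((sub.bv_true == 1) & (sub.bv_pred == 1)).sum()  # correctly flagged
    tn = ((sub.bv_true == 0) & (sub.bv_pred == 0)).sum()  # correctly cleared
    fp = ((sub.bv_true == 0) & (sub.bv_pred == 1)).sum()  # false positives
    fn = ((sub.bv_true == 1) & (sub.bv_pred == 0)).sum()  # false negatives
    accuracy = (tp + tn) / len(sub)
    print(f"{group}: accuracy {accuracy:.2f}, false positives {fp}, false negatives {fn}")
```

If the accuracy and error counts come out noticeably different from one group to the next, the model is doing exactly what the Florida researchers reported: working better for some women than for others.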

No kidding, Sherlock.

BV, one of the most common vaginal infections, can cause discomfort and pain and happens when natural bacteria levels are out of balance. While there are symptoms associated with BV, many people have none, making it difficult to diagnose.

It doesn’t often cause complications, but in some cases, BV can increase the risk of sexually transmitted infections, miscarriage, and premature births.

The researchers said their findings demonstrated the need for improved methods of building AI tools to mitigate healthcare bias.

And again, I say, no kidding.

Is it just me, or are people inherently dumb when it comes to understanding the problems with artificial intelligence’s exponential development over the past couple of years?

We need governance to catch up with the technology, or it’s Skynet here we come.
I’m looking forward to the release tomorrow of a national AI roadmap by the Australian Alliance for Artificial Intelligence in Healthcare.

We are so far behind on this stuff it’s actually mildly terrifying. Stay tuned.

Send your story ideas to cate@medicalrepublic.com.au for a tickle from a Terminator.