When it comes to food safety standards, less might actually be more.
What’s the rule in your household for picking up and still eating food that’s dropped on the floor? Five-second rule? Three-second rule? Or is it: “Don’t be disgusting, chuck that in the bin!”
Your Back Page scrawler, we must admit, makes a separate executive decision on each occasion based on the type of foodstuff involved, the cleanliness of the floor in question, and how arsed we can be about preparing another delicious snack.
We attribute this blasé attitude to the largely unsupervised free-range childhood we had, which involved the making and consumption of mudpies as a rite of passage. Not once do we recall being struck down with food poisoning as a result.
Is it possible that when it comes to food safety, the strict standards adopted by wealthy Western nations may actually be doing more harm than good?
An international team of scientists led by Cornell University in the US certainly makes a compelling case for that being so.
Publishing this week in the journal Frontiers in Science, our boffins argue that overly stringent food safety measures and ultra-sensitive tests may be resulting in edible food being unnecessarily thrown away, creating excessive packaging and extra costs for consumers.
They posit that current “zero-detection” approaches to food pathogens should be replaced with evidence-based targets for “sufficiently safe” food, saying this approach could make food systems more sustainable without sacrificing public health.
“Although the public expects food to be completely safe, there will always be some risk of foodborne illness. Zero risk doesn’t exist, and we shouldn’t be aiming for that either,” lead author Professor Martin Wiedmann said in a media release.
“Just as we don’t limit highway speeds to 10 miles (16 kilometres) per hour to minimise road deaths, we need to take a balanced approach that considers possible negative consequences of extreme food safety measures,” he said.
The study authors explained that many rules and purchasing standards relied heavily on detecting a pathogen, sometimes treating any detection as unacceptable without fully considering dose, exposure, the food’s ability to support microbial growth, or who is most at risk.
For example, a food product might be considered contaminated if it tests positive for the bacterium Listeria monocytogenes, regardless of the actual levels found.
Ultra-sensitive tests, they argued, were detecting small amounts of microbes unlikely to cause disease in humans, resulting in those foods being thrown away.
“A tremendous amount of food is wasted that would have been sufficiently safe to eat. Too often, trade-offs such as environmental or economic costs are only considered after a traditional microbial risk assessment,” co-author Professor Sophia Johler, from Ludwig Maximilian University of Munich, told media.
“We cannot afford to carry on like this at a time when we desperately need to reduce our impact on the planet and assure not only food safety but food security.”
The study team suggested that specialists across social sciences, economics, and life sciences should work together to “establish values that align with consumers’ priorities”.
By using models built on geographic information, AI, and genomics, experts could assess, manage, and communicate risks far more accurately, they added.
Or as we would like to think of it: butter-side down on a sticky carpet is a “no” but butter-side up on clean kitchen tiles is a “yes”.
Send food safety advice and story tips to Holly@medicalrepublic.com.au.
