Thursday, March 12, 2026

‘Rectal garlic insertion for immune support’: Medical chatbots confidently give disastrously misguided advice, experts say

Modern AI chatbots often fail to recognize false health claims when they’re delivered in confident, medical-sounding language, leading to dubious advice that could be dangerous to the general public, such as a recommendation that people insert garlic cloves into their butts, according to a January study in the journal The Lancet Digital Health. Another study, published in February in the journal Nature Medicine, found that chatbots were no better than an ordinary web search.

The results add to a growing body of evidence suggesting that such chatbots aren’t reliable sources of health information, at least for the general public, experts told Live Science.
