If you can’t dazzle ’em with brilliance, baffle ’em with BS: Whisper, an AI transcription tool used in hospitals, has trouble with “hallucinations,” aka making things up. A study found eight in 10 transcripts contained fabrications, including invented meds like “hyperactivated antibiotics.” PSA: If you saw a doctor or were in the hospital lately, sign into your patient portal and check the notes for any nonsense.
Tags: AI (artificial intelligence), health care, hospitals, Whisper