
Ontario Audit Exposes AI Scribe Errors

15 May 2026 · By Pulse24 desk

What happened

Ontario's Auditor General found that the province's 20 approved AI scribe systems for healthcare providers routinely produced factual errors and fabricated content. Nine systems invented treatment suggestions, 12 inserted incorrect drug information, and 17 missed key mental health details. The provincial procurement process weighted domestic presence at 30% of the evaluation score, while medical note accuracy contributed only 4% and bias controls just 2%. Over 5,000 physicians use the program, and there is no mandatory attestation that AI-generated notes are manually reviewed.

Why it matters

Patient safety risk rises when procurement prioritises vendor location over system accuracy: medical note accuracy determined just 4% of the evaluation score for systems now used by thousands of physicians. The findings echo earlier research indicating that AI chatbots frequently misdiagnose patients in controlled studies. Security architects should treat the output of these systems as untrusted, given that bias controls and privacy assessments each accounted for only 2% of the evaluation.

Source · theregister.com