AI helps radiologists and non-specialists detect fractures on X-rays, but experts urge caution
Radiologists and nonradiologists alike performed better at spotting fractures on radiographs with the help of artificial intelligence, without requiring additional interpretation time.
In fact, readers’ sensitivity for detecting fractures jumped by more than 10% when using AI, with no loss in specificity, researchers reported Tuesday in Radiology. The AI-assisted interpretations were also completed 6.3 seconds faster per patient than unaided reads.
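For readers unfamiliar with the metrics, sensitivity is the fraction of true fractures correctly flagged, while specificity is the fraction of fracture-free exams correctly cleared. The short Python sketch below uses hypothetical confusion-matrix counts, not figures from the study, to show how a roughly 10-point sensitivity gain with unchanged specificity would be computed.

```python
# Illustrative only: hypothetical counts, not data from the study.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of actual fractures that were detected."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of fracture-free exams correctly read as negative."""
    return true_neg / (true_neg + false_pos)

# Suppose 200 exams contain a fracture and 280 do not (hypothetical split).
unaided_sens = sensitivity(true_pos=150, false_neg=50)   # 0.75
aided_sens   = sensitivity(true_pos=172, false_neg=28)   # 0.86
unaided_spec = specificity(true_neg=252, false_pos=28)   # 0.90
aided_spec   = specificity(true_neg=252, false_pos=28)   # 0.90, i.e. no loss

print(f"Sensitivity: {unaided_sens:.0%} -> {aided_sens:.0%} "
      f"(+{(aided_sens - unaided_sens) * 100:.0f} points)")
print(f"Specificity: {unaided_spec:.0%} -> {aided_spec:.0%} (unchanged)")
```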
Overlooked fractures on X-rays are a persistent problem in acute trauma settings and can account for up to 24% of harmful diagnostic errors in emergency departments. AI can help triage exams in these busy settings while also serving as a safety net for providers.
“The AI-assisted fracture recognition … has the potential to enhance diagnostic ability of both radiologists and nonradiologists, not only by detecting subtle findings difficult to visualize with human eyes but also by preventing cognitive errors due to human fatigue or satisfaction bias in image interpretation,” Ali Guermazi, MD, PhD, with Boston University School of Medicine’s Department of Radiology, and co-authors explained.
For their retrospective study, Guermazi et al. developed an algorithm using 60,170 radiographs from trauma patients treated across 22 health centers. They then put the tool to the test on 480 radiographic exams from multiple institutions, tasking 24 readers, a mix of radiologists, orthopedists, emergency physicians and other providers, with interpreting each exam both with and without AI assistance.
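To make that reading-study design concrete, the minimal sketch below simulates the paired comparison it implies: each of the 24 readers interprets the same set of exams with and without AI, and per-reader sensitivities are then compared. All values are synthetic and invented for illustration; this is not the study’s data or analysis code.

```python
# Minimal sketch of a paired (crossover) reader comparison.
# All values are synthetic; they do not come from the study.
import random

random.seed(0)
N_READERS = 24
N_FRACTURE_EXAMS = 200  # hypothetical number of exams containing a fracture

results = []
for reader in range(N_READERS):
    # Simulate each reader's hit count without and with AI assistance,
    # assuming (for illustration) AI adds roughly 10 points of sensitivity.
    base = random.uniform(0.70, 0.80)
    without_ai = sum(random.random() < base for _ in range(N_FRACTURE_EXAMS))
    with_ai = sum(random.random() < base + 0.10 for _ in range(N_FRACTURE_EXAMS))
    results.append((without_ai / N_FRACTURE_EXAMS, with_ai / N_FRACTURE_EXAMS))

mean_without = sum(r[0] for r in results) / N_READERS
mean_with = sum(r[1] for r in results) / N_READERS
print(f"Mean sensitivity without AI: {mean_without:.1%}")
print(f"Mean sensitivity with AI:    {mean_with:.1%}")
print(f"Mean paired gain:            {(mean_with - mean_without) * 100:.1f} points")
```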
In an accompanying editorial, two musculoskeletal specialists highlighted the sensitivity gains and the time AI could save busy radiologists.
Thomas M. Link, MD, PhD, and Valentina Pedoia, PhD, both with UCSF’s Department of Radiology and Biomedical Imaging, also expressed excitement about AI’s potential to address growing workloads and overnight interpretation demands.
At the same time, the pair pointed out a problem common to most AI studies: the ground truth. Here, ground truth was established by musculoskeletal radiologists’ reads, a convention the editorialists described as adopted for convenience rather than resting on “solid” scientific evidence.
Going forward, studies will need to overcome this issue before such AI tools can advance into clinical practice.
“Careful implementation is required to include AI algorithms in radiologists’ workflow, simplifying analysis and making readouts faster. However, at this stage, it is premature to engage in AI stand-alone techniques,” Link and Pedoia argued Tuesday. “Many of the current AI studies do not have a rigorous study design and include an imperfect ground truth, which is not suitable in a clinical environment.”