A ‘powerful’ tool: Commercially available AI significantly outperforms radiologist reads for trauma imaging

Commercially available AI software was recently found to be significantly more accurate than radiologists at detecting an array of skeletal lesions on trauma radiographs.

The software was put to the test on nearly 5,000 trauma radiographs, which it assessed for the presence of fractures, dislocations, elbow effusions and focal bone lesions (FBL). In some cases, it outperformed radiologists by as many as 82 percentage points.

Aside from fractures, radiologists are also tasked with assessing trauma radiographs for dislocations, bony lesions and joint effusions. Trauma imaging is particularly prone to reader error, but experts believe that artificial intelligence applications could support radiologists in avoiding costly mistakes.

They shared evidence to support their case recently in the European Journal of Radiology.

“Trauma X-ray reading is often delayed or assigned to non-expert readers due to the lack of experienced radiologists and an increased workload in imaging. Therefore, the rate of diagnostic errors on trauma radiographs is one of the highest, potentially leading to malpractice claims,” corresponding author on the study Nor-Eddine Regnard, of the South Ile-de-France Imaging Network, and colleagues shared. 

To assess how AI compares with radiologists in detecting traumatic injuries on X-rays, the experts used a commercially available software, BoneView, to retrospectively analyze three consecutive months of radiographs from 14 different centers within a private imaging group. Each exam was performed after a traumatic injury to the limbs or pelvis. Radiologists' reports were used to compare detection performance, and a senior skeletal radiologist reviewed any resulting discrepancies.

AI outperformed radiologists in lesion-wise sensitivity for fracture detection, at 98.1% compared to 73.7%. This trend continued for identifying dislocations (89.9% vs. 63.3%), elbow effusions (91.5% vs. 84.7%) and focal bone lesions (98.1% vs. 16.1%, the 82-point gap noted above). Radiologist specificity was always 100%, while the software's ranged from 88% to 99.8%, and negative predictive values were high as well.
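For readers less familiar with these metrics, the minimal sketch below shows how sensitivity, specificity and negative predictive value are derived from confusion-matrix counts. The counts in the example are hypothetical and are not taken from the study; they are chosen only to illustrate how a 73.7% sensitivity figure can coexist with 100% specificity.

```python
# Illustrative only: computing lesion-wise sensitivity, specificity and
# negative predictive value (NPV) from confusion-matrix counts.
# The counts below are hypothetical and do NOT come from the study.

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Return sensitivity, specificity and NPV as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),  # share of true lesions that were detected
        "specificity": 100 * tn / (tn + fp),  # share of lesion-free cases called negative
        "npv": 100 * tn / (tn + fn),          # share of negative calls that are truly negative
    }

# Example: a reader who finds 737 of 1,000 true fractures and raises no false
# alarms across 4,000 lesion-free exams posts 73.7% sensitivity, 100% specificity.
print(diagnostic_metrics(tp=737, fp=0, tn=4000, fn=263))
```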

Authors of the study highlighted the fact that their research offers a real-world comparison of radiologists and AI, stating that when coupled with human readers, AI could be a “powerful tool” in clinical practice that increases accuracy and speeds up read times. 

The study abstract can be viewed here.

Hannah Murphy

In addition to her background in journalism, Hannah also has patient-facing experience in clinical settings, having spent more than 12 years working as a registered rad tech. She joined Innovate Healthcare in 2021 and has since put her unique expertise to use in her editorial role with Health Imaging.

