AI spots missed findings on chest X-rays, aiding nonradiologists in emergency setting

Artificial intelligence (AI) can support nonradiologist clinicians in reading and interpreting chest X-rays, serving as a second pair of eyes, a study out of Germany found.

Researchers examined the use of AI in an emergency unit that often lacked a radiologist, leaving other providers to interpret the images themselves. Used as a decision support tool, a convolutional neural network (CNN) model was found to help nonradiologists reach more accurate primary diagnoses when reading chest X-rays.

Accurate, swift diagnosis supports early treatment interventions, said the researchers, led by Jan Rudolph, MD, of Ludwig Maximilian University of Munich. The full study findings are published in the journal Chest. [1]

Despite being the primary imaging modality for assessing a variety of diseases that afflict the chest, X-rays can be difficult to interpret, especially for nonradiologists who may lack the necessary experience. Rudolph and his team sought to add to a growing number of studies that reveal ways in which AI can fill the diagnostic gap. 

For their study, three board-certified radiologists examined 563 chest X-rays. The same images were also examined by three radiology residents and by three nonradiology residents with experience working in an emergency unit.

The researchers tasked the nonradiologists with recognizing pleural effusion, pneumothorax, pneumonia and lung nodules, all of which are common emergency diagnoses and ones the CNN model was able to identify with high accuracy, ranging from 0.95 for lung nodules to 0.995 for pleural effusion.

When supported by AI, the nonradiology group achieved higher accuracy and closer agreement with the expert consensus. Specifically, AI support improved diagnostic sensitivity by 53% for findings that might otherwise have been missed, along with a 7% increase in accuracy for specific diagnoses.
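For readers unfamiliar with how such reader-study metrics are derived, the short sketch below shows one common way to compute sensitivity, accuracy and an area-under-the-curve (AUC) score for a single finding. It is purely illustrative: the labels, scores and the pleural effusion example are made up and do not come from the study's data or its actual analysis pipeline.

```python
# Illustrative only: how sensitivity, accuracy, and AUC are commonly
# computed for one finding (e.g., pleural effusion) in a reader/AI study.
# The values below are invented and are not from the study.
import numpy as np
from sklearn.metrics import roc_auc_score

ground_truth = np.array([1, 0, 1, 1, 0, 0, 1, 0])    # 1 = finding present on the X-ray
reader_calls = np.array([1, 0, 1, 0, 0, 0, 1, 0])    # binary reads by a clinician
ai_scores    = np.array([0.92, 0.10, 0.85, 0.40, 0.05, 0.30, 0.77, 0.20])  # AI confidence

# Sensitivity: proportion of true findings the reader actually called.
true_pos = np.sum((reader_calls == 1) & (ground_truth == 1))
sensitivity = true_pos / np.sum(ground_truth == 1)

# Accuracy: proportion of all images read correctly.
accuracy = np.mean(reader_calls == ground_truth)

# AUC: threshold-independent measure of how well the AI's scores rank
# positive cases above negative ones.
auc = roc_auc_score(ground_truth, ai_scores)

print(f"sensitivity={sensitivity:.2f}, accuracy={accuracy:.2f}, AUC={auc:.3f}")
```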

The improvements for the radiologist groups were considerably smaller, with the researchers calling the gains in accuracy and sensitivity "mostly nonsignificant." The AI was most useful for flagging findings that might otherwise have been missed, especially for the nonradiologist group, the authors noted.

“In this case, the number of potentially missed findings could be significantly reduced,” the authors wrote. 

Their final conclusion on the results of testing the CNN model was specific:

“In an [emergency unit] setting without 24/7 radiology coverage, the presented AI solution features an excellent clinical support tool to nonradiologists, similar to a second reader, and allows for a more accurate primary diagnosis and thus earlier therapy initiation,” they concluded. 


Chad Van Alstin, Health Imaging | Health Exec

Chad is an award-winning writer and editor with over 15 years of experience working in media. He has a decade-long professional background in healthcare, working as a writer and in public relations.
