'Transformer' AI model can read chest X-rays like a human radiologist

A new AI model can accurately interpret chest X-ray images by drawing on information that goes beyond the image itself, much like an experienced radiologist. The technology outperformed other AI models typically used to aid in the diagnosis of disease, according to a new study out of Germany published in Radiology. [1]

Developed by a team of scientists at University Hospital Aachen, the advanced AI uses what researchers call “transformer-based neural networks,” a technology initially developed to analyze human language and now powering platforms such as ChatGPT. Clinicians typically rely on both imaging and non-imaging data to diagnose disease, but existing AI-based approaches have been limited to handling one type of data at a time. The novel transformer model therefore represents a significant advancement, able to analyze clinical data alongside an X-ray image to reach a more informed decision, according to an announcement from RSNA.
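
The study's code is not reproduced in the announcement, but the general idea of fusing an X-ray with tabular clinical data inside one transformer can be sketched briefly. The PyTorch model below is purely illustrative: its layer sizes, the 30 clinical features, the 25-label output head, and all variable names are assumptions made for the sketch, not the architecture the Aachen team used.

```python
# Minimal sketch (not the authors' code) of a transformer that fuses a chest
# X-ray with non-imaging clinical data. Dimensions and names are illustrative.
import torch
import torch.nn as nn


class MultimodalChestXrayModel(nn.Module):
    def __init__(self, num_clinical_features=30, num_labels=25, dim=256):
        super().__init__()
        # Image branch: a strided convolution turns the X-ray into a 14x14 grid of patch tokens.
        self.image_encoder = nn.Conv2d(1, dim, kernel_size=16, stride=16)
        # Clinical branch: each scalar (age, lab value, vital sign, ...) becomes one token.
        self.clinical_embed = nn.Linear(1, dim)
        # A learned classification token summarizes the fused sequence.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=4)
        self.classifier = nn.Linear(dim, num_labels)  # one logit per possible diagnosis

    def forward(self, xray, clinical):
        # xray: (batch, 1, 224, 224); clinical: (batch, num_clinical_features)
        img_tokens = self.image_encoder(xray).flatten(2).transpose(1, 2)   # (B, 196, dim)
        clin_tokens = self.clinical_embed(clinical.unsqueeze(-1))          # (B, 30, dim)
        cls = self.cls_token.expand(xray.size(0), -1, -1)
        tokens = torch.cat([cls, img_tokens, clin_tokens], dim=1)
        fused = self.transformer(tokens)      # attention mixes both modalities
        return self.classifier(fused[:, 0])   # diagnosis logits from the summary token


model = MultimodalChestXrayModel()
logits = model(torch.randn(2, 1, 224, 224), torch.randn(2, 30))
print(logits.shape)  # torch.Size([2, 25])
```

Because attention operates over the whole token sequence, image patches and clinical values can inform one another before the final classification layer, which is what distinguishes this kind of fusion from running two separate models.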

"Transformer models form a more general type of neural network, making them particularly well-suited for medical applications where various variables, such as patient data and imaging findings, are frequently integrated into the diagnostic process,” lead study author Firas Khader, a PhD student at University Hospital Aachen, explained. “They rely on a so-called attention mechanism, which allows the neural network to learn about relationships in its input.”

Khader and the other researchers tailored the model specifically for medical use, training it on two extensive databases containing the clinical histories of more than 82,000 patients.

To conduct their research, the team trained the AI to diagnose up to 25 different diseases, first using non-imaging clinical data or images alone, and then compared those results against a multimodal model that combined both. The researchers found the multimodal variant consistently delivered better diagnostic accuracy across all conditions tested, coming in at 77%, versus 70% for AI that used chest X-rays alone and 72% when given only clinical data and no image.
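
As a rough illustration of the comparison described above (not the study's evaluation code), the snippet below computes mean per-condition diagnostic accuracy for two hypothetical sets of model outputs; the toy data, the 0.5 threshold, and the variable names are assumptions made for the example.

```python
# Toy comparison of unimodal vs. multimodal predictions (illustrative only).
import numpy as np


def mean_diagnostic_accuracy(predicted_probs, true_labels, threshold=0.5):
    # predicted_probs, true_labels: (num_patients, num_conditions)
    predictions = (predicted_probs >= threshold).astype(int)
    per_condition_accuracy = (predictions == true_labels).mean(axis=0)
    return per_condition_accuracy.mean()


rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=(1000, 25))                    # 25 conditions
image_only_probs = rng.random((1000, 25))                       # stand-in predictions
multimodal_probs = np.clip(labels + rng.normal(0, 0.6, labels.shape), 0, 1)

print(f"image-only accuracy: {mean_diagnostic_accuracy(image_only_probs, labels):.2f}")
print(f"multimodal accuracy: {mean_diagnostic_accuracy(multimodal_probs, labels):.2f}")
```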

Khader believes that the proposed model could serve as a blueprint for efficiently integrating large amounts of data into AI platforms.

"With patient data volumes steadily rising and the limited time available for doctors to spend on each patient, it is becoming increasingly challenging for clinicians to effectively interpret all available information," he said. "Multimodal models hold the promise to assist clinicians in their diagnoses by streamlining the aggregation of available data into precise and accurate diagnoses."

As radiologists face shortages worldwide, the study authors believe their AI “has potential as an aid to clinicians in a time of growing workloads,” with the study mapping out how machine learning could incorporate the data needed to mirror the judgment of an experienced clinician.

Chad Van Alstin, Health Imaging | Health Exec

Chad is an award-winning writer and editor with over 15 years of experience working in media. He has a decade-long professional background in healthcare, working as a writer and in public relations.

