Clinicians want AI to show its work

A large majority of radiologists and neurologists support the use of artificial intelligence triage assistance tools, but most are hesitant to let the technology fly solo.

The U.S. Food and Drug Administration has now cleared nearly 1,000 medical AI algorithms, two-thirds of which are tailored to radiology. Many of these algorithms are already being implemented to address a wide swath of needs, including worklist prioritization, improved identification of difficult-to-detect tumors and several imaging triage tasks.

Today, integrating AI into clinical workflows is less of a taboo topic than it was a decade ago, but many barriers still stand between algorithm approval and widespread implementation. One of them is trust in the available products, authors of a new analysis in the European Journal of Radiology signaled.

“Clinicians, especially radiologists, are important stakeholders in the successful integration of AI imaging tools into clinical workflows. As a result, several studies have surveyed these clinicians to evaluate their attitudes and perceptions on AI. Most studies demonstrated an overall positive attitude and support for implementing AI into radiology. However, multiple barriers to integration were also raised,” Thomas C. Booth, with the School of Biomedical Engineering & Imaging Sciences at King’s College London, and colleagues note. “Although AI was perceived to have the potential to enhance clinical efficiency and improve patient care, this was contingent on an expectation for high AI performance and accuracy.” 

To gauge how much trust providers have in AI as a triage tool, the team surveyed 133 radiologists and neurologists regarding their confidence in AI to analyze brain MRI scans and correctly classify them as either normal or abnormal. Clinicians were provided the models’ explainability saliency maps and information on performance, training and validation data. 

For MRI brain scan triage, 71% of respondents signaled trust in AI, but only as an assistive tool rather than a standalone reader. Clinicians were more likely to trust AI when given information that explained its decision-making, especially a heat map that helped them better understand the tool's findings.
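
For readers unfamiliar with how such heat maps are generated, below is a minimal, hypothetical sketch of gradient-based saliency for an image classifier. The model, input size and class labels are placeholders, not the tools evaluated in the study.

    import torch
    import torch.nn as nn

    # Stand-in for a trained normal/abnormal brain MRI classifier (placeholder).
    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
    )
    model.eval()

    # Placeholder single-channel MRI slice; gradients are tracked so we can
    # ask which pixels most influence the prediction.
    scan = torch.rand(1, 1, 256, 256, requires_grad=True)

    logits = model(scan)
    logits[0, logits.argmax()].backward()  # gradient of the predicted class score

    # Saliency map: per-pixel sensitivity, overlaid on the scan as a heat map.
    saliency = scan.grad.abs().squeeze()   # shape: 256 x 256

In this kind of approach, the heat map highlights the regions the model's prediction is most sensitive to, which is the sort of visual explanation survey respondents said made them more comfortable trusting the tool.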

“This highlights the importance of providing clinicians with information on how AI models come to their decisions,” the authors note. “The findings suggest that clinical AI triaging tools should maintain the concept of explainability as a core priority throughout their development and implementation process to ensure that clinicians feel confident in its application in patient care.” 

The group suggests that explainability will likely be a key factor in organizations’ decisions regarding integrating AI as more products continue to enter the market. 

In addition to her background in journalism, Hannah also has patient-facing experience in clinical settings, having spent more than 12 years working as a registered rad tech. She began covering the medical imaging industry for Innovate Healthcare in 2021.
