SIIM19: Is radiology’s data problem hurting AI?

Radiology and AI are becoming inseparable, but there’s still a major hurdle to overcome if they’re going to reach their full potential: the lack of high-quality, well-annotated data.

So explained a panel of experts speaking at the Society for Imaging Informatics in Medicine (SIIM) 2019 annual meeting in Colorado.

If algorithms are to be properly trained and validated, they must be fed and tested on high volumes of quality-labeled data—and these are not easy to obtain, said Judy W. Gichoya, MD, who recently finished an interventional radiology fellowship at Oregon Health & Science University’s Dotter Institute.

At her institution, for example, Gichoya estimated that gathering the data for a breast MRI study would take up to six months, even with colleagues working every night to access archived images.

Datasets released by the NIH and Stanford University’s Center for Artificial Intelligence in Medicine and Imaging (AIMI) are among the most robust publicly available datasets to date, she said. The NIH set contains more than 112,000 x-rays with disease labels from 30,805 unique patients. Stanford has published three sets, one of which—CheXpert—contains 224,316 radiographs of more than 65,000 patients.

Still, Gichoya said, there isn’t enough well-curated data out there, and the imaging information that could be useful is often tucked away.

“For me, the reality is that it’s the wild west out there,” she said. “We have a lot of data, a lot of open source models, and a lot is still within institutions. I’m not so sure we really understand how to make it accessible.”

But there are certainly organizations that have undertaken the laborious task of annotating data, with promising results. Matthew Lungren, MD, MPH, associate director of Stanford’s aforementioned AIMI center, offered one such example at SIIM.

Earlier this month, Lungren and his team at Stanford published their deep learning model that helps radiologists detect intracranial aneurysms on CT angiography (CTA) exams. With the tool, eight readers identified six more aneurysms in a set of 100 scans. Each reader also showed notable gains in sensitivity and accuracy with the tool.

“We labeled, by hand, every voxel (the 3D equivalent to a pixel) with whether or not it was part of an aneurysm,” said Chris Chute, a co-lead author of the paper. “Building the training data was a pretty grueling task and there were a lot of data.”
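The Stanford team’s annotation pipeline isn’t detailed in the paper coverage here, but as a rough, hypothetical sketch, voxel-level labels of this kind are commonly stored as a binary mask with the same shape as the CT volume. The array dimensions and the use of NumPy below are assumptions for illustration only.

```python
# Illustrative sketch (not the Stanford team's code): voxel-level aneurysm
# labels stored as a binary mask aligned with a CT angiography volume.
import numpy as np

# Hypothetical volume dimensions (slices, height, width); real CTA shapes vary.
volume = np.zeros((220, 512, 512), dtype=np.float32)  # CT intensity values
mask = np.zeros(volume.shape, dtype=np.uint8)         # 0 = background, 1 = aneurysm

# An annotator marking a small region as aneurysm sets those voxels to 1.
mask[100:110, 250:270, 300:320] = 1

# Aneurysms occupy a tiny fraction of the volume, which is part of what
# makes hand-labeling every voxel so laborious.
print(mask.sum() / mask.size)
```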

Lungren admitted he speaks from an academic perspective, where there is more funding, more willing hands and “less mouths to feed” than a typical clinical setting. But even he has his concerns about the promise of AI.

“I will go on the record and say I’ve still yet to see a use case that makes me say, ‘Oh wow, that’s really great, it’s really going to help somebody with the task the algorithm said it could accomplish.’”

""


