AI accurately distinguishes breast cancer on ultrasounds, but still can’t replace sonographers

A new artificial intelligence-based computer-aided detection model can accurately predict breast cancer based on ultrasound images, according to a study published June 9 in the Journal of Digital Imaging.

And while the convolutional neural network approach is far from perfect, it performed on par with trained sonographers, albeit with a higher rate of missed diagnoses. First author Heqing Zhang, with Sichuan University's West China Hospital, and colleagues say that, with further refinement, the platform could help lighten sonographers' ever-increasing workloads.

“Although the number of sonographers who perform ultrasonographic examinations, interpret the images, and issue diagnostic reports has increased, currently they cannot keep up with the growth in the requirement of ultrasound examinations,” the authors wrote, adding that increasing case numbers may leave more opportunities for errors. 

“The rapid development of artificial intelligence technology, such as deep learning, provides a new way to solve the aforementioned deficiency,” they added.

The team created four different CNNs, using a training set of 5,000 breast ultrasound images (2,500 benign and 2,500 malignant). A further 1,007 scans were used to test the prediction model.

Compared to the sonographers, who interpreted more than 680 ultrasounds, the model labeled "Inception V3" achieved a superior area under the curve (AUC) score (0.913 versus 0.846). However, the CNN failed to match the human readers' sensitivity of 90% and low missed-diagnosis rate of less than 10%.
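The area under the curve figures the researchers report can be computed directly from a classifier's predicted scores. As a minimal illustration (the labels and scores below are invented, not the study's data), the rank-based formula gives the probability that a randomly chosen malignant case is scored higher than a randomly chosen benign one:

```python
# Illustrative only: compute ROC AUC from predicted scores using the
# rank-based (Mann-Whitney) formulation. Example data is made up and
# has no connection to the study's images or model outputs.

def roc_auc(labels, scores):
    """AUC = probability that a random positive (malignant) case scores
    higher than a random negative (benign) case; ties count as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]            # 1 = malignant, 0 = benign
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]
print(roc_auc(labels, scores))          # one positive (0.4) ranks below one negative (0.5)
```

An AUC of 1.0 would mean every malignant case outscores every benign case; 0.5 is no better than chance, which is why the gap between 0.913 and 0.846 is meaningful.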

“In the prediction of breast cancer, the harm of missed diagnosis of breast cancer was far greater than that of misdiagnosis,” Zhang and colleagues noted. “Therefore, improving the sensitivity as far as possible with certain specificity is necessary.”
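The trade-off the authors describe comes down to choosing a decision threshold: lowering it catches more cancers (higher sensitivity) at the cost of more false alarms (lower specificity). A minimal sketch of that tuning, again using invented scores rather than anything from the study:

```python
# Sketch of the sensitivity/specificity trade-off: lower the decision
# threshold until sensitivity (recall on malignant cases) reaches a
# target, then report the specificity that remains. Data is invented.

def sens_spec(labels, scores, threshold):
    """Sensitivity and specificity when scores >= threshold are called malignant."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def threshold_for_sensitivity(labels, scores, target):
    """Highest threshold that still achieves the target sensitivity."""
    for t in sorted(set(scores), reverse=True):
        sens, spec = sens_spec(labels, scores, t)
        if sens >= target:
            return t, sens, spec
    return min(scores), sens, spec

labels = [1] * 5 + [0] * 5
scores = [0.95, 0.9, 0.7, 0.6, 0.3, 0.65, 0.5, 0.4, 0.2, 0.1]
t, sens, spec = threshold_for_sensitivity(labels, scores, target=0.8)
print(t, sens, spec)
```

Pushing the target sensitivity higher forces the threshold down, and specificity falls accordingly, which is why the authors frame the goal as improving sensitivity "with certain specificity" rather than maximizing either alone.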

All of the ultrasounds used in this research had to be hand-labeled by professionals, which was "inefficient" and is not feasible for future larger-scale research, the team noted among the study's limitations. Additionally, the inherent nature of the technology does not allow users to see how it distinguished between benign and malignant lesions, a fault commonly known as the "black box" problem.


Matt joined Chicago’s TriMed team in 2018 covering all areas of health imaging after two years reporting on the hospital field. He holds a bachelor’s in English from UIC, and enjoys a good cup of coffee and an interesting documentary.
