Deep learning detects, segments, classifies breast tumors with 93% accuracy
An integrated computer-aided diagnosis (CAD) system developed by researchers at Kyung Hee University and Sungkyunkwan University in South Korea could outperform the conventional deep learning methods currently used to detect, segment and classify tumors in digital x-ray mammograms.
Classifying masses as benign or malignant can be time-consuming for radiologists. A supplemental reading by a second expert, in this case a computerized one, may increase overall accuracy and specificity while reducing false positives and false negatives, wrote lead author Mugahed Al-antari, PhD, a professor of biomedical engineering at Kyung Hee University, and colleagues in research published in the August issue of the International Journal of Medical Informatics.
The researchers’ CAD system comprises three main deep learning components to detect, segment and classify a breast mass from an entire mammogram.
To detect a breast mass, the system first applies a regional deep learning model named "You-Only-Look-Once" (YOLO), which proposes the location of the mass. A full-resolution convolutional network (FrCN) then segments the detected mass.
Lastly, a deep convolutional neural network (CNN) classifies the segmented mass as either benign or malignant. The CAD system's accuracy was evaluated on the INbreast database, according to the researchers.
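The three-stage pipeline can be sketched in code. This is a minimal illustration of the data flow only, not the authors' implementation: the function bodies below are placeholder assumptions standing in for the trained YOLO, FrCN and CNN models, and all names (`detect_masses`, `segment_mass`, `classify_mass`, `cad_pipeline`) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Set, Tuple

@dataclass
class MassDetection:
    # Bounding box proposed by the detector: (x, y, width, height)
    box: Tuple[int, int, int, int]

def detect_masses(mammogram) -> List[MassDetection]:
    """Stage 1: a YOLO-style regional detector proposes mass bounding boxes.
    Placeholder: returns one fixed box for illustration."""
    return [MassDetection(box=(10, 10, 4, 4))]

def segment_mass(mammogram, detection: MassDetection) -> Set[Tuple[int, int]]:
    """Stage 2: an FrCN-style network produces a pixel-level mask within the box.
    Placeholder: marks every pixel inside the box as mass."""
    x, y, w, h = detection.box
    return {(i, j) for i in range(x, x + w) for j in range(y, y + h)}

def classify_mass(mammogram, mask: Set[Tuple[int, int]]) -> str:
    """Stage 3: a CNN labels the segmented mass benign or malignant.
    Placeholder: always returns 'benign'."""
    return "benign"

def cad_pipeline(mammogram):
    """Run detection -> segmentation -> classification end to end,
    returning (box, mask size, label) per detected mass."""
    results = []
    for det in detect_masses(mammogram):
        mask = segment_mass(mammogram, det)
        label = classify_mass(mammogram, mask)
        results.append((det.box, len(mask), label))
    return results
```

The point of the sketch is the staged hand-off: each component consumes the previous stage's output, so the classifier only ever sees pixels the segmenter attributed to the mass.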
Al-antari and colleagues' CAD system performed with an overall 99 percent mass detection accuracy, 93 percent mass segmentation accuracy and 96 percent mass classification accuracy.
"Our proposed CAD system could handle all stages of detection, segmentation, and classification with a higher performance and in a faster time than others with a total testing time for all stages," Al-antari et al wrote. "Therefore, the performance of the proposed CAD system seems to make its practical application possible."