Algorithm performs at expert level when distinguishing between benign and malignant ovarian tumors

Researchers recently developed a deep learning algorithm that can distinguish malignant from benign ovarian tumors on ultrasound (US) with performance comparable to that of experts.

When tested on 422 women with ovarian masses, the algorithm achieved a higher area under the receiver operating characteristic curve (AUC) for predicting malignancy than both expert assessment and the Ovarian-Adnexal Reporting and Data System (O-RADS). The study's authors suggested these findings could support clinical decision making in the future of ovarian tumor assessment.

Various deep learning applications have shown high diagnostic performance for the detection and classification of tumors on computed tomography and magnetic resonance imaging, but few studies have analyzed their efficacy on ultrasound. The affordability and convenience of US have made the modality a frequent first choice for screening for ovarian cancer.

Radiologists' interpretations of ultrasound imaging are known to vary, which is why the American College of Radiology (ACR) developed O-RADS to provide structured risk stratification guidelines. Even with this classification system, however, image interpretation remains at least somewhat subjective. For these reasons, researchers sought to understand how deep learning could aid the often arduous task of diagnosing ovarian cancer on ultrasound.

“To date, there have been limited deep learning (DL) algorithms developed for assessing ovarian tumors, and most have been based on single-modal US images,” corresponding author Wei-Wei Feng, from the Department of Obstetrics and Gynecology at Ruijin Hospital in China, and co-authors explained. “In clinical practice, the diagnosis of ovarian cancer involves multiple US images, including gray scale, color Doppler, and power Doppler US images.” 
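To give a rough sense of what a multimodal approach like this involves, the sketch below shows one way a two-branch network could take a paired gray scale and color Doppler image and output a single malignancy probability. This is not the authors' architecture; the DualBranchClassifier name, layer sizes, and input shapes are assumptions chosen purely for illustration.

```python
import torch
import torch.nn as nn

class DualBranchClassifier(nn.Module):
    """Illustrative two-branch CNN: one branch per US modality, fused before the classifier head."""
    def __init__(self):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.gray_branch = branch()       # processes the gray scale US image
        self.doppler_branch = branch()    # processes the color Doppler US image
        self.head = nn.Linear(32 + 32, 1) # single logit; sigmoid gives probability of malignancy

    def forward(self, gray_img, doppler_img):
        feats = torch.cat([self.gray_branch(gray_img), self.doppler_branch(doppler_img)], dim=1)
        return self.head(feats)

# Example forward pass on random tensors standing in for a batch of paired US images
model = DualBranchClassifier()
gray = torch.randn(4, 3, 224, 224)
doppler = torch.randn(4, 3, 224, 224)
prob_malignant = torch.sigmoid(model(gray, doppler))
```

The design choice illustrated here, separate feature extractors whose outputs are concatenated before a shared classifier, is one common way to combine imaging modalities; the study itself may have fused the images differently.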

For this study, the researchers analyzed gray scale and color Doppler US images from 422 women with ovarian tumors. The subjects were divided into training, validation, and test sets, and histopathologic analysis served as the reference standard. The algorithm's malignancy predictions were compared with O-RADS and expert assessments.

The imaging included 304 benign and 118 malignant tumors. When distinguishing between benign and malignant tumors, the algorithm achieved sensitivity and specificity similar to or higher than those of O-RADS and the experts. The same held for AUC, with the algorithm reaching 0.93, in line with the prediction accuracy of O-RADS and the radiologists.
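For readers less familiar with these metrics, the short sketch below shows how AUC, sensitivity, and specificity could be computed from a model's predicted malignancy probabilities against histopathology-confirmed labels. The arrays are made-up toy values for illustration only, not data from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Hypothetical predicted malignancy probabilities and histopathology labels (1 = malignant)
y_true = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.10, 0.25, 0.80, 0.05, 0.65, 0.90, 0.30, 0.15, 0.55, 0.40])

auc = roc_auc_score(y_true, y_prob)            # area under the ROC curve

y_pred = (y_prob >= 0.5).astype(int)           # binarize at an example threshold
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                   # fraction of malignant tumors correctly flagged
specificity = tn / (tn + fp)                   # fraction of benign tumors correctly cleared

print(f"AUC={auc:.2f}, sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```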

“Our results suggest that targeted DL algorithms could assist practitioners of US, particularly those with less experience, to achieve a performance comparable to experts,” the authors wrote. “Our models could also be further developed to assess lesions found within a screening population.” 

The detailed study can be viewed in Radiology.


