How image analysis competitions can promote faster, more collaborative AI research
The rise of AI in medical imaging has paved the way for improvements in the workflow standardization, consistency and dependability imaging providers need to achieve the best patient care. However, as with any new technology introduced into clinical workflows, there are challenges.
In a special report published Jan. 30 in the inaugural issue of Radiology: Artificial Intelligence, Luciano M. Prevedello, MD, chief of imaging informatics at The Ohio State University Wexner Medical Center in Columbus, Ohio, and colleagues recognize these challenges, but also offer potential solutions—specifically image-based competitions—which could foster collaborative AI research.
The authors noted that although AI holds exciting opportunities for medical imaging, challenges related to data complexity, data access and curation, patient privacy, transferability of algorithms to mass markets and the integration of AI into clinical workflows must first be addressed before AI can effectively augment patient care.
“Readily available, well-curated, and labeled data of high quality is paramount to performing effective research in this area,” Prevedello et al. wrote. “The radiology community, as stewards of this imaging data, needs to remain cognizant of our patients’ privacy concerns tempered by the need for large volumes of high-quality data.”
Activities that promote dialogue among radiologists and data scientists, such as image-based competitions, may address these current roadblocks, according to the researchers.
For example, over the past two years RSNA has hosted a public Machine Learning Challenge for the sole purpose of promoting collaborative research on AI in imaging. Last year’s competition focused on pneumonia detection and leveraged an existing public dataset, asking 1,399 teams to create new annotations for a subset of 112,000 chest x-rays for a $30,000 prize.
One advantage these AI image-based competitions have over standard research, according to Prevedello et al., is that they encourage multiple participants to address a specific problem independently and concurrently.
In comparison to hypothesis-driven research, which follows a “serial path,” image analysis competitions “promote a parallel process of iterative knowledge discovery and dissemination,” allowing individuals from various backgrounds to work toward solving a problem faster than the time it typically takes for new technologies to be implemented into clinical workflows, according to the researchers.
“It has long been recognized that there is a large temporal gap between the time that knowledge discoveries happen in the medical research setting relative to when they are clinically implemented; prior work suggests that this gap may be as long as 17 years in the public health sector,” the researchers wrote. “While this activity is extremely important and has been the cornerstone of the scientific advancements in many decades, it could potentially benefit from a more decentralized approach of initial knowledge discovery, such as image analysis competitions.”
Social media and sponsoring site forums that accompany such competitions allow increased visibility for an individual algorithm's performance. A public leaderboard can also help participants themselves understand their algorithm's performance and receive relevant feedback.
Nevertheless, the researchers concluded that image analysis competitions have their weaknesses and should not replace standard hypothesis-driven, peer-reviewed research. Additionally, radiologists “should still rely on standard rigorous scientific methodology to ensure safe and clinically relevant outcomes.”