AI identifies signs of COVID-19 in lung ultrasound images

Artificial intelligence (AI) can identify COVID-19 in lung ultrasound images, potentially giving clinicians another way to diagnose the disease through imaging when other tests are unavailable. The approach could also extend to other pulmonary illnesses, according to a new study published in Communications Medicine. [1]

Development of the AI model began early in the pandemic, when test shortages were a major concern and clinicians were desperate for new ways to diagnose patients with active infections.

Trained on lung ultrasound images from patients confirmed to have the disease, the AI works much like facial recognition technology, identifying subtle features typical of the illness and finding common patterns.

“We developed this automated detection tool to help doctors in emergency settings with high caseloads of patients who need to be diagnosed quickly and accurately, such as in the earlier stages of the pandemic,” the study’s senior author Muyinatu Bell, PhD, a professor of Electrical and Computer Engineering, Biomedical Engineering and Computer Science at Johns Hopkins University, said in a statement. “Potentially, we want to have wireless devices that patients can use at home to monitor progression of COVID-19, too.”

In total, the model was trained on more than 40,000 unique images, a mix of real patient ultrasounds and computer-generated scans created by the Johns Hopkins researchers. The combination of real and “simulated” images enhanced the model’s ability to pinpoint signs of COVID-19.
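For illustration only, here is a minimal sketch (not the authors' code) of how real and simulated ultrasound images might be pooled into one training set using PyTorch. The directory names, class folders, and preprocessing steps are hypothetical.

```python
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms

# Standardize both image sources to the same size and tensor format.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Real patient scans and physics-based simulated scans, each arranged in
# class subfolders (e.g. "covid" and "non_covid"). Paths are placeholders.
real_scans = datasets.ImageFolder("data/real_ultrasound", transform=preprocess)
simulated_scans = datasets.ImageFolder("data/simulated_ultrasound", transform=preprocess)

# Pool the two sources so each training batch can mix real and simulated images.
train_set = ConcatDataset([real_scans, simulated_scans])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
```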

“We had to model the physics of ultrasound and acoustic wave propagation well enough in order to get believable simulated images,” Bell added.

Bell and the authors believe the AI tool could be loaded into wearables with ultrasound patches to diagnose a variety of illnesses, once trained on the proper set of imaging data—and it may not be limited to the lungs. For example, the AI can potentially be trained to monitor congestive heart failure at home in a more accurate way than current technology allows.

In the case of COVID-19, the AI was able to spot “B-lines” in the lungs, which appear as bright vertical abnormalities and typically indicate inflammation. The feature is commonly seen on ultrasound images of patients with the virus.

The AI is a modified version of a deep neural network (DNN), a common type of algorithm loosely modeled on the brain that allows the model to train itself by recognizing patterns in large datasets, from written speech to ultrasound scans.
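As a rough sketch of what such a network looks like in code, the example below defines a small convolutional classifier in PyTorch that labels a single ultrasound frame as showing COVID-19 features or not. The architecture and layer sizes are hypothetical placeholders, not the study's model.

```python
import torch
import torch.nn as nn

class UltrasoundClassifier(nn.Module):
    """Toy convolutional network for binary classification of ultrasound frames."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 64), nn.ReLU(),
            nn.Linear(64, 2),  # two classes: COVID-19 features present / absent
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: classify one 224x224 grayscale frame (random data stands in for a scan).
model = UltrasoundClassifier()
frame = torch.randn(1, 1, 224, 224)
prediction = model(frame).argmax(dim=1)  # 0 or 1
```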

The study may offer a roadmap for using a combination of real and computer-generated images to develop DNNs that identify other illnesses. Research is ongoing into using this type of deep-learning AI for a variety of diagnostics.

False negatives remain a concern, which is why manual review of findings is still necessary. However, the researchers were able to consistently reduce the number of false negatives by training the AI on more real and simulated scans. 
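As a simple illustration of the metric in question, the snippet below shows one way a false-negative rate could be tracked when comparing model predictions against labeled scans. The function and toy data are hypothetical, not taken from the study.

```python
import torch

def false_negative_rate(predictions: torch.Tensor, labels: torch.Tensor) -> float:
    """Fraction of positive (COVID-19) scans the model missed."""
    positives = labels == 1
    missed = (predictions == 0) & positives
    return missed.sum().item() / max(positives.sum().item(), 1)

# Toy example: two positive scans, one missed -> false-negative rate of 0.5.
labels = torch.tensor([1, 1, 0, 0])
predictions = torch.tensor([1, 0, 0, 0])
print(false_negative_rate(predictions, labels))  # 0.5
```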

The full study is available at the link below.

Chad Van Alstin, Health Imaging / Health Exec

Chad is an award-winning writer and editor with over 15 years of experience working in media. He has a decade-long professional background in healthcare, working as a writer and in public relations.
