Primary care docs lack trust in AI, making it unlikely they'll invest in applications, new survey says

Despite the opportunistic screening capabilities afforded by artificial intelligence applications, many primary care providers are hesitant to embrace the technology.  

Based on information derived from routine imaging exams, AI and machine learning-enabled opportunistic screening (OS) applications provide data physicians can use to determine patients’ risk of developing certain diseases. Given how commonly CT scans of the chest, abdomen and pelvis are ordered, they provide ample opportunities for such OS applications to be put to good use. 

There are currently several FDA-cleared AI-enabled OS applications commercially available, with more likely to come. However, concerns over added costs, potential liabilities and a general lack of trust in AI could prevent providers from utilizing such applications, according to survey data published in Clinical Imaging.

“Artificial intelligence methods such as machine learning have been used to automate anatomic segmentation and quantitative measurement, allowing for efficient scalable implementation of OS applications,” corresponding author Katherine P. Andriole, from the department of radiology at Brigham & Women's Hospital, and colleagues note. “Examples of OS applications in the literature include automated risk stratification for predicting presymptomatic major cardiovascular events or bone mineral density quantification for diagnosing osteopenia/osteoporosis—detection of such conditions allow for earlier potential preventative or disease modifying interventions.” 

Survey responses from a group of 71 PCPs were included in the analysis. Those responses revealed that although the majority (75%) of the PCPs had heard of AI and machine learning, nearly all of them (96%) had little to no familiarity with opportunistic screening applications. 

When asked whether they would consider using these applications for their patients’ CT exams, providers brought a slew of trust issues to light. 

Nearly 75% cited concerns over AI’s accuracy, while 73% highlighted potential issues related to liability (i.e., who is held accountable when AI makes a mistake?). Another 79% suggested close oversight would be required when using AI OS applications, which would inevitably increase radiologists' workloads.

Many believed that reports generated using the applications might change clinical management for patients, resulting in added tests and costs for both patients and providers. As such, just over 70% indicated that PCP offices likely would not be willing to invest. 

“For AI to be widely clinically adopted, stakeholders need to see value in use of the application,” the group cautions. “Increasing stakeholder familiarity with AI may be a critical prerequisite first step before stakeholders consider implementation.” 

The authors encourage developers to strongly consider the opinions of PCPs on these matters when planning for the future integration of opportunistic screening applications. 


In addition to her background in journalism, Hannah also has patient-facing experience in clinical settings, having spent more than 12 years working as a registered rad tech. She began covering the medical imaging industry for Innovate Healthcare in 2021.
