Imaging patients are concerned—but optimistic—about AI
Radiology patients are confident artificial intelligence will improve healthcare workflow and efficiency, but they’re skeptical of the tech itself and remain unsure of how AI will factor into the patient experience, according to a study published online March 14 in the Journal of the American College of Radiology.
Due to a lack of existing research on the subject, first author Marieke Haan, PhD, and colleagues at the University of Groningen in the Netherlands pulled together a small-scale qualitative study of 20 imaging patients’ views on AI in radiology. The team said that, while physicians have been focused on the clinical benefits of AI—increased diagnostic certainty, quicker turnaround and reduced burnout, to name a few—patients remain “important but still neglected stakeholders” as the technology evolves.
“At present, it is still unknown how patients view the development of AI in radiology in terms of awareness of this topic, uncertainties and expectations,” Haan et al. wrote. “This knowledge is crucial to define preconditions for the development of AI systems for different clinical purposes and how they should be used in routine radiology practice.”
The authors recruited 11 men and nine women from University Medical Center Groningen’s radiology department in July and August 2018, randomly approaching patients scheduled for an outpatient CT of the chest and abdomen. An interviewer met with each participant directly after the exam, spending an average of 9.3 minutes discussing the intersection of AI and radiology before patients were compensated with a 5-euro voucher.
Patients were asked about their views on the radiology department, on AI in general and on the combination of the two. They were also queried about the ideal focus of a scan (for example, whether a computer should limit its diagnostic scope or also search for incidental findings), the evaluation of scans by a radiologist versus a computer, and receiving results from a physician versus a computer.
“A prominent result from the interviews is that patients’ views on radiology are diverse and sometimes incorrect,” Haan and colleagues wrote. “For instance, to patients it is not clear what the differences are in roles and responsibilities of different staff members in the radiology department. With respect to AI in general, patients noted either no particular associations or mentioned factors like ‘loss of jobs,’ ‘making life easier’ or ‘what need do we have for that?’”
The authors said participants were also wary of how AI would fit into current radiological practice, perhaps because they didn’t have a clear picture of the roles within an imaging department before the interview. Patients said they valued the experience of a radiologist, and while computers might help validate diagnoses and inform treatment plans, they would ideally be used to corroborate information a physician already knows.
According to Haan et al.’s research, personal interaction when receiving test results is critical to patients, allowing them to ask questions and better understand their results in a safe space with a professional they trust. That human contact, participants said, is something AI won’t be able to replicate.
“Patients express their concerns about depersonalized procedures in which patients become numbers,” the team wrote. “Also, to discuss the results of a scan in a sensitive manner, human dialogue is important.”
Participants were also concerned about accountability, since it remains unclear who should shoulder the blame for mistakes made by computers. Haan and co-authors said this underscores the need for legal guidelines and broader ethical discussion around AI; in the meantime, they said their findings could provide a framework for targeted patient education about the development and implementation of AI in radiology.