Automated feedback improves trainee reports, especially during after-hours
A new analysis in Clinical Radiology details the effect of automated comparison tools on radiology trainees’ reporting habits.
The use of such systems provided trainees with hundreds of additional feedback opportunities over a span of six months. The tool was especially beneficial for trainees working after-hours shifts, when quality feedback is generally delayed and harder to come by, the authors of the paper noted.
“Consultant radiologists at teaching hospitals routinely modify trainee preliminary reports prior to report finalization in clinical practice. However, timely feedback of these modifications is lacking, particularly for after hours reporting, with no efficient means for trainees to compare their preliminary report to the finalized consultant report,” corresponding author Michael John Stewart, with the Department of Radiology at Austin Health in Australia, and co-authors explained.
For the study, third-party and in-house IT specialists designed an automated report comparison tool (RCT) that uses natural language processing to give trainees feedback based on the differences between their preliminary reports and the final consultant versions. The tool highlights the changes made between initial and final reports for trainees to review. User satisfaction surveys and report data gathered before and after deployment were used to assess the tool's impact on trainees.
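The paper does not publish the tool's implementation, and its NLP pipeline is likely more sophisticated, but the core idea of surfacing the differences between a preliminary and a finalized report can be sketched with Python's standard `difflib`. The function name and sample report text below are hypothetical illustrations, not taken from the study:

```python
import difflib

def highlight_report_changes(preliminary: str, final: str) -> list[str]:
    """Return a word-level summary of edits made between a trainee's
    preliminary report and the consultant's finalized report."""
    prelim_words = preliminary.split()
    final_words = final.split()
    matcher = difflib.SequenceMatcher(a=prelim_words, b=final_words)
    changes = []
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "replace":
            changes.append(
                f"changed: '{' '.join(prelim_words[i1:i2])}' "
                f"-> '{' '.join(final_words[j1:j2])}'"
            )
        elif tag == "delete":
            changes.append(f"removed: '{' '.join(prelim_words[i1:i2])}'")
        elif tag == "insert":
            changes.append(f"added: '{' '.join(final_words[j1:j2])}'")
    return changes

# Hypothetical preliminary and finalized report excerpts
prelim = "No acute intracranial hemorrhage. Mild small vessel change."
final = "No acute intracranial hemorrhage or mass effect. Mild chronic small vessel change."
for change in highlight_report_changes(prelim, final):
    print(change)
```

In this sketch, consultant additions and rewordings are flagged explicitly, which is the kind of passive, automated feedback the study describes delivering to trainees without requiring the consultant's time.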
Pre- and post-implementation surveys indicate that the feedback was well received by both residents and their supervisors. The feedback helped trainees significantly reduce the character counts of their reports, and post-implementation data also showed a reduction in the number of preliminary reports that needed to be modified.
Following the RCT’s implementation, trainees spent more time reviewing reports, especially during after-hours shifts. Before the RCT began providing feedback, residents spent less than five minutes on average reviewing their reports on two-thirds of their shifts; this figure doubled after implementation.
“In our study we observed that, even with the efficiency improvements in reviewing cases/reports inherent in the RCT, trainees spent more time reviewing reports in the post-implementation phase; this implies that either a larger number of cases were reviewed and/or cases were reviewed in greater depth,” the authors suggested. “Such improvements to the quantity and quality of case review and feedback may allow trainees to improve future report quality and radiological understanding.”
The improvements were consistent across trainee experience levels and CT examination complexity as well, the group noted.
Although face-to-face feedback remains the gold standard, the group suggested that similar tools could provide invaluable feedback to radiology residents when clinical supervisors may be strapped for time.