Allowing increased image noise could cut liver CT radiation dose in half
Researchers found that image noise on low-contrast liver CT scans can be increased by an amount equivalent to a 50 percent reduction in patient radiation dose without substantially affecting sensitivity, according to a study published in the October issue of Academic Radiology.
Lead researcher Kalpana M. Kanal, PhD, of the University of Washington in Seattle, and colleagues found that noise could be added incrementally to CT images up to a noise index (NI) of 21.2 without impairing radiologists’ detection of focal hypodense liver lesions.
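The link between the two figures is the standard physics of CT quantum noise: noise scales roughly with the inverse square root of dose, so halving the dose multiplies noise by √2, taking an NI of 15 to roughly 21.2. A minimal sketch of that arithmetic (this textbook relationship is our gloss, not a formula quoted from the study):

```python
import math

# CT quantum noise scales roughly with 1/sqrt(dose), so cutting dose
# in half multiplies image noise by sqrt(2).
original_ni = 15.0    # noise index of the original scans
dose_fraction = 0.5   # 50 percent of the original radiation dose

new_ni = original_ni / math.sqrt(dose_fraction)
print(f"Noise index at half dose: {new_ni:.1f}")  # -> 21.2
```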
The retrospective study's results suggest “that clinically acceptable scanning could be performed at a lower patient radiation dose,” wrote Kanal and colleagues.
Three radiologists reviewed a total of 400 images: CT liver scans with hypodense lesions and CT liver scans without any lesions, assembled into 100-image sets. Using an artificial image noise addition tool, the researchers gave each set a different NI: 15 (the original images), 17.4, 21.2, and 29.7. Readers examined four phases: precontrast, late arterial, portal venous, and delayed. Fifty axial liver images with no lesions, selected from the same pool of patients, served as controls.
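The study's noise-insertion tool is not described here, but a common way to simulate a noisier, lower-dose scan is to add zero-mean Gaussian noise sized so that the variance of the existing noise plus the variance of the added noise equals the target level. A minimal sketch under that assumption (NumPy, with a placeholder array standing in for a real axial liver image):

```python
import numpy as np

def add_noise_to_target_ni(image, original_ni, target_ni, rng=None):
    """Simulate a noisier (lower-dose) CT image.

    Assumes the added noise is Gaussian and independent of the existing
    noise, so variances add: sigma_added^2 = target^2 - original^2.
    """
    if target_ni < original_ni:
        raise ValueError("target NI must be >= original NI")
    if rng is None:
        rng = np.random.default_rng()
    sigma_added = np.sqrt(target_ni**2 - original_ni**2)
    return image + rng.normal(0.0, sigma_added, size=image.shape)

# Example: degrade an NI-15 image to the study's intermediate NI of 21.2.
ct_slice = np.zeros((512, 512))  # stand-in for real CT data
noisier = add_noise_to_target_ni(ct_slice, original_ni=15.0, target_ni=21.2)
```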
Sensitivity was between 95 percent and 98 percent for the original images (NI 15) and for images with NIs of 17.4 and 21.2. For images with an NI of 29.7, sensitivity dropped to 89 percent, according to the study.
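Sensitivity here is the fraction of lesion-containing images a reader correctly flags: true positives divided by true positives plus false negatives. As a hypothetical illustration of the 89 percent figure (the counts below are made up for the arithmetic, not taken from the study):

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Fraction of lesion-containing images correctly identified."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical counts: 89 of 100 lesion images flagged at NI 29.7.
print(f"{sensitivity(89, 11):.0%}")  # -> 89%
```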