AAPM: Image-processing algorithm offers 20x lower radiation dose for CT
A new image-processing algorithm based on Gradient Adaptive Bilateral fIltration (GABI) can help radiologists lower radiation dose in perfusion CT scanning, delivering as much as a 20-fold reduction with no loss of perfusion information or image quality compared with a full-dose acquisition, according to a study presented July 20 at the 52nd Annual Meeting of the American Association of Physicists in Medicine (AAPM) in Philadelphia.
Cynthia H. McCollough, PhD, of the Mayo Clinic in Rochester, Minn., and colleagues applied their new algorithm in vivo to extremely low-dose perfusion CT scans to demonstrate that, at the correct dose, the imaging study should pose no injury risk to the patient. "We believe in the clinical value of perfusion CT, so we're trying to lower the dose and reduce the stigma," they said.
In CT time-series studies, the GABI algorithm is applied to avoid losing spatial resolution and noise power spectrum characteristics. It extends the bilateral filter by non-locally adapting its strength according to a temporal gradient computed on the image dataset, preserving contrast fidelity over time.
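The idea of steering a bilateral filter's strength with a temporal gradient can be sketched roughly as follows. This is a toy illustration, not the published GABI implementation: the function names, the linear blend between a weak and a strong filter pass, and all parameter values are assumptions for demonstration.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=50.0, radius=3):
    """Plain bilateral filter: weights combine spatial and intensity closeness."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    padded = np.pad(img, radius, mode="reflect")
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng_w
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def gabi_like_filter(frames, sigma_r_lo=10.0, sigma_r_hi=80.0, **kw):
    """Toy GABI-style pass: the per-pixel temporal gradient steers strength.

    Where contrast changes quickly over time (large temporal gradient),
    filtering is weakened to preserve the time-attenuation curve; in static
    regions it is strengthened to suppress noise. The blend below is
    illustrative only.
    """
    frames = np.asarray(frames, dtype=float)
    grad = np.abs(np.gradient(frames, axis=0))   # temporal gradient per pixel
    g = grad / (grad.max() + 1e-12)              # normalize to [0, 1]
    out = np.empty_like(frames)
    for t, frame in enumerate(frames):
        weak = bilateral_filter(frame, sigma_r=sigma_r_lo, **kw)
        strong = bilateral_filter(frame, sigma_r=sigma_r_hi, **kw)
        out[t] = g[t] * weak + (1 - g[t]) * strong   # gradient-adaptive blend
    return out
```

On noisy but temporally static data, such a pass suppresses pixel noise while leaving the frame means (and thus a time-attenuation curve) essentially intact, which is the behavior the study describes.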
"With this algorithm, we're trying to maintain both the image quality, so that a doctor can recognize the anatomic structures, and the functional information, which is conveyed by analyzing the flow of the contrast agent over many low dose scans," said presenting study author Juan Carlos Ramirez Giraldo.
The researchers tested the GABI algorithm in two renal perfusion animal experiments that used a reference dose of 160 mAs (full dose) as well as 16 mAs and 8 mAs (1/10th and 1/20th of the full dose, respectively).
McCollough and colleagues found that the GABI algorithm preserved the temporal fidelity, as measured by the time attenuation curves (TACs), while dramatically reducing image noise from very low-dose acquisitions. In addition, the algorithm kept comparable spatial resolution and preserved CT noise texture.
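Temporal fidelity here is judged by the time-attenuation curve (TAC): the mean attenuation inside a tissue region at each time point of the series. A minimal sketch of that measurement, with a hypothetical boolean-mask interface (the study's actual ROI definition is not specified in this article):

```python
import numpy as np

def time_attenuation_curve(frames, mask):
    """Mean attenuation (e.g., in HU) inside a tissue mask at each time point.

    Comparing the curve computed before and after filtering is one simple
    way to check that temporal fidelity is preserved.
    """
    frames = np.asarray(frames, dtype=float)
    return np.array([frame[mask].mean() for frame in frames])
```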
“The background signal-to-noise ratio for the full dose pig A, full dose pig B, 1/10 dose pig A, 1/20 dose pig B, 1/10 pig A + GABI, 1/20 pig B + GABI were 4.5, 4.4, 1.4, 0.85, 3.8 and 2.9 respectively,” the authors reported.
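One common way to compute a background signal-to-noise ratio of the kind quoted above is the mean over the standard deviation within a uniform region of interest. The ROI convention below (top-left corner plus size) is an assumption for illustration; the study's exact measurement is not described in this article.

```python
import numpy as np

def background_snr(image, roi):
    """Background SNR: mean / standard deviation in a uniform ROI.

    roi = (row, col, height, width) of a rectangular patch assumed to
    contain homogeneous background tissue.
    """
    r, c, h, w = roi
    patch = np.asarray(image, dtype=float)[r:r + h, c:c + w]
    return patch.mean() / patch.std()
```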
For their study, the researchers constrained the algorithm to maintain very high fidelity to the TACs and the derived perfusion parameters; however, they noted that larger improvements in background signal-to-noise ratio are achievable.
While further investigation of parameter optimization is necessary, the authors concluded that the image-space-based, noniterative algorithm can provide a framework for reducing radiation dose in CT time-series studies. “We demonstrated in real animal data, that in high signal-to-noise ratio tasks as renal perfusion, the gain in dose reduction can be as high as 20-fold,” they wrote.