Guest blog by Kalpana M. Kanal, PhD, Director of the Diagnostic Physics Section and Associate Professor in the Department of Radiology at the University of Washington
How low can we go in radiation dose without affecting diagnostic confidence for detection of low-contrast liver lesions?
In a recently published article, we studied the impact of incremental increases in CT image noise on the detection of low-contrast hypodense liver lesions. Clinical liver CT exams were acquired on a 64-slice CT scanner using automatic tube current modulation at a routine clinical noise index (NI) of 15. An artificial noise addition tool was then used to increase the noise in these images to simulate scanning at 75% (NI 17.4), 50% (NI 21.2), and 25% (NI 29.7) of the original patient radiation dose (NI 15.0; 100% dose). Radiologists of varying experience reviewed all of the images, original and simulated, and subjectively scored lesion detectability.
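The noise index levels above track the standard quantum-noise relationship, in which image noise scales roughly with the inverse square root of dose, so relative dose scales with (NI_ref / NI)^2. The short Python sketch below illustrates that arithmetic only; it is not the noise-simulation tool used in the study, and the helper function name is ours.

```python
# Illustrative sketch (not the study's noise-addition tool): relative dose
# implied by a noise index change, assuming quantum-noise scaling where
# image noise is proportional to 1 / sqrt(dose), i.e. dose ~ (NI_ref / NI)^2.

def relative_dose(ni: float, ni_ref: float = 15.0) -> float:
    """Approximate dose relative to the reference noise index (NI 15.0 = 100% dose)."""
    return (ni_ref / ni) ** 2

for ni in (15.0, 17.4, 21.2, 29.7):
    print(f"NI {ni:>4}: ~{relative_dose(ni):.0%} of reference dose")
```

Running this reproduces the approximate dose levels quoted in the study: NI 17.4, 21.2, and 29.7 correspond to roughly 75%, 50%, and 25% of the reference dose, respectively.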
We concluded that there is little loss of sensitivity for low-contrast liver lesion detection when CT exams are scanned at an NI of up to 21.2 rather than 15, a patient radiation dose reduction of 50%. When reader performance was evaluated by lesion size and contrast, no significant degradation was observed for lesions larger than 10 mm with contrast greater than 60 HU, where sensitivity remained at 90%. For lesions smaller than 10 mm or with contrast below 60 HU, sensitivity dropped to 85%.
The study had some limitations, the most important being that it relied on simulated noise rather than actual low-dose acquisitions; a true comparison of lower- and higher-dose scanning would have required scanning patients multiple times. Nevertheless, the study was important because it demonstrated that radiation dose could be reduced by 50% without affecting diagnostic confidence for detecting low-contrast liver lesions.