Abstract
The effects of varying beam energy on the computed tomographic (CT) enhancement-to-noise (S:N) ratio were studied experimentally with the DeltaScan 2020 and GE 8800 CT scanners and a 20-cm-diameter cylindrical Plexiglas phantom containing eleven 50 ml syringes filled with varying amounts of xenon and iodine. Enhancements of 54.2, 36.7, and 31.7 Hounsfield units (H)/mg I/ml were measured with the DeltaScan 2020 at 70, 100, and 120 kVp, respectively, with corresponding root mean square deviations (RMSDs) of 12, 7, and 5 H for 400 mAs scans. For the GE 8800, enhancements of 48.3, 37.6, and 32.7 H/mg I/ml were measured at 80, 100, and 120 kVp, with RMSDs of 13, 8, and 7 H for 9.6 sec 320 mA scans (3.3 msec pulse). RMSD was independent of enhancement over the range of iodine concentrations studied (0-1.5 mg I/ml) and was only a weak function of region-of-interest (ROI) size. For repeated scans with the DeltaScan 2020, measurements in 17 × 17 pixel regions were reproducible to within 0.8 H for all techniques, and a drift in calibration of less than 6% was observed after 8 months of clinical use. For both the DeltaScan 2020 and the GE 8800, at the milliamperages studied, lower-energy techniques offered no advantage over the 120 kVp technique for xenon CT measurements of regional cerebral blood flow, which are feasible with either scanner.
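The conclusion that lower-energy techniques offered no advantage can be checked with a back-of-envelope calculation from the figures reported above, taking the enhancement-to-noise ratio per mg I/ml as enhancement divided by RMSD. This is an illustrative sketch, not the paper's analysis; the function name `snr_per_mg` and the simple ratio definition are our assumptions.

```python
# Back-of-envelope S:N comparison using the values reported in the abstract.
# S:N per mg I/ml is taken here as enhancement (H/mg I/ml) / RMSD (H);
# this simple ratio is an illustrative assumption, not the paper's method.

techniques = {
    # (scanner, kVp): (enhancement in H per mg I/ml, RMSD in H)
    ("DeltaScan 2020", 70):  (54.2, 12),
    ("DeltaScan 2020", 100): (36.7, 7),
    ("DeltaScan 2020", 120): (31.7, 5),
    ("GE 8800", 80):  (48.3, 13),
    ("GE 8800", 100): (37.6, 8),
    ("GE 8800", 120): (32.7, 7),
}

def snr_per_mg(enhancement_h: float, rmsd_h: float) -> float:
    """Enhancement-to-noise ratio per mg I/ml of contrast."""
    return enhancement_h / rmsd_h

for (scanner, kvp), (enh, rmsd) in sorted(techniques.items()):
    print(f"{scanner} @ {kvp:3d} kVp: S:N = {snr_per_mg(enh, rmsd):.2f} per mg I/ml")
```

For the DeltaScan 2020, the ratio rises from about 4.5 at 70 kVp to about 6.3 at 120 kVp: although the lower-energy beam produces greater enhancement per mg I/ml, the accompanying noise increase more than offsets it, consistent with the stated conclusion.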
Copyright © American Society of Neuroradiology