The x-ray sensitivity of a high-resistivity photoconductor sandwiched
between two parallel plate electrodes and operating under a constant field is
analysed by considering charge carrier generation that follows the x-ray photon
absorption profile and taking into account both electron and hole trapping
phenomena but neglecting recombination, bulk space charge and diffusion effects.
The charge collected in the external circuit due to the distributed generation
of electrons and holes throughout the detector is calculated by
integrating the Hecht collection efficiency with Ramo's theorem across the
sample thickness. The results of the model allow the x-ray sensitivity to be
calculated as a function of the applied field, detector thickness and electron
and hole ranges (µτ), given the field and energy dependence of the
electron-hole pair creation energy, W±, and the energy spectrum of
incident x-ray radiation. The sensitivity model was applied to stabilized a-Se,
which is currently used successfully as an x-ray photoconductor in recently
developed flat-panel x-ray image detectors. Recent data on the free
electron-hole pair creation energy versus electric field at room temperature,
together with appropriate electron and hole drift mobilities, were used to
calculate the sensitivity for monoenergetic x-rays at 20 and 60 keV. For 20 keV
radiation, it was shown that a detector of typical thickness 200 µm (4 × the
attenuation depth at 20 keV), with currently attainable electron and hole
trapping parameters in a-Se, operates optimally, and its sensitivity can only be
increased by further increasing the applied field. With the receiving electrode positively biased,
the sensitivity was much more dependent on
the hole lifetime than on the electron lifetime. The absence of hole transport results
in a reduction in sensitivity by a factor of about 4.4, whereas the absence of
electron transport results in a sensitivity degradation of only 22%. The ratio
of hole-trapping-limited sensitivity to electron-trapping-limited sensitivity is
about 0.3. For a detector of thickness 200 µm operating at 10 V µm⁻¹,
the maximum sensitivity is about 220 pC cm⁻² mR⁻¹, and this
sensitivity degrades by more than 10% when either the electron lifetime falls
below ~20 µs or the hole lifetime falls below ~5 µs. When the
hole lifetime is so short that the sensitivity is substantially reduced,
the sensitivity versus thickness dependence at a given field exhibits a maximum
at an optimal thickness that is less than the thickness required for full
absorption. In the case
of 60 keV x-ray photons, it is more useful to examine the sensitivity as a
function of detector thickness given the practical bias voltage limit. The
sensitivity versus thickness behaviour for a given bias voltage exhibits a
maximum at an optimal thickness that is less than the thickness required for nearly full
absorption. Once the electron lifetime exceeds ~200 µs and the hole lifetime
exceeds ~10 µs, further increases do not significantly affect the sensitivity.
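As a rough illustration of the calculation outlined above (the Hecht collection efficiency combined with Ramo's theorem and integrated over the exponential photon-absorption profile), the following Python sketch computes a relative sensitivity for a biased photoconductor slab. The function names and the parameter values (attenuation depth, W±, µτ products) are illustrative assumptions for a-Se at 20 keV rather than values taken from the model above, and the constant exposure-to-photon-fluence conversion is omitted, so only relative sensitivities are meaningful.

```python
import numpy as np

# Minimal numerical sketch of the sensitivity integral described in the
# abstract.  All quantities are in SI units; the parameter values in the
# example below are assumed, illustrative numbers for a-Se at 20 keV and
# are not taken from the paper.

def collection_efficiency(L, F, mu_tau_h, mu_tau_e, alpha, n=4000):
    """Depth-averaged charge collection efficiency: the Hecht equation for
    each carrier type, weighted according to Ramo's theorem and averaged
    over an exponential (Beer-Lambert) photon-absorption profile.

    The radiation-receiving electrode at x = 0 is positively biased, so a
    hole generated at depth x drifts the distance L - x and the electron
    drifts the distance x."""
    x = np.linspace(0.0, L, n)
    s_h = mu_tau_h * F                     # hole schubweg (mean drift length)
    s_e = mu_tau_e * F                     # electron schubweg
    eta = (s_h / L) * (1.0 - np.exp(-(L - x) / s_h)) \
        + (s_e / L) * (1.0 - np.exp(-x / s_e))
    g = alpha * np.exp(-alpha * x) / (1.0 - np.exp(-alpha * L))  # normalized generation profile
    f = eta * g
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))           # trapezoidal integration

def relative_sensitivity(L, F, mu_tau_h, mu_tau_e, alpha, W_pair_eV):
    """Sensitivity up to a constant exposure-to-photon-fluence factor:
    (attenuated fraction) x (collection efficiency) / W_pair."""
    attenuated = 1.0 - np.exp(-alpha * L)
    return attenuated * collection_efficiency(L, F, mu_tau_h, mu_tau_e, alpha) / W_pair_eV

if __name__ == "__main__":
    alpha = 1.0 / 50e-6          # reciprocal attenuation depth (~50 um assumed at 20 keV)
    F = 10e6                     # applied field, 10 V/um
    W_pair_eV = 45.0             # assumed pair-creation energy at this field (eV)
    mu_tau_h = 0.12e-4 * 20e-6   # assumed hole mobility x lifetime (m^2/V)
    mu_tau_e = 0.003e-4 * 200e-6 # assumed electron mobility x lifetime (m^2/V)
    for L_um in (100, 200, 400, 800):
        S = relative_sensitivity(L_um * 1e-6, F, mu_tau_h, mu_tau_e, alpha, W_pair_eV)
        print(f"L = {L_um:4d} um  ->  relative sensitivity = {S:.4f}")
```

To explore the optimal-thickness behaviour described for the 60 keV case, the same functions can be called with the field set to F = V/L at a fixed bias voltage V instead of at a fixed field.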