Hi,
I am having a hard time understanding the benefits of using a semiconductor to construct, e.g., a photodiode that produces a current proportional to the energy deposited by radiation.
Textbooks say that semiconductors are preferred as detectors because of their high density (larger chance of interaction with the detector), because they can be made small, and so on.
Then they explain the large leakage current produced by thermally generated charge carriers, a consequence of the relatively low band gap (these contribute a current on the order of 0.1 A, while the radiation-induced current is more like 0.00001 A). So they go on to explain that this has to be dealt with: you "fuse" a P-type and an N-type crystal so that a depletion region forms near the junction, where the thermally generated charges are depleted. But let's stop right there. All of this effort just to get rid of the thermally generated charges. Why use a semiconductor in the first place? Why not an insulator?
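To put rough numbers on the band-gap argument (just my own back-of-envelope sketch, assuming textbook values of about 1.12 eV for silicon and roughly 9 eV for fused-silica glass at room temperature), the thermally generated carrier density scales like the Boltzmann factor exp(-Eg / 2kT):

```python
import math

K_B = 8.617e-5   # Boltzmann constant in eV/K
T = 300.0        # room temperature in K

def thermal_factor(band_gap_ev):
    # Intrinsic carrier density scales roughly as exp(-Eg / 2kT);
    # prefactors are ignored since only the ratio matters here.
    return math.exp(-band_gap_ev / (2 * K_B * T))

band_gaps = {
    "Si (semiconductor)": 1.12,    # eV, standard textbook value
    "SiO2 glass (insulator)": 9.0  # eV, approximate value for fused silica
}

for name, eg in band_gaps.items():
    print(f"{name:24s} exp(-Eg/2kT) = {thermal_factor(eg):.2e}")
```

The silicon factor comes out dozens of orders of magnitude larger than the glass one, which is where that 0.1 A-scale leakage current comes from, whereas for the glass it is effectively zero.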
If I took a piece of glass and applied a high voltage (e.g., 4 kV), then when radiation knocks out an electron, the strong electric field would collect the electron at the electrode before it recombines, and I would measure a current. This current would be proportional to the energy deposited in the glass. In addition, I would have almost no (thermally generated) leakage current. Why is this a bad idea? Am I missing something? Is it ONLY because insulators have a low atomic number and are therefore poor radiation detectors (you need a high atomic number for a high interaction probability)?
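As a sanity check on "current proportional to the energy deposited" (again just a hedged sketch of my own, taking the usual ~3.6 eV per electron-hole pair for silicon and, purely as a guess, something like 25 eV per pair for a wide-gap material such as glass), the charge liberated by a 1 MeV energy deposit would be roughly:

```python
E_CHARGE = 1.602e-19   # elementary charge in coulombs
E_DEPOSIT_EV = 1.0e6   # assume 1 MeV deposited by the incoming radiation

# Mean energy to create one electron-hole pair (W-value).
# 3.6 eV is the standard figure for silicon; 25 eV for glass is only my assumption.
w_values = {
    "Si":    3.6,
    "glass": 25.0,
}

for name, w in w_values.items():
    n_pairs = E_DEPOSIT_EV / w
    charge = n_pairs * E_CHARGE
    print(f"{name:6s} ~{n_pairs:.2e} pairs -> ~{charge:.2e} C collected")
```

Either way the collected charge is tiny (tens of femtocoulombs at most), which is why the leakage current matters so much in the first place.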
Thank you for your time!