# Relationship between radiation flux and count rate of a scintillator

## Main Question or Discussion Point

If the radiation flux is calculated as:

$F = \frac{L}{2\pi r^{2}}$, where $L$ is the luminosity of the source and $r$ is the distance from the source,

and the count rate of a scintillator is

$$\frac{\text{number of scintillations}}{\text{time}},$$

what is the relationship between them?

There obviously should be one, as the count rate depends on the distance from the source and the strength of the source, as does the radiation flux.

Is it simply a linear relationship? I'm guessing it varies between scintillators, but do they all have a linear, or at least approximately linear, relationship?

Thanks!

**fzero** (Homework Helper, Gold Member) replied:
The number of particles captured in the scintillator will depend on the material and the size of the detector. Together, these and other considerations are encapsulated in the "efficiency" of the detector. The efficiency is usually assumed to be linear over an appropriate range of particle energies, but there are certainly nonlinearities. For example, this technical note (http://www.detectors.saint-gobain.com/uploadedFiles/SGdetectors/Documents/Technical_Information_Notes/Efficiency-Calculations.pdf) has graphs for various detectors using different materials. It's clear that there is a decent range where the efficiency is approximately linear, but significant nonlinearities emerge outside these ranges.
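Putting this into a formula: to first approximation, the count rate is the particle flux at the detector times the detector's frontal area times its efficiency, so it is linear in the flux (and hence in $L/r^2$) as long as the efficiency is constant. A minimal sketch, using the flux formula from the question; the function name, and the conversion from power flux to particle flux via a single particle energy, are my own simplifications:

```python
import math

def expected_count_rate(luminosity_w, distance_m, detector_area_m2,
                        particle_energy_j, efficiency):
    """Rough count-rate estimate: count rate = efficiency * flux * area.

    Uses F = L / (2*pi*r^2) as in the question; dividing the power flux
    by the energy of a single particle converts it to a particle flux.
    """
    flux_w_per_m2 = luminosity_w / (2 * math.pi * distance_m**2)
    particle_flux = flux_w_per_m2 / particle_energy_j  # particles / (m^2 s)
    return efficiency * particle_flux * detector_area_m2  # counts / s
```

With a fixed efficiency this model gives the expected scalings: doubling the luminosity doubles the count rate, and doubling the distance cuts it by a factor of four.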


I'm finding it very hard to follow, though. Could you please explain a bit more how I would go about finding the expected count rate for a specific detector (e.g. CaF2) when exposed to a certain radiation flux?

For example, if the source has a luminosity of 1 W and is 1 meter from the scintillator, what would be the approximate count rate with only air between the source and a 1-inch CaF2 detector?
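A back-of-the-envelope sketch for numbers like these, with the assumptions stated loudly: the question doesn't specify the photon energy or the detector's efficiency, so the 1 MeV energy and 50% efficiency below are placeholders, not CaF2 data, and air attenuation over 1 m is neglected:

```python
import math

# Placeholder inputs -- photon energy and efficiency are assumed,
# not taken from CaF2 specifications.
L = 1.0                            # source luminosity, W
r = 1.0                            # source-detector distance, m
photon_energy = 1.0e6 * 1.602e-19  # assume 1 MeV photons, in joules
diameter = 0.0254                  # 1-inch detector face, m
efficiency = 0.5                   # placeholder intrinsic efficiency

flux = L / (2 * math.pi * r**2)            # W / m^2, as in the question
area = math.pi * (diameter / 2) ** 2       # detector face area, m^2
photon_rate = flux * area / photon_energy  # photons reaching the face / s
count_rate = efficiency * photon_rate      # expected counts / s
print(f"{count_rate:.3g} counts/s")
```

For a real estimate you would replace the placeholder efficiency with the energy-dependent value from curves like those in the Saint-Gobain note linked above.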