Inverse-Square Law of Radiation

In summary: the spread in the counts per second is consistent with a Poisson distribution, whose variance equals its mean, so the variability between data points reflects the statistical nature of radioactive decay rather than an error in the equipment.
  • #1
ayans2495
Homework Statement
I am conducting an experiment in which I investigate the relationship between the counts per second detected by a Geiger counter and the distance between said Geiger counter and the source of radiation. Attached is a graph of the counts per second over time. My question is: why so much variation between the data points? Also, what mathematical formulae can we use to define this relationship, and how are they relevant to the topic?
Relevant Equations
The average counts per second is inversely proportional to the square of the distance between the source and detector.
[Attachment: Geiger counter graph of counts per second over time]
 
  • #2
So far, this is what I have.
Let A be the average counts per second and d be the distance between the source and detector.

##A \propto \frac{1}{d^2} \;\therefore\; A = k \cdot \frac{1}{d^2}##, where ##k## is the constant of proportionality.

I believe I can determine ##k## by plotting ##A## against ##1/d^2##; the gradient of that line is ##k##.
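For example, here is a minimal sketch in Python of the fit I have in mind (the distances and count rates below are made-up illustrative numbers, not my data):

```python
import numpy as np

# Made-up illustrative data: distance d (metres) and average counts per second A.
d = np.array([0.02, 0.03, 0.04, 0.05, 0.06])
A = np.array([920.0, 410.0, 230.0, 148.0, 102.0])

x = 1.0 / d**2                       # plot A against 1/d^2
k, intercept = np.polyfit(x, A, 1)   # straight-line fit: the slope is k

print(f"k ≈ {k:.4g} counts·m²/s, intercept ≈ {intercept:.3g} counts/s")
```

A clearly nonzero intercept would hint at a constant background count rate on top of the inverse-square term.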

Is there anything else I can do mathematically? Any suggestions would be greatly appreciated.
 
  • #3
ayans2495 said:
why so much variation between the data points?
For a Poisson process, the mean and variance have the same value. Breaking it into one-second intervals gives you a mean count of around 230, so the standard deviation will be the square root of that, about 15. Nearly all the readings (about 99.7%) should lie within plus or minus three standard deviations, i.e. plus or minus 45, and about 68% within plus or minus one standard deviation of the mean. That seems to fit your data.
This gives you an error bar when plotting rate against distance.

Suppose you measure over one minute for each of several distances. Because the means will be different, the error bars will be different. If you then use standard regression analysis (plotting rate against inverse square of distance, say) the error bars will probably still be different, so you should use a weighting in the regression. That expresses the greater confidence in the shorter error bars.
https://en.m.wikipedia.org/wiki/Weighted_least_squares
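To make that concrete, here is a minimal sketch in Python, with invented total counts at each distance collected over ##T = 60## s runs; the Poisson error bar on each rate is ##\sqrt N/T##, and np.polyfit takes weights of ##1/\sigma##:

```python
import numpy as np

# Invented example: total counts N collected over T seconds at each distance d (metres).
d = np.array([0.02, 0.03, 0.04, 0.05, 0.06])
T = 60.0
N = np.array([55200.0, 24600.0, 13800.0, 8900.0, 6100.0])

rate = N / T                 # counts per second
sigma = np.sqrt(N) / T       # Poisson: Var(N) = N, so the sd of the rate is sqrt(N)/T

# Weighted straight-line fit of rate against 1/d^2; np.polyfit expects weights of 1/sigma.
x = 1.0 / d**2
(coeff_k, coeff_c), cov = np.polyfit(x, rate, 1, w=1.0 / sigma, cov=True)

print(f"slope k ≈ {coeff_k:.4g} ± {np.sqrt(cov[0, 0]):.2g}")
print(f"intercept ≈ {coeff_c:.3g} ± {np.sqrt(cov[1, 1]):.2g}")
```

The reported slope uncertainty is the "error bound" on the constant of proportionality discussed later in the thread.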
 
  • #4
haruspex said:
For a Poisson process, the mean and variance have the same value. Breaking it into one-second intervals gives you a mean count of around 230, so the standard deviation will be the square root of that, about 15. Nearly all the readings (about 99.7%) should lie within plus or minus three standard deviations, i.e. plus or minus 45, and about 68% within plus or minus one standard deviation of the mean. That seems to fit your data.
This gives you an error bar when plotting rate against distance.

Suppose you measure over one minute for each of several distances. Because the means will be different, the error bars will be different. If you then use standard regression analysis (plotting rate against inverse square of distance, say) the error bars will probably still be different, so you should use a weighting in the regression. That expresses the greater confidence in the shorter error bars.
https://en.m.wikipedia.org/wiki/Weighted_least_squares
Would this variation be due to systematic error in the equipment?
 
  • #5
ayans2495 said:
Would this variation be due to systematic error in the equipment?
No, it’s inherent in the Poisson nature of the distribution.
 
  • #6
Unfortunately I haven't learned much about the Poisson point process; I am only a year 10 student in a year 11 physics class. Do you believe there is a simpler way of describing it? I understand if there isn't.
 
  • #7
ayans2495 said:
Unfortunately I haven't learned much about the Poisson point process; I am only a year 10 student in a year 11 physics class. Do you believe there is a simpler way of describing it? I understand if there isn't.
Then maybe I overcomplicated it. What you proposed in post #2 is fine, but do you need to be able to state the error bounds on the calculated slope?
 
  • #8
haruspex said:
Then maybe I overcomplicated it. What you proposed in post #2 is fine, but do you need to be able to state the error bounds on the calculated slope?
When you say "error bounds", are you referring to the standard error of the regression slope? Or to the error bars on each individual data point, obtained from the standard error of the mean?
 
  • #9
ayans2495 said:
When you say "error bounds", are you referring to the standard error of the regression slope?
Yes.
 
  • #10
Of course, strictly speaking, it is not quite an inverse square law. It doesn’t tend to infinity as the distance tends to zero.
It will be more like ##1-\frac{d}{\sqrt{r^2+d^2}}##, where ##r## is the radius of the sensor and ##d## is the distance. This behaves like ##\frac{r^2}{2d^2}## for ##r \ll d##.
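A quick numerical check of that claim (the sensor radius and distances below are arbitrary):

```python
import numpy as np

r = 0.01                                  # sensor radius in metres (arbitrary)
d = np.array([0.02, 0.05, 0.10, 0.20])    # source-detector distances in metres

exact = 1.0 - d / np.sqrt(r**2 + d**2)    # expression from the post
approx = r**2 / (2.0 * d**2)              # leading-order behaviour for r << d

for di, e, a in zip(d, exact, approx):
    print(f"d = {di:.2f} m: exact = {e:.6f}, r^2/(2 d^2) = {a:.6f}")
```

Already at d = 5r the two agree to within a few per cent, so treating the data as inverse-square is reasonable once the source is well clear of the detector.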
 
  • #11
ayans2495 said:
Homework Statement:: I am conducting an experiment in which I investigate the relationship between the counts per second detected by a Geiger counter and the distance between said Geiger counter and the source of radiation. Attached is a graph of the counts per second over time. My question is: why so much variation between the data points? Also, what mathematical formulae can we use to define this relationship, and how are they relevant to the topic?
Relevant Equations:: The average counts per second is inversely proportional to the square of the distance between the source and detector.

[Attachment: Geiger counter graph of counts per second over time]
The expected counts should be distributed randomly according to a Poisson distribution. It is difficult to read the data from your graph since the data points are not clearly marked. It looks to me like your per-second counts are uniformly distributed in the range from c=190/sec to c=280/sec. That does not match what I would expect from such an experiment.

On the other hand, for a Poisson distribution with mean 235 counts per second, the tails of the ideal distribution would match this data pretty well, with only 1% of the samples having less than 190 counts and only 1% having more than 280 counts.

Try an online Poisson distribution calculator with parameters set to 235, 190, 1, 90. [That puts the mean counts per second at 235 and calculates the probability density for counts starting at c=190 and going up by 1s to c=279.]

A histogram with the data sorted into buckets (how many intervals with counts between 190 and 199, how many from 200-209, how many from 210-219, etc.) would let you see the distribution of your data more easily.
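A sketch of that histogram check in Python, using scipy's Poisson distribution; the simulated counts below are only a stand-in for the real one-second readings:

```python
import numpy as np
from scipy import stats

# Stand-in for the ~90 one-second readings; replace with the actual data.
rng = np.random.default_rng(0)
counts = rng.poisson(235, size=90)

mean = counts.mean()
bins = np.arange(190, 290, 10)                 # buckets 190-199, 200-209, ...
observed, _ = np.histogram(counts, bins=bins)

# Expected number of one-second intervals per bucket for a Poisson(mean) distribution.
expected = len(counts) * (stats.poisson.cdf(bins[1:] - 1, mean)
                          - stats.poisson.cdf(bins[:-1] - 1, mean))

for lo, hi, o, e in zip(bins[:-1], bins[1:] - 1, observed, expected):
    print(f"{lo}-{hi}: observed {o:2d}, expected {e:5.1f}")
```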
 
  • #12
You do not say how you investigated the dependence of the counts on distance. Did you vary the distance and see differences in the count? Suppose you increase the distance and the average count rate goes down; doesn't this tell you anything? It looks like you are not changing the distance at all and are just seeing the variation in count rate at a fixed distance.
 
  • #13
mpresic3 said:
You do not say how you investigated the dependence of the counts on distance. Did you vary the distance and see differences in the count? Suppose you increase the distance and the average count rate goes down; doesn't this tell you anything? It looks like you are not changing the distance at all and are just seeing the variation in count rate at a fixed distance.
Yes, that is my reading of post #1. The graph is for a particular distance, and the question asked was whether it should be expected to be so erratic. Answer: yes, but really all you need for a given distance is the average rate over the whole run. Given that it is Poisson, the variance of that average can be inferred.

For different distances, the OP goes on to ask "what mathematical formulae can we use to define this relationship". Clearly it is expected to be roughly inverse square (but see post #10). It is not clear whether the OP is required to analyse how well the data (over all distances) fits such a law.
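For reference, the uncertainty on that per-run average is simple to write down. If a run of duration ##T## seconds records ##N## counts in total, then, since the variance of a Poisson count equals its mean,

$$\hat A = \frac{N}{T}, \qquad \operatorname{Var}(\hat A) = \frac{\operatorname{Var}(N)}{T^2} \approx \frac{N}{T^2}, \qquad \sigma_{\hat A} = \frac{\sqrt{N}}{T}.$$

These are the error bars (and hence the weights) to carry into the regression against ##1/d^2## mentioned in post #3.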
 
  • #14
jbriggs444 said:
The expected counts should be distributed randomly according to a Poisson distribution. It is difficult to read the data from your graph since the data points are not clearly marked. It looks to me like your per-second counts are uniformly distributed in the range from c=190/sec to c=280/sec. That does not match what I would expect from such an experiment.

On the other hand, for a Poisson distribution with mean 235 counts per second, the tails of the ideal distribution would match this data pretty well, with only 1% of the samples having less than 190 counts and only 1% having more than 280 counts.

Try an online Poisson distribution calculator with parameters set to 235, 190, 1, 90. [That puts the mean counts per second at 235 and calculates the probability density for counts starting at c=190 and going up by 1s to c=279.]

A histogram with the data sorted into buckets (how many intervals with counts between 190 and 199, how many from 200-209, how many from 210-219, etc.) would let you see the distribution of your data more easily.
Forgive me, as I am not awfully educated on the subject matter, but how does the Poisson distribution explain why the counts come out distributed arbitrarily? Also, is there another way to explain this variation, as it is an experiment at the year 11 level?
 
  • #15
ayans2495 said:
how does the Poisson distribution explain why the counts come out distributed arbitrarily?
Not arbitrarily; distributed according to a Poisson distribution. Random does not mean arbitrary; it means not deterministic, i.e. you cannot completely predict it.
ayans2495 said:
explain this variation
Do you mean the variation from second to second?
Your graph shows around 230 events per second on average.
The probability of ##k## events in a given second is therefore given by the expression at https://en.m.wikipedia.org/wiki/Poisson_distribution#Probability_mass_function, with ##\lambda=230##.
You could try evaluating it for k from 240 to 250, say, and summing to find the probability of getting that range in a given second, then compare with your data. Or just plot a histogram of your data and observe the peak being about equal to the mean.
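That sum of PMF terms is the same as a difference of CDFs; a short sketch with scipy, assuming ##\lambda = 230## as read off the graph:

```python
from scipy import stats

lam = 230  # average events per second, read off the graph

# Probability of observing between 240 and 250 counts (inclusive) in a given second.
p = stats.poisson.cdf(250, lam) - stats.poisson.cdf(239, lam)
print(f"P(240 <= k <= 250) ≈ {p:.3f}")
```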
 
  • #16
haruspex said:
Not arbitrarily; distributed according to a Poisson distribution. Random does not mean arbitrary; it means not deterministic, i.e. you cannot completely predict it.

Do you mean the variation from second to second?
Your graph shows around 230 events per second on average.
The probability of ##k## events in a given second is therefore given by the expression at https://en.m.wikipedia.org/wiki/Poisson_distribution#Probability_mass_function, with ##\lambda=230##.
You could try evaluating it for k from 240 to 250, say, and summing to find the probability of getting that range in a given second, then compare with your data. Or just plot a histogram of your data and observe the peak being about equal to the mean.
How would you explain this if you were writing a log book for this experiment and recording observations about this data qualitatively, given that your teachers have not yet taught you the probability mass function of the Poisson point process? Again, I am certain that what you are saying has merit and may be the only logical response to such a question, though I still fail to understand as I have not learned this content yet and have little time to submit my findings.
 
  • #17
ayans2495 said:
How would you explain this if you were writing a log book for this experiment and recording observations about this data qualitatively, given that your teachers have not yet taught you the probability mass function of the Poisson point process? Again, I am certain that what you are saying has merit and may be the only logical response to such a question, though I still fail to understand as I have not learned this content yet and have little time to submit my findings.
Why do you think it needs explaining as part of your submission? I doubt I could provide a better way than simply quoting a standard fact about Poisson distributions (variance equals mean) and demonstrating that the data roughly match it.

You still have not explained exactly what you are asked to do. (See posts #7 and #13.)
If the main aim is to study the relationship of rate to distance, as seemed to be the case earlier, quit worrying about the per second counts and just take the average rate for each run.
 
  • #18
Have you tried measuring the background radiation? For example, how many counts do you get with no source present? How much variation is there when there is no source except the background?
Poisson processes are quite involved. I expect a grad student in the sciences might be ready to tackle them. One good treatment I like is in Sheldon Ross's book Introduction to Probability Models. I draw your attention to page 273 in the sixth edition, which treats electronic counters.

Here is a hand-waving argument; don't worry if it seems vague or inscrutable.

You know a radioactive sample decays according to an exponential decay law. You can expect each atom in the sample to decay at a different time. Like people, we do not all die after, say, 70 years; some live to 50 and some to 90. The time at which an individual atom decays follows an exponential distribution. You can infer this from the sample: after one "mean lifetime" only 1/e of the sample is left.

Now if you follow the argument in Sheldon Ross, Papoulis, or other good probability books, you find that, given a population in which each member's decay time follows an exponential distribution, the number of decays you count in a fixed time interval follows a Poisson distribution. Again, this is hand-waving.
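If you want to see that connection without the formal machinery, a small simulation illustrates it (the rate and duration below are chosen to match the numbers in this thread):

```python
import numpy as np

rng = np.random.default_rng(1)
rate = 230.0      # decays detected per second, roughly matching the thread
t_total = 90.0    # simulate 90 seconds of counting

# Exponential waiting times between events -> cumulative event times.
waits = rng.exponential(1.0 / rate, size=int(rate * t_total * 1.5))
times = np.cumsum(waits)
times = times[times < t_total]

# Count events in each one-second bin; for a Poisson process mean ≈ variance.
counts, _ = np.histogram(times, bins=np.arange(0.0, t_total + 1.0))
print(f"mean = {counts.mean():.1f}, variance = {counts.var():.1f}")
```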

Your teacher will most likely be happy if you cite a probability textbook treating the process as a reference. That is, cite the textbook or source, the same way you would as a professional researcher.
 

1. What is the Inverse-Square Law of Radiation?

The Inverse-Square Law of Radiation is a physical principle that states that the intensity of radiation is inversely proportional to the square of the distance from the source. This means that as the distance from the source increases, the intensity of radiation decreases.
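In symbols, for an idealised isotropic point source emitting a total power (or activity) ##S##, and ignoring absorption along the way, the intensity at distance ##d## is

$$I(d) = \frac{S}{4\pi d^2},$$

so doubling the distance cuts the intensity to one quarter of its original value.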

2. How does the Inverse-Square Law of Radiation apply to light?

The Inverse-Square Law of Radiation applies to all types of radiation, including light. This means that as you move further away from a light source, the brightness of the light decreases. For example, a light bulb will appear much brighter up close than it will from a distance.

3. What are the implications of the Inverse-Square Law of Radiation for radiation safety?

The Inverse-Square Law of Radiation is an important concept in radiation safety. It means that the further away you are from a source of radiation, the less exposure you will receive. This is why it is important for scientists and workers who handle radioactive materials to use proper protective equipment and maintain a safe distance from the source.

4. Does the Inverse-Square Law of Radiation apply to all types of radiation?

Yes, the Inverse-Square Law applies to any radiation emitted isotropically from an effectively point-like source, including gamma rays, X-rays, visible light, and even sound. The emitted energy spreads over a spherical surface whose area grows as the square of the distance, so (neglecting absorption and scattering along the way) the intensity falls as the inverse square of the distance.

5. How is the Inverse-Square Law of Radiation used in medical imaging?

The Inverse-Square Law of Radiation is used in medical imaging, such as x-rays and CT scans, to determine the appropriate distance between the patient and the radiation source. This helps to ensure that the patient receives the appropriate amount of radiation for the desired image quality, while also minimizing their exposure to radiation.
