whatisreality
Homework Statement
If one bar of a histogram has been generated with ##n## entries from a total of ##x## measurements, i.e. the event falls into this bar's interval ##n## times out of the ##x## measurements, then what is the standard deviation of the values in this bar? Let ##k## be the range of values that could have been measured for this particular bar of the histogram, and assume that the expectation value of ##k## is ##n##.
Homework Equations
The Attempt at a Solution
I'm finding the wording tricky to understand. ##n## is the number of events in the interval, and the average value of ##k## is also ##n##? The variance is the mean of the squared distances from the mean, so if all the events were equally likely I'd divide by ##n## and have something like
##\frac{1}{n}\sum_i (k_i - n)^2##
But I'm not sure that's what it means when it says the event occurs randomly a total of ##n## times. I'd really appreciate any hints on how to tackle this, thanks for any help!
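To get a feel for the formula, here's a rough numerical sketch (my own construction, not given in the problem): I simulate the count ##k## landing in one bar over many repeated experiments, treating each measurement as landing in the bar with probability ##n/x##, and then apply the average-squared-deviation formula. Note that the divisor in the sketch is the number of simulated repetitions, which may not be the same thing as the ##1/n## in the formula above.

```python
import random

random.seed(0)

n = 10          # expected number of entries in the bar
x = 200         # total measurements per experiment (hypothetical choice)
trials = 20000  # number of simulated repeated experiments
p = n / x       # probability any one measurement lands in this bar (assumption)

# k_i = count landing in the bar for experiment i
ks = [sum(1 for _ in range(x) if random.random() < p) for _ in range(trials)]

# Variance as the mean squared deviation from the expected count n
var = sum((k - n) ** 2 for k in ks) / trials
print(var)  # should come out near n (strictly n*(1-p) for this binomial model)
```

Under this binomial model the variance is ##n(1-p)##, which is close to ##n## when ##p## is small, so the printed value hovers around 9.5 here.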