- #1

marco1235

I'm feeling a bit doubtful about this issue. I'm working with optical detectors and I have to characterize them in terms of quantum efficiency and similar quantities. Now suppose my detector is, ideally, a single large pixel, which I illuminate for a fixed time. I store the recorded N_{photons} and repeat the procedure 10,000 times. At each iteration, due to the randomness of the process, I can get 100 counts in the first run, 102 in the second, then 95, 87, 101, 106, ... and so on.

I want to take the average of these 10,000 values, and that's fine. But what about the uncertainty associated with this repeated measurement? I see two ways:

1) computing the standard deviation of the 10,000 values with a std-like function (e.g. MATLAB's std)

2) taking the square root of N_{avg}, as for a Poisson process
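For a true Poisson counting process the two options should roughly agree, since the variance of a Poisson distribution equals its mean. Here is a minimal sketch in Python (stdlib only, since Python has no built-in Poisson sampler a small Knuth-style helper is used; the 10,000 repetitions and mean of 100 counts mirror the numbers above):

```python
import math
import random
import statistics

random.seed(42)

def poisson_sample(lam):
    """Draw one Poisson(lam) sample via Knuth's multiplication method
    (a hypothetical helper for this sketch; fine for moderate lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Simulate 10,000 repeated exposures with a true mean of 100 counts
counts = [poisson_sample(100) for _ in range(10_000)]

n_avg = statistics.mean(counts)
std_sample = statistics.stdev(counts)   # option 1: empirical standard deviation
std_poisson = math.sqrt(n_avg)          # option 2: sqrt(N_avg), Poisson assumption

print(f"mean        = {n_avg:.2f}")
print(f"sample std  = {std_sample:.2f}")
print(f"sqrt(N_avg) = {std_poisson:.2f}")
```

Both estimates come out near 10 (= sqrt(100)); a large discrepancy between them in real data would suggest the detector counts are not purely Poissonian (e.g. excess noise or saturation).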

I'm really stuck on this.

Hope someone can help me!

Have a nice day