While working on a problem, I considered $N$ raindrops (a large number, in the range 100,000-1,000,000) falling onto a roof divided into $A = 100$ segments, with each drop assigned to a segment by a random number generator I programmed. Looking at the number of raindrops that fell into a given segment, the average is $\mu = \frac{N}{A}$. For each fixed $N$, I calculated the standard deviation $\sigma$ of the counts across segments. Plotting the standard deviation against the average, I found a nearly perfect relationship $\sigma = \sqrt{\mu}$. Is this relationship correct? Can anyone tell me why it holds, or give me a starting point to show it?
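For concreteness, here is a minimal sketch of the kind of simulation I mean (this is an illustration, not my original code; it assumes each drop lands in a uniformly random segment, and uses NumPy's generator in place of the one I wrote):

```python
import numpy as np

A = 100                       # number of roof segments (fixed)
rng = np.random.default_rng(0)

for N in (100_000, 250_000, 500_000, 1_000_000):
    # Drop each raindrop into a uniformly random segment, then count hits per segment.
    segments = rng.integers(0, A, size=N)
    counts = np.bincount(segments, minlength=A)

    mu = N / A                # expected drops per segment
    sigma = counts.std()      # observed spread of counts across segments

    print(f"N={N:>9,}  mu={mu:10.1f}  sigma={sigma:8.2f}  sqrt(mu)={np.sqrt(mu):8.2f}")
```

Running something like this, the printed $\sigma$ tracks $\sqrt{\mu}$ very closely for every $N$ I tried, which is the relationship I am asking about.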