
- TL;DR Summary
- Some say the probability of rain applies to any given point in the forecast area, while others say it applies to the percentage of the forecast area that will see rain. Are these two definitions actually the same from a mathematical perspective?

I am trying to settle a debate over two definitions of the 'probability of rain' in a weather forecast area.

Definition 1 states that, for example, a 50% probability of rain means there is a 50% chance of rain at any given point in the forecast area over a given duration of time; that is, there is a 50-50 chance that I will get wet at my particular location in that area.

Definition 2 says that a 50% probability of rain implies that 50% of the forecast area will likely see some rain, and the other 50% will not.

But here is my question: Others say that both definitions

*are the same*. From a mathematical perspective, not a meteorological one, does a 50% probability of rain falling on my head mean the same thing as rain falling over 50% of the forecast area? Thanks.
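To make the comparison concrete, here is a toy Monte Carlo sketch (entirely my own illustration; the grid size, the per-point probabilities, and all variable names are made up, not taken from any forecasting method). It models the area as a set of points, each with its own chance of rain, and compares the area-averaged point probability (Definition 1) with the expected fraction of the area that ends up wet (Definition 2). By linearity of expectation, the two numbers should agree up to sampling noise, whatever the individual point probabilities are.

```python
import random

# Toy model of a forecast area as 1,000 points; every per-point
# probability here is made up purely for illustration.
random.seed(42)
n_points = 1000
point_probs = [random.uniform(0.0, 1.0) for _ in range(n_points)]

# Definition 1: the forecast number as the area-averaged chance of
# rain at a single point.
avg_point_prob = sum(point_probs) / n_points

# Definition 2: the expected fraction of the area that gets wet,
# estimated by simulating many "days".
trials = 2000
total_fraction = 0.0
for _ in range(trials):
    wet = sum(1 for p in point_probs if random.random() < p)
    total_fraction += wet / n_points
expected_fraction = total_fraction / trials

# By linearity of expectation these two quantities coincide (up to
# Monte Carlo noise), even though any single day can look different.
print(f"avg point prob:    {avg_point_prob:.3f}")
print(f"expected wet area: {expected_fraction:.3f}")
```

Note this only shows the two definitions agree *in expectation*; on any particular day the actual wet fraction can differ from the forecast number.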
