I have a question relating to the sum of continuous random variable probabilities that I hope you can help to answer.

For any probability distribution, whether of a discrete or a continuous random variable, the total probability across all possible outcomes must equal 1. This stands to reason, so no difficulty here.

Consider, for example, the pdf of a continuous random variable Y, such as the maximum wave height at a fixed point in the ocean. Suppose we take n measurements (say, for argument's sake, one measurement every 24 hours for a year, giving us 365 measurements). Then the probability of encountering a maximum wave height of exactly Y in any given 24-hour period is calculated as: (number of occurrences of wave height equal to Y) ÷ 365, from which we can construct our pdf. (Please correct me if I'm incorrect on this.)
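To make my counting scheme concrete, here is a minimal sketch of what I mean, assuming made-up wave heights (normally distributed, mean 3 m, rounded to the nearest 0.5 m so repeated values can occur; all of these numbers are invented for illustration):

```python
import random
from collections import Counter

# Hypothetical data: 365 daily maximum wave heights in metres,
# rounded to the nearest 0.5 m so that exact repeats are possible.
random.seed(1)
heights = [round(random.gauss(3.0, 0.8) * 2) / 2 for _ in range(365)]

counts = Counter(heights)
# Relative frequency of each observed height Y: occurrences / 365
rel_freq = {y: n / 365 for y, n in counts.items()}

# Summed over every observed value of Y, these frequencies total 1
print(sum(rel_freq.values()))
```

Because the counts add up to 365, the relative frequencies necessarily sum to 1, which is the first half of what confuses me below.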

Now, if we sum all of the recorded probabilities of Y, this will equal 1. However, if we sum the probabilities that we can read from the graph of the pdf at smaller and smaller intervals between values of Y, we very quickly reach a stage where this sum exceeds 1, which clearly cannot be the case.

For the example used, I'm only looking at a purely theoretical analysis, i.e. meteorological/ocean processes and any other complications, such as independence of wave events, can be completely ignored for this answer.
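Here is a small numerical experiment showing exactly the behaviour I mean, assuming (purely for illustration) that the true wave-height distribution is normal with mean 3 m and standard deviation 0.8 m, read off over the interval 0 to 8 m:

```python
import math

def normal_pdf(y, mu=3.0, sigma=0.8):
    # Density of an assumed normal distribution, standing in for the
    # (unknown) true wave-height pdf. Parameters are invented.
    return math.exp(-((y - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

raw_sums, riemann_sums = [], []
for n in (10, 100, 1000):
    dy = 8.0 / n                      # grid spacing over [0, 8] metres
    ys = [i * dy for i in range(n)]
    # Summing the heights of the graph alone grows as the grid refines
    raw_sums.append(sum(normal_pdf(y) for y in ys))
    # Summing height * interval width approximates the integral, staying near 1
    riemann_sums.append(sum(normal_pdf(y) * dy for y in ys))
    print(n, raw_sums[-1], riemann_sums[-1])
```

The first sum (values of the graph on their own) keeps growing as the intervals shrink, which is the "exceeds 1" effect I am asking about, while the second sum (value times interval width) stays near 1.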

Can anyone explain how this can be please? Much appreciated.