# Summing probabilities

1. Dec 3, 2008

### SW VandeCarr

Consider an infinite set E(G) whose elements are interpreted as independent events assigned probabilities under a Gaussian distribution. It can be shown that the probabilities of all events in E(G) sum to one. Now consider an infinite set E(U) with the same interpretation under a uniform distribution (every event has equal probability). Given an infinite set, this implies that the probability of any randomly chosen event in E(U) is zero. Is it then true that for any continuous uniform probability distribution, the sum of the probabilities will be zero?
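As a numerical sanity check of the first claim, the Gaussian density does integrate to one. A minimal sketch using the standard normal density and a plain trapezoidal rule (the integration range and step count are illustrative choices):

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # Normal density: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def trapezoid(f, a, b, n=100000):
    # Trapezoidal rule over [a, b] with n subintervals.
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Integrating over [-10, 10] captures essentially all of the mass;
# the tails beyond +/-10 standard deviations are negligible.
area = trapezoid(gaussian_pdf, -10.0, 10.0)
print(area)  # ≈ 1.0
```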

2. Dec 3, 2008

### rochfor1

Do you mean the integral of the probabilities (for the continuous distribution)?

3. Dec 4, 2008

### adriank

If your infinite set of independent events (call it A) is countable, then there can be no uniform distribution. Proof: suppose each element of A has equal probability p. If p is zero, then the sum of all the probabilities is zero; if p is nonzero, then the sum is infinite. In either case the total is not one, so you don't have a probability space.

If your set is uncountable, then you can't sum the probabilities of all the events at all (unless all but countably many of them have zero probability).
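The countable-case dichotomy can be made concrete with partial sums in exact rational arithmetic; a minimal sketch (the value p = 1/10⁹ is just an illustrative choice, not from the thread):

```python
from fractions import Fraction

def partial_sum(p, n):
    # Sum of the first n equal probabilities p (exact rational arithmetic).
    return p * n

# Case p = 0: every partial sum is 0, so the series sums to 0, not 1.
assert partial_sum(Fraction(0), 10**6) == 0

# Case p > 0: the partial sums eventually exceed any bound, so the series diverges.
p = Fraction(1, 10**9)
assert partial_sum(p, 10**12) == 1000  # already far past 1
```

Either way the total probability cannot equal 1, which is what rules out a uniform distribution on a countably infinite set.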

4. Dec 4, 2008

### SW VandeCarr

I'm actually talking about a random variable that maps from an infinite set of elements termed "events" to the continuous probability space [0,1], according to a Gaussian distribution (G) vs. a uniform one (U). The events are countable. The sum of the probabilities in E(G) will be 1 in the limit, and the open integral of G is 1. My question was about E(U). It appears that in any mapping of countably many events in E(U) under a (continuous) uniform distribution U, the sum of the probabilities will be 1 for any finite number of events but zero for infinitely many events, and the open integral of U is 0. That is, such a continuous uniform distribution does not exist (which is essentially what I conclude from adriank's post).

I asked this because: 1) the uniform distribution is often used as the prior assumption in Bayesian inference; and 2) in demonstrating Freiling's axiom of symmetry (AX) (see 'maze' "dense geodesic" post 32, Nov 30, in topology), a random variable chooses a number on the real interval [0,1] and maps to an (infinite?) set of countable subsets (S) on the same interval. (S) would be an infinite set of countable elements under a uniform distribution given a complete mapping, which, according to the above, would be impossible since the probabilities sum to zero under a distribution whose open integral is zero.

Last edited: Dec 4, 2008

5. Dec 5, 2008

### SW VandeCarr

To clarify the previous post: the probability p(x), given a continuous distribution G, is expressed as a probability density [0, p(x)] on the definite integral over [0,1] of G. I used the term "open" integral to exclude events with probability zero or one.

With respect to the uniform distribution U, the same holds for a finite set of possible events x. However, the uniform distribution must then be discrete, since the probability of any one event is fixed as the reciprocal of the total number of events. For an infinite number of events, a (continuous) uniform distribution seems impossible unless the reciprocal of infinity can be defined.
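The finite case is easy to check directly: each of n events gets probability 1/n, the n probabilities always sum to exactly 1, and 1/n shrinks toward 0 as n grows. A small sketch with exact rationals (the sample sizes are illustrative):

```python
from fractions import Fraction

def uniform_probability(n):
    # Discrete uniform distribution on n events: each event has probability 1/n.
    return Fraction(1, n)

for n in (10, 1000, 10**6):
    p = uniform_probability(n)
    # Finite case: the n equal probabilities sum to exactly 1 ...
    assert p * n == 1

# ... but the per-event probability 1/n goes to 0 as n grows, so no single
# fixed value can serve as "the reciprocal of infinity" for infinitely many events.
assert uniform_probability(10**12) < Fraction(1, 10**11)
```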

The demonstration of Freiling's axiom of symmetry (in the Wiki) seems to contemplate a random variable 'choosing' a number from the real interval [0,1], which implies a continuous uniform distribution, since there's no reason to think any one number has a different probability of being chosen than any other.
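For the continuous uniform distribution on [0,1], probability is assigned to intervals (P(a ≤ X ≤ b) = b − a) rather than obtained by summing point probabilities, which is how such a choice can be well defined even though each single number has probability zero. A minimal sketch (the interval endpoints and sample count are illustrative):

```python
import random

def interval_probability(a, b):
    # Continuous uniform distribution on [0, 1]: P(a <= X <= b) = b - a,
    # clipped to the unit interval.
    return max(0.0, min(b, 1.0) - max(a, 0.0))

# Any single point has probability zero...
assert interval_probability(0.5, 0.5) == 0.0
# ...yet intervals carry positive probability, so the distribution is well defined.
assert interval_probability(0.25, 0.75) == 0.5

# Empirical check: the fraction of uniform samples landing in [0.25, 0.75]
# should be close to 0.5.
random.seed(0)
hits = sum(0.25 <= random.random() <= 0.75 for _ in range(100000))
print(hits / 100000)
```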

Last edited: Dec 5, 2008