# Exponential statistical process to be characterised


## Summary:

There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?
I'd be grateful for any formulation that describes this statistical process

pbuk
Gold Member
> Summary: There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?
>
> I'd be grateful for any formulation that describes this statistical process
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

> There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1.
So X = 1 with probability 1. That's not much of a random variable.

> Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s]
So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

> where the sample space (s) = 1
I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

> such that p(x0) = 1
No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

> What is the probability of x0
You have already defined x0 to be a random variable with a particular distribution, not an event so it cannot have a probability. x0 has a probability distribution but you have already defined that.

> the associated information entropy (H)
Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
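Since the question of how to calculate it comes up here: the calculation is essentially a one-liner. A minimal Python sketch (the function name and examples are mine, not from the thread):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy -sum(p * log(p)) of a discrete distribution.

    `probs` holds the outcome probabilities (summing to 1); outcomes
    with zero probability contribute nothing, by convention.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A certain outcome (the "initial state" in the OP) carries zero
# entropy, a fair coin exactly one bit, a uniform 4-way choice two bits.
assert shannon_entropy([1.0]) == 0.0
assert shannon_entropy([0.5, 0.5]) == 1.0
assert math.isclose(shannon_entropy([0.25] * 4), 2.0)
```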

For a discrete uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p = 1/n##, so the Shannon entropy simplifies to ##H(X) = -\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), as I think you're saying, then we have a sequence of random variables ##X_0, X_1, X_2, \ldots, X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i) = -\log_2(1/2^i) = i## bits.
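The closed form above can be checked against the full Shannon sum numerically; a small Python sketch (base 2, variable names my own):

```python
import math

# X_i is uniform over 2**i outcomes, so each outcome has p = 1 / 2**i
# and the full Shannon sum collapses to H(X_i) = -log2(p) = i bits:
# entropy grows linearly while the sample space grows exponentially.
for i in range(1, 6):
    n = 2 ** i                                    # exponentially growing sample space
    p = 1.0 / n                                   # uniform probability of each outcome
    H = -sum(p * math.log2(p) for _ in range(n))  # full sum over all n outcomes
    assert math.isclose(H, i)                     # matches the closed form: i bits
```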

Many thanks Jarvis323, that's exactly what I believed the answer to be. As you say, the base doesn't matter, but I actually had the base ##e## in mind. Using your terminology I'd express it a little differently to the base ##e## as: ##p_i = \exp(-\ln(2) \cdot i)##.
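The base-##e## form is indeed the same quantity rescaled: ##p_i = 1/2^i = e^{-i \ln 2}##, and the entropy in nats is ##H(X_i) = i \ln 2##. A quick numerical check of the equivalence (a sketch; names are my own):

```python
import math

# p_i = 1 / 2**i rewritten to base e is exp(-ln(2) * i); the entropy
# in nats is H = i * ln 2, i.e. the same i bits rescaled by ln 2.
for i in range(1, 6):
    p_base2 = 1.0 / 2 ** i
    p_base_e = math.exp(-math.log(2) * i)   # the base-e form from the post above
    assert math.isclose(p_base2, p_base_e)  # the two expressions agree
    H_nats = -math.log(p_base_e)            # entropy in nats
    H_bits = H_nats / math.log(2)           # nats -> bits conversion
    assert math.isclose(H_bits, i)          # still i bits
```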

Sorry Jarvis323, I meant just a different way of expressing ##p_i## to the base 2.

@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)

Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.

berkeman
Mentor
> Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?

> I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
I guess so.

> Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
No Jarvis323, it was just a query.

> @Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
Sure sysprog, all feedback was gratefully received.

> I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down: […]
Many thanks for the feedback. I did not respond as, by the time I saw your feedback, I had seen the feedback from Jarvis323 below, which was the formulation I was seeking. Many thanks to all.
