Exponential statistical process to be characterised

AI Thread Summary
The discussion focuses on characterizing a statistical process involving a random variable (X) with a determined outcome and a discrete uniform distribution over an integer interval. Participants clarify the concept of probability associated with the random variable and the calculation of information entropy (H) as the sample space (s) increases exponentially. It is established that Shannon entropy increases linearly with an exponentially expanding discrete uniform sample space. There is also a suggestion to explore a multi-exponential expansion of the sample space, prompting further inquiry into the characterization of such a sequence of random variables. The conversation emphasizes the importance of understanding the definitions and implications of the statistical terms used.
Larry Lacey
TL;DR Summary
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?
I'd be grateful for any formulation that describes this statistical process
 
Larry Lacey said:
Summary:: There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?

I'd be grateful for any formulation that describes this statistical process
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

Larry Lacey said:
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1.
So X = 1 with probability 1. That's not much of a random variable.

Larry Lacey said:
Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s]
So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

Larry Lacey said:
where the sample space (s) = 1
I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

Larry Lacey said:
such that p(x0) = 1
No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

Larry Lacey said:
What is the probability of x0
You have already defined x0 to be a random variable with a particular distribution, not an event, so it cannot have a probability. x0 has a probability distribution, but you have already defined that.

Larry Lacey said:
the associated information entropy (H)
Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
 
For a discrete uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p = 1/n##, so the Shannon entropy simplifies to ##H(X) = -\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2, \ldots, X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i) = -\log_2(1/2^i) = i## bits.
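As a quick numerical check of this, here is a minimal Python sketch (the helper function name is illustrative, not from any particular library):

```python
import math

def uniform_entropy_bits(n_outcomes: int) -> float:
    """Shannon entropy in bits of a discrete uniform distribution:
    each of the n outcomes has p = 1/n, so H = -log2(1/n) = log2(n)."""
    return math.log2(n_outcomes)

# The sample space doubles at each step: X_i has 2**i outcomes.
for i in range(6):
    n = 2 ** i
    p = 1.0 / n  # probability of each single outcome
    print(f"i={i}: outcomes={n}, p_i={p:.5f}, H={uniform_entropy_bits(n):.1f} bits")
```

The printout confirms ##H(X_i) = i## bits: the entropy grows linearly while the sample space grows exponentially.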
 
Many thanks Jarvis323, that's exactly what I believed the answer to be. As you say, the base doesn't matter, but I actually had base ##e## in mind. Using your terminology, I'd express it a little differently in base ##e## as: ##p_i = e^{-\ln(2)\, i}##.
 
Sorry Jarvis323, I meant it is just a different way of expressing ##p_i## to the base 2.
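(For completeness, the two forms agree: with ##p_i = e^{-\ln(2)\, i} = 2^{-i}##, the entropy measured in nats is ##-\ln p_i = i \ln 2##, and dividing by ##\ln 2## converts this back to ##i## bits. Changing the base only rescales the entropy by a constant factor; the linear growth is unchanged.)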
 
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
 
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
 
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
 
berkeman said:
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
I guess so.
 
  • #10
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
No, Jarvis323, it was just a query.
 
  • #11
sysprog said:
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
Sure sysprog, all feedback was gratefully received.
 
  • #12
pbuk said:
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

So X = 1 with probability 1. That's not much of a random variable.

So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

You have already defined x0 to be a random variable with a particular distribution, not an event, so it cannot have a probability. x0 has a probability distribution, but you have already defined that.

Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
Many thanks for the feedback. I did not respond earlier because, by the time I saw your feedback, I had seen the reply from Jarvis323 below, which was the formulation I was seeking. Many thanks to all.
 
  • #13
Jarvis323 said:
For a discrete uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p = 1/n##, so the Shannon entropy simplifies to ##H(X) = -\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2, \ldots, X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i) = -\log_2(1/2^i) = i## bits.
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponentially expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature? What is the probability of x0 and the associated information entropy (H) as s increases multi-exponentially?
 
  • #14
Larry Lacey said:
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponential expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature.

What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
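Concretely, one such sequence matching the original question would be ##X_i## uniform on ##\{1, \ldots, s_i\}## with ##s_0 = 1## and ##s_i## growing exponentially in ##i##; the associated sequence of Shannon entropies is then ##H(X_i) = \log_2 s_i##, and its growth rate is the quantity being asked about.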
 
  • #15
Stephen Tashi said:
What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
Hi Stephen, thanks for your feedback. The process is described at the outset of the post. However, I believe you could interpret it as a sequence of random variables, each with a greater sample size than the previous one, where the sequence progresses in a multi-exponential manner in terms of sample size. So what is the characterisation of the sequence? Does that address your query?
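One plausible reading of "multi-exponential" (an assumption; the thread does not pin the term down) is that the sample-space size is a sum of exponentials, ##s_i = \sum_k a_k b_k^{\,i}##. The distribution at each step is still uniform, so ##p_i = 1/s_i## and ##H(X_i) = \log_2 s_i##, which for large ##i## is dominated by the fastest-growing term: ##H(X_i) \approx i \log_2 b_{\max} + \log_2 a_{\max}##. A short Python sketch under that assumption (coefficients chosen purely for illustration):

```python
import math

# Hypothetical multi-exponential growth: s_i = sum_k a_k * b_k**i.
# The (a_k, b_k) pairs below are illustrative, not from the thread.
TERMS = ((1.0, 2.0), (3.0, 1.5))

def sample_space_size(i: int) -> float:
    """s_i for a sum-of-exponentials expansion of the sample space."""
    return sum(a * b ** i for a, b in TERMS)

for i in range(0, 21, 4):
    s = sample_space_size(i)
    H = math.log2(s)  # uniform distribution, so H = log2(s_i)
    print(f"i={i:2d}: s_i={s:12.1f}, H={H:6.2f} bits")

# For large i, H(X_i) approaches i*log2(2.0) + log2(1.0) = i bits here,
# because the base-2 term dominates the base-1.5 term.
```

So under this reading the entropy is still asymptotically linear in ##i##; the extra exponential terms only contribute a vanishing correction ##\log_2\!\big(1 + \sum_{k \ne \max} (a_k/a_{\max})(b_k/b_{\max})^i\big)##.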
 