Exponential statistical process to be characterised


Discussion Overview

The discussion revolves around the characterization of a statistical process involving a random variable with a discrete uniform distribution and its associated information entropy as the sample space increases, particularly focusing on exponential and multi-exponential growth. Participants explore the implications of these concepts in the context of probability and entropy calculations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks a formulation to describe a statistical process involving a random variable (X) with a determined outcome, questioning the probability of x0 and its information entropy as the sample space increases exponentially.
  • Another participant clarifies that for a discrete uniform distribution, the Shannon entropy can be calculated using the formula H(X) = -log2(p), where p is the probability of outcomes, and discusses the implications of an exponentially increasing sample space.
  • A different participant expresses agreement with the previous explanation but suggests an alternative formulation using base e instead of base 2.
  • Some participants engage in a meta-discussion about the attribution of contributions, with one participant correcting another regarding the source of a previous answer.
  • One participant expresses uncertainty about the nature of the statistical process and the meaning of an "expanding" sample space, suggesting that it may refer to a sequence of random variables and their associated entropies.
  • Another participant proposes exploring a more general process characterized by multi-exponential growth rather than mono-exponential growth.

Areas of Agreement / Disagreement

There is no consensus on the definitions and implications of the statistical process being discussed, with multiple competing views and interpretations remaining unresolved.

Contextual Notes

Participants express uncertainty regarding the notation and definitions used, particularly concerning the nature of random variables and their associated sample spaces. The discussion includes varying interpretations of entropy calculations and the implications of exponential growth in sample spaces.

Larry Lacey
TL;DR
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?
I'd be grateful for any formulation that describes this statistical process
 
Larry Lacey said:
Summary:: There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?

I'd be grateful for any formulation that describes this statistical process
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

Larry Lacey said:
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1.
So X = 1 with probability 1. That's not much of a random variable.

Larry Lacey said:
Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s]
So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

Larry Lacey said:
where the sample space (s) = 1
I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

Larry Lacey said:
such that p(x0) = 1
No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

Larry Lacey said:
What is the probability of x0
You have already defined x0 to be a random variable with a particular distribution, not an event so it cannot have a probability. x0 has a probability distribution but you have already defined that.

Larry Lacey said:
the associated information entropy (H)
Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
 
For a discrete uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p=1/n##, so the Shannon entropy simplifies to ##H(X)=-\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2,\ldots,X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i)=-\log_2(1/2^i)=i## bits.
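The formula in this post can be checked with a short Python sketch (not part of the thread, just a numerical illustration): for a uniform distribution over ##2^i## outcomes, the entropy comes out to exactly ##i## bits.

```python
import math

def shannon_entropy_uniform(n: int) -> float:
    """Shannon entropy (in bits) of a discrete uniform distribution over n outcomes."""
    p = 1.0 / n
    # For n equal terms, H(X) = -sum(p * log2(p)) collapses to -log2(p)
    return -math.log2(p)

# Sample space doubles at each step: X_i has 2**i equally probable outcomes,
# so the entropy grows linearly with i: H(X_i) = i bits.
for i in range(5):
    print(i, shannon_entropy_uniform(2 ** i))
```

Because ##2^i## is a power of two, `math.log2` returns the exact integer value here, so the linear growth is visible without rounding noise.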
 
Many thanks Jarvis323, that's exactly what I believed the answer to be. As you say, the base doesn't matter, but I actually had the base e in mind. Using your terminology I'd express it a little differently to the base e as: ##p_i = \exp(-\ln(2)\, i)##.
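A quick numerical check (not from the thread) that this base-e form agrees with the base-2 form, and that converting the entropy from nats to bits recovers ##i##:

```python
import math

def p_base_e(i: int) -> float:
    """Probability p_i written in base e: exp(-ln(2) * i), equal to 2**(-i)."""
    return math.exp(-math.log(2) * i)

# Compare against the base-2 form and convert the entropy from nats to bits.
for i in range(6):
    assert math.isclose(p_base_e(i), 2.0 ** (-i))
    h_bits = -math.log(p_base_e(i)) / math.log(2)  # H in nats, divided by ln(2) -> bits
    print(i, p_base_e(i), h_bits)
```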
 
Sorry Jarvis323, I mean just a different way of expressing the base-2 ##p_i##.
 
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
 
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
 
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
 
berkeman said:
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
I guess so.
 
  • #10
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
No, Jarvis323, it was just a query.
 
  • #11
sysprog said:
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
Sure sysprog, all feedback was gratefully received.
 
  • #12
pbuk said:
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

So X = 1 with probability 1. That's not much of a random variable.

So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

You have already defined x0 to be a random variable with a particular distribution, not an event so it cannot have a probability. x0 has a probability distribution but you have already defined that.

Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
Many thanks for the feedback. I did not respond earlier because, by the time I had seen your feedback, I had already seen the feedback from Jarvis323 below, which was the formulation I was seeking. Many thanks to all.
 
  • #13
Jarvis323 said:
For a discrete uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p=1/n##, so the Shannon entropy simplifies to ##H(X)=-\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2,\ldots,X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i)=-\log_2(1/2^i)=i## bits.
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponentially expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature? What is the probability of x0 and the associated information entropy (H), as s increases multi-exponentially?
 
  • #14
Larry Lacey said:
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponential expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature.

What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
 
  • #15
Stephen Tashi said:
What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
Hi Stephen, thanks for your feedback. The process is described at the outset of the post. However, I believe you could interpret it as a sequence of random variables, each with a greater sample size than the previous one, where the sequence progresses in a multi-exponential manner in terms of sample size. So what is the characterisation of the sequence? Does that address your query?
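The thread never pins down what "multi-exponential" means. Under one possible reading (an assumption, not something stated by any participant), the sample space grows doubly exponentially, ##s_i = 2^{2^i}##; the entropy ##H(X_i)=\log_2(s_i)=2^i## bits then grows exponentially rather than linearly:

```python
import math

def entropy_bits_double_exp(i: int) -> float:
    """Entropy in bits of a uniform distribution over a doubly exponential
    sample space s_i = 2**(2**i), i.e. H(X_i) = log2(s_i) = 2**i."""
    s = 2 ** (2 ** i)
    return math.log2(s)

# Under this reading, the entropy sequence is itself exponential: 1, 2, 4, 8, ...
for i in range(4):
    print(i, 2 ** (2 ** i), entropy_bits_double_exp(i))
```

More generally, for any growth law ##s_i## of the sample space, the entropy of a uniform ##X_i## is just ##\log_2 s_i##, so the entropy sequence is the logarithm of the growth law.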
 
