Exponential statistical process to be characterised

In summary: there is a random variable (X) with a determined outcome (i.e., X = x0), so that p(x0) = 1. Consider x0 to have a "discrete uniform distribution" over the integer interval [1, s], where the sample space size s = 1 initially, such that p(x0) = 1. This is the initial state. As the sample space size s increases exponentially, the associated information entropy (H) increases linearly: for a discrete uniform distribution with n possible outcomes, the Shannon entropy is H(X) = -log2(p), where p = 1/n.
  • #1
Larry Lacey
TL;DR Summary
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?
I'd be grateful for any formulation that describes this statistical process
 
  • #2
Larry Lacey said:
Summary:: There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1. Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s], where the sample space (s) = 1, such that p(x0) = 1. This is the initial state. What is the probability of x0 and the associated information entropy (H), as s increases exponentially?

I'd be grateful for any formulation that describes this statistical process
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

Larry Lacey said:
There is a random variable (X) with a determined outcome (i.e., X = x0), p(x0) = 1.
So X = 1 with probability 1. That's not much of a random variable.

Larry Lacey said:
Consider x0 to have a “discrete uniform distribution” over the integer interval [1, s]
So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

Larry Lacey said:
where the sample space (s) = 1
I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

Larry Lacey said:
such that p(x0) = 1
No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

Larry Lacey said:
What is the probability of x0
You have already defined x0 to be a random variable with a particular distribution, not an event, so it cannot have a probability. x0 has a probability distribution, but you have already defined that.

Larry Lacey said:
the associated information entropy (H)
Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
 
  • Like
Likes Jarvis323, berkeman and PeroK
  • #3
For a discrete, uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p=1/n##, so the Shannon entropy simplifies to ##H(X)=-\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2,\ldots,X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i)=-\log_2(1/2^i)=i## bits.
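The calculation in this post can be sketched numerically; this is a minimal check of the formula (the function name is my own, not from the thread):

```python
import math

# Entropy of a discrete uniform distribution over n outcomes, in bits.
def uniform_entropy_bits(n: int) -> float:
    p = 1.0 / n               # each outcome is equally probable
    return -math.log2(p)      # H = -log2(p) = log2(n)

# Sample space doubling at each step i: X_i has 2**i outcomes, so H(X_i) = i bits.
entropies = [uniform_entropy_bits(2 ** i) for i in range(6)]
print(entropies)  # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
```

The list shows the linear growth of entropy against exponential growth of the sample space.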
 
  • #4
Many thanks Jarvis323, that's exactly what I believed the answer to be. As you say, the base doesn't matter, but I actually had base e in mind. Using your terminology, I'd express it a little differently in base e as ##p_i = \exp(-\ln(2)\, i)##.
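The base-e form here and the base-2 form from the previous post are numerically identical; a quick sketch to verify (not part of the thread):

```python
import math

# p_i written two ways: 1/2**i (base 2) and exp(-ln(2) * i) (base e) agree.
for i in range(20):
    base2 = 2.0 ** (-i)
    base_e = math.exp(-math.log(2) * i)
    assert math.isclose(base2, base_e)
print("all equal")
```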
 
  • #5
Sorry Jarvis323, I meant just a different way of expressing ##p_i## to the base 2.
 
  • #6
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
 
  • #7
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
 
  • Like
Likes sysprog
  • #8
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
 
  • Like
Likes sysprog
  • #9
berkeman said:
I know from PM conversations with the OP that it's not for schoolwork. Do you want me to restore your post?
I guess so.
 
  • Like
Likes berkeman
  • #10
Jarvis323 said:
Sorry, I deleted my answer. I couldn't tell if it's homework related and I was giving away the answer, or if I missed the point. Anyway, Shannon entropy will increase linearly with an exponentially increasing discrete uniform sample space.
No Jarvis323 it was just a query.
 
  • #11
sysprog said:
@Larry Lacey, I think that @Jarvis323 is great, but isn't the great answer in post #2 a product of the also great @pbuk, and not of @Jarvis323 ? (I think that maybe you erred on the screen name)
Sure sysprog, all feedback was gratefully received
 
  • Like
Likes sysprog
  • #12
pbuk said:
I understand what all those words mean individually, but when you put them together like that they don't mean anything to me. Let's try and break it down:

So X = 1 with probability 1. That's not much of a random variable.

So x0 = 1 with probability 1/s, x0 = 2 with probability 1/s...

I'm not sure about your notation, but if you are saying that the sample space of s is { 1 } then s = 1 with probability 1, and so the 'interval' for the distribution of x0 is [1, 1].

No, you've lost me there. What is the event that has a probability of 1? You have already defined x0 to be a random variable with a particular distribution, not an event.

You have already defined x0 to be a random variable with a particular distribution, not an event, so it cannot have a probability. x0 has a probability distribution, but you have already defined that.

Do you know how to calculate information entropy? It is not difficult. What do you plan to do with the answer?
Many thanks for the feedback. Did not respond as by the time I had seen your feedback, I had seen the feedback from jarvis323 below which was the formulation I was seeking. Many thanks to all.
 
  • Like
Likes sysprog and berkeman
  • #13
Jarvis323 said:
For a discrete, uniform distribution with ##n## possible outcomes, each outcome has the same probability, ##p=1/n##, so the Shannon entropy simplifies to ##H(X)=-\log_2(p)##. If we have an exponentially increasing sample space (let's just make it base 2 for convenience), like I think you're saying, then we have a sequence of random variables, ##X_0, X_1, X_2,\ldots,X_n##, where ##X_i## has ##2^i## equally probable outcomes. Then ##p_i = 1/2^i## and ##H(X_i)=-\log_2(1/2^i)=i## bits.
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponentially expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature? What is the probability of x0 and the associated information entropy (H), as s increases multi-exponentially?
 
  • #14
Larry Lacey said:
Hi Jarvis (and anyone else interested), instead of a statistical process with a mono-exponential expanding sample space, what about the characterisation of the more general process in which the expansion is multi-exponential in nature.

What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
 
  • Like
Likes sysprog
  • #15
Stephen Tashi said:
What do you mean by a "statistical process"? The general definition of a stochastic process is that it is an indexed collection of random variables. Thus in a stochastic process there is an indexed collection of associated sample spaces. What an "expanding" sample space would mean for a stochastic process is unclear. A random variable has an associated sample space. The sample space of a random variable doesn't change as long as we are talking about the same random variable.

Perhaps you intend to ask about the limit of a sequence of values associated with a sequence of random variables. If ##X_1, X_2,...## is a sequence of random variables then there is an associated sequence of sample spaces and an associated sequence of Shannon entropies.
Hi Stephen, thanks for your feedback. The process is described at the outset of the post. However, I believe you could interpret it as a sequence of random variables, each with a greater sample size than the previous one, where the sequence progresses in a multi-exponential manner in terms of sample size. So what is the characterisation of the sequence? Does that address your query?
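To make "multi-exponential" concrete one has to pick a growth law; as an illustrative assumption (my own choice, not fixed by the thread), take a doubly exponential sample-space size ##s_i = 2^{2^i}##. Then ##p_i = 2^{-2^i}## and ##H(X_i) = 2^i## bits, so the entropy itself grows exponentially rather than linearly:

```python
import math

# Illustrative assumption: doubly exponential growth, s_i = 2**(2**i).
for i in range(5):
    s_i = 2 ** (2 ** i)         # sample space size: 2, 4, 16, 256, 65536
    h_bits = math.log2(s_i)     # uniform entropy in bits: 1, 2, 4, 8, 16
    print(i, s_i, h_bits)
```

The general pattern: one fewer exponential in the entropy than in the sample-space growth, since ##H = \log_2 s##.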
 

1. What is an exponential statistical process?

An exponential statistical process is a type of statistical model that is used to describe the behavior of data that follows an exponential distribution. This type of process is characterized by a constant rate of change, meaning that the data grows or decays at a consistent rate over time.

2. How is an exponential statistical process characterized?

An exponential statistical process is characterized by a single parameter, expressed either as the rate parameter (λ) or, equivalently, the scale parameter (β = 1/λ). The rate parameter determines how quickly the data grows or decays, while the scale parameter gives the mean magnitude of the data.
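The rate/scale relationship can be sketched with the Python standard library (the seed and sample size are arbitrary choices for illustration): drawing from an exponential distribution with rate λ, the sample mean should approach the scale β = 1/λ.

```python
import random
import statistics

random.seed(0)
lam = 2.0                                        # rate parameter λ
samples = [random.expovariate(lam) for _ in range(100_000)]
est_scale = statistics.mean(samples)             # estimates β = 1/λ = 0.5
print(est_scale)
```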

3. What are some common applications of exponential statistical processes?

Exponential statistical processes are commonly used in fields such as finance, engineering, and biology to model data that follows an exponential distribution. They can be used to forecast future trends, analyze growth rates, and make predictions about future events.

4. How is an exponential statistical process different from other statistical processes?

An exponential statistical process is different from other statistical processes in that it assumes a constant rate of change, while other processes may have varying rates of change. Additionally, exponential processes are often used for data that follows an exponential distribution, while other processes may be used for different types of data.

5. What are some limitations of using an exponential statistical process?

One limitation of using an exponential statistical process is that it may not accurately describe data that does not follow an exponential distribution. Additionally, it may not be appropriate for data that has outliers or extreme values. It is important to carefully consider the data and its characteristics before using an exponential statistical process.
