Need help calculating entropy & probabilities of a simple Markov process

In summary, the thread discusses a source that produces binary symbols, with different transition probabilities for each symbol. The main questions are how to calculate the stationary probabilities of 0 and 1 and the entropy of the source. Two candidate methods for calculating the entropy are proposed, and the poster is unsure which one is correct and needs an answer by tomorrow.
  • #1
degs2k4

Homework Statement

I am in a hurry with the following problem:

We have a source that produces binary symbols, 0 and 1.
A 0 follows a 0 with probability 7/8.
A 1 follows a 1 with probability 1/2.

A) Calculate the probability of each of the symbols 0 and 1 appearing.
B) Calculate the entropy of the source.

The Attempt at a Solution

A) We can say that:

A 0 follows a 0 with probability p(0,0) = 7/8 => a 1 follows a 0 with probability p(0,1) = 1/8
A 1 follows a 1 with probability p(1,1) = 1/2 => a 0 follows a 1 with probability p(1,0) = 1/2

p(0) = p(0)·p(0,0) + p(1)·p(1,0) = (7/8)·p(0) + (1/2)·p(1)
p(1) = p(0)·p(0,1) + p(1)·p(1,1) = (1/8)·p(0) + (1/2)·p(1)

Now we have to get p(0) and p(1), but how? I only come up with p(0) = 4·p(1).
(According to "a solution", p(0) should be 4/5 and p(1) should be 1/5.)
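
Note that the two balance equations above are linearly dependent, so they only pin down the ratio p(0) = 4·p(1); the missing constraint is normalization, p(0) + p(1) = 1, which gives p(0) = 4/5 and p(1) = 1/5. As a minimal numeric check (a sketch using NumPy; the matrix layout, with row i holding the probabilities of the symbol that follows i, is an assumption):

```python
import numpy as np

# Transition matrix: row i holds the probabilities of the symbol following i,
# so P[0] = (p(0,0), p(0,1)) and P[1] = (p(1,0), p(1,1)).
P = np.array([[7/8, 1/8],
              [1/2, 1/2]])

# Stationary distribution: solve p = pP together with p(0) + p(1) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print(p)  # [0.8 0.2], i.e. p(0) = 4/5 and p(1) = 1/5
```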

B) I think it can be calculated in two ways:

Way 1:
H(S) = p(0)·H0 + p(1)·H1
(the probabilities of being at a 0 or a 1, each weighted by the conditional entropy of the next symbol, where
H0 = -(7/8)·log(7/8) - (1/8)·log(1/8)
H1 = -(1/2)·log(1/2) - (1/2)·log(1/2) )

Way 2:
H(S) = -p(0)·log(p(0)) - p(1)·log(p(1))

Which way is the correct one?
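
For a numeric comparison of the two candidates, here is a minimal sketch (assuming base-2 logarithms and the stationary probabilities from part A):

```python
import numpy as np

def H(dist):
    """Shannon entropy in bits of a discrete distribution."""
    dist = np.asarray(dist)
    return -np.sum(dist * np.log2(dist))

p = np.array([4/5, 1/5])                 # stationary probabilities from part A
P = np.array([[7/8, 1/8], [1/2, 1/2]])   # transition probabilities

way1 = p[0] * H(P[0]) + p[1] * H(P[1])   # average conditional entropy
way2 = H(p)                              # entropy of the marginal distribution

print(way1)  # ~0.635 bits/symbol
print(way2)  # ~0.722 bits/symbol
```

(For reference: for a stationary Markov source, Way 1 is the usual definition of the entropy rate, since it accounts for the dependence between consecutive symbols; Way 2 treats the symbols as independent, which is why it comes out higher here.)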

Thanks in advance!
 
  • #2
Anyone? I'm in a bit of a hurry; I need to have this clear for tomorrow...
 

1. How do I calculate entropy for a simple Markov process?

To calculate entropy for a simple Markov process, you will need to determine the probabilities of each state in the process. Then use the formula for entropy: S = -Σ pᵢ·ln(pᵢ), where pᵢ is the probability of state i. Plug in the probabilities and sum the terms to get the total entropy.
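
As a minimal sketch of that formula (natural logs, matching the ln above; the three-state distribution is just a hypothetical example):

```python
import math

p = [0.5, 0.25, 0.25]                    # hypothetical state probabilities
S = -sum(pi * math.log(pi) for pi in p)  # entropy in nats
print(S)  # ~1.04
```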

2. What is the relationship between entropy and probabilities in a Markov process?

Entropy is a measure of the uncertainty or randomness in a system. In a Markov process, the transition probabilities determine how likely the system is to move between states, and hence the long-run probability of each state. The higher the entropy, the more unpredictable the process is.

3. Can I use the same formula for entropy in a more complex Markov process?

Yes, the formula for entropy can be used for any Markov process, regardless of complexity. However, the probabilities may be more difficult to calculate in a more complex process.
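
For instance, here is a sketch of the entropy rate for a general n-state chain (assuming the chain is irreducible, so the stationary distribution is the unique left eigenvector of the transition matrix for eigenvalue 1):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate in bits/symbol of a stationary Markov chain with transition matrix P."""
    # Stationary distribution: left eigenvector of P for the eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()
    # Row entropies; zero entries are mapped to 1 so they contribute 0 to the sum.
    safe = np.where(P > 0, P, 1.0)
    row_H = -np.sum(P * np.log2(safe), axis=1)
    return pi @ row_H

P = np.array([[7/8, 1/8], [1/2, 1/2]])
print(entropy_rate(P))  # ~0.635, matching the two-state source above
```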

4. How can I use entropy to analyze a Markov process?

Entropy can provide insight into the behavior of a Markov process by quantifying the amount of randomness or uncertainty in the system. A higher entropy value may indicate a more unpredictable process, while a lower entropy value may suggest a more stable and predictable process.

5. Is it necessary to calculate the probabilities for every state in a Markov process?

In order to accurately calculate entropy, it is necessary to have the probabilities for every state in the process. However, in some cases, it may be possible to estimate probabilities or use a simplified model to approximate the probabilities for a complex process.
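
For example, transition probabilities can be estimated from an observed sequence by counting (a sketch; the sequence here is hypothetical):

```python
from collections import Counter

def estimate_transitions(seq):
    """Estimate p(b follows a) from pair counts in an observed sequence."""
    pairs = Counter(zip(seq, seq[1:]))
    starts = Counter(seq[:-1])
    return {(a, b): n / starts[a] for (a, b), n in pairs.items()}

seq = "0000000100000011"  # hypothetical output of the source
print(estimate_transitions(seq))
```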
