# Need help! Calculating entropy & probabilities of a simple Markov process

1. Aug 28, 2010

### degs2k4

1. The problem statement, all variables and given/known data

I am in a hurry with the following problem:

We have a source that produces binary symbols, 0 and 1.
A 0 follows a 0 with probability 7/8.
A 1 follows a 1 with probability 1/2.

A) Calculate the probability that the symbols 0 and 1 appear.
B) Calculate the entropy of the source.

3. The attempt at a solution

A) We can say that:

A 0 follows a 0 with probability p(0,0) = 7/8, so a 1 follows a 0 with probability p(0,1) = 1/8.
A 1 follows a 1 with probability p(1,1) = 1/2, so a 0 follows a 1 with probability p(1,0) = 1/2.

p(0) = p(0) p(0,0) + p(1) p(1,0) = p(0) 7/8 + p(1) 1/2
p(1) = p(0) p(0,1) + p(1) p(1,1) = p(1) 1/2 + p(0) 1/8

Now we have to get p(0) and p(1), but how? I only come up with p(0) = 4 p(1).
(According to "a solution", p(0) should be 4/5 and p(1) should be 1/5.)
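One way to check the quoted values numerically: the relation p(0) = 4 p(1) follows from the stationary equation, and closing the system with the normalization p(0) + p(1) = 1 reproduces 4/5 and 1/5. A minimal sketch (my own check, not from the original post), using the post's notation p(a,b) for "b follows a":

```python
from fractions import Fraction

# Transition probabilities from the problem statement
p00 = Fraction(7, 8)   # 0 follows a 0
p01 = Fraction(1, 8)   # 1 follows a 0
p11 = Fraction(1, 2)   # 1 follows a 1
p10 = Fraction(1, 2)   # 0 follows a 1

# Stationary condition p(0) = p(0)*p00 + p(1)*p10 rearranges to
# p(0)*(1 - p00) = p(1)*p10, i.e. p(0)/p(1) = p10/p01.
ratio = p10 / p01            # p(0)/p(1) = 4

# Combine with the normalization p(0) + p(1) = 1:
p0 = ratio / (1 + ratio)     # 4/5
p1 = 1 - p0                  # 1/5
print(p0, p1)
```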

B) I think it can be calculated in 2 ways:

Way 1:
H(S) = p(0) H0 + p(1) H1
(the sum of the probabilities of getting a 0 or a 1, each multiplied by its respective conditional entropy), where
H0 = -7/8 log(7/8) - 1/8 log(1/8)
H1 = -1/2 log(1/2) - 1/2 log(1/2)

Way 2:
H(S) = -p(0)log(p(0)) -p(1)log(p(1))

Which way is the correct one?
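The two ways can be compared numerically. A short sketch (my own check, assuming base-2 logarithms and the quoted stationary probabilities 4/5 and 1/5): Way 1 is the entropy rate of the Markov source (conditional entropies weighted by state probabilities), while Way 2 is the entropy of the marginal symbol distribution, which ignores the source's memory.

```python
import math

def H(ps):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

p0, p1 = 4/5, 1/5  # stationary probabilities quoted in the post

# Way 1: weight each state's conditional entropy by its probability
H0 = H([7/8, 1/8])       # entropy of the next symbol given current symbol 0
H1 = H([1/2, 1/2])       # entropy of the next symbol given current symbol 1
way1 = p0 * H0 + p1 * H1

# Way 2: entropy of the marginal symbol distribution alone
way2 = H([p0, p1])

# Conditioning cannot increase entropy, so way1 <= way2,
# with equality only if successive symbols were independent.
print(way1, way2)
```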