How Does Markov Source Entropy Calculate Information for Binary Sources?

Drao92
Greetings,
I want to ask you something, to check whether I've understood this subject correctly.
Let's say we have an order-1 binary Markov source. Then
H(a) = -P_aa*log2(P_aa) - P_ab*log2(P_ab) bits/symbol.
From what I understand, this is the average information of the symbol generated after an "a", i.e. of the second symbol in "aa" or "ab".
Is that right?
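For concreteness, here is a minimal Python sketch of that formula. The probabilities P_aa = 0.3, P_ab = 0.7 are illustrative values chosen for this example, not taken from the thread:

```python
import math

# Illustrative transition probabilities out of state "a" (hypothetical
# values, not from the thread); they must satisfy P_aa + P_ab = 1.
P_aa, P_ab = 0.3, 0.7

# H(a) = -P_aa*log2(P_aa) - P_ab*log2(P_ab): average information (in bits)
# of the symbol emitted immediately after an "a".
H_a = -P_aa * math.log2(P_aa) - P_ab * math.log2(P_ab)
print(f"H(a) = {H_a:.4f} bits/symbol")  # ~0.8813 bits for these values
```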
 
Hey Drao92.

This is spot on: in the context of a Markov model, this entropy gives the average information content of the symbol following an "a".

If the source weren't Markovian and this were a general statement, it would be a lot more complex (though it should always be bounded by this entropy figure).

As a footnote, recall that a maximum-entropy source is purely random, and the lower the entropy, the more order (and the less randomness) a process, random variable, or distribution has. So if you have extra information that makes something less random, its entropy will be lower.

This is the intuitive reason for the bound, and being aware of it can be extremely useful when working with entropy identities, as well as when solving practical problems (e.g., engineering problems involving a maximal noise component).
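To make the maximum-entropy intuition concrete, here is a small sketch using the binary entropy function; the probabilities are illustrative:

```python
import math

def h2(p):
    """Binary entropy in bits; zero at the deterministic endpoints."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy peaks at the uniform case p = 0.5 (pure randomness) and falls
# toward 0 as the outcome becomes more predictable (more order).
for p in (0.5, 0.8, 0.99, 1.0):
    print(f"p = {p:<4}  H = {h2(p):.4f} bits")
```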
 
Sorry for the late post. Can you tell me if this is correct?
The transition matrix is:
[0.2 0.8]
[1 0]
If the total entropy is H(0) + H(1) = e,
would the quantity of information generated after a zero be
e*0.2 + e*0.8?
Because e is the mean information generated per symbol, and 0.2 and 0.8 are the probabilities of generating a 0 or a 1 after a "zero".
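For reference, the entropy rate of a stationary Markov source is the stationary-weighted average of the per-state conditional entropies, H = Σ_i π_i · H(i), where π is the stationary distribution. A minimal sketch for this transition matrix (assuming NumPy is available):

```python
import numpy as np

# Transition matrix from the post: row i holds P(next symbol | current = i).
P = np.array([[0.2, 0.8],
              [1.0, 0.0]])

def row_entropy(row):
    """Conditional entropy H(i) in bits, skipping zero entries."""
    p = row[row > 0]
    return float(-np.sum(p * np.log2(p)))

# Stationary distribution pi solves pi @ P = pi with sum(pi) = 1,
# i.e. the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1.0)][:, 0])
pi = pi / pi.sum()

H0, H1 = row_entropy(P[0]), row_entropy(P[1])
rate = pi[0] * H0 + pi[1] * H1  # entropy rate: stationary-weighted average

print(f"H(0) = {H0:.4f} bits, H(1) = {H1:.4f} bits")  # 0.7219 and 0.0000
print(f"pi   = {pi}")                                  # approx [5/9, 4/9]
print(f"entropy rate = {rate:.4f} bits/symbol")        # approx 0.4011
```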
 