About Markov source entropy


by Drao92
Tags: entropy, markov, source
Drao92
#1
Oct28-12, 05:22 AM
P: 67
Greetings,
I want to ask you something to check whether I've understood this subject correctly.
Let's say we have an order-1 binary source.
H(a) = -Paa*log(Paa) - Pab*log(Pab) bit/symbol.
From what I understand, this is the average information of a symbol generated after an "a", i.e. over the continuations aa and ab.
Is that right?
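For what it's worth, here is a minimal Python sketch of that formula; the values of Paa and Pab are made up just for illustration:

```python
import math

# Conditional entropy of the symbol that follows an "a" in an order-1
# binary Markov source. Paa and Pab are hypothetical; they must sum to 1.
Paa, Pab = 0.2, 0.8

H_a = -Paa * math.log2(Paa) - Pab * math.log2(Pab)
print(f"H(a) = {H_a:.4f} bits/symbol")  # -> H(a) = 0.7219 bits/symbol
```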
chiro
#2
Oct29-12, 12:01 AM
P: 4,570
Hey Drao92.

This is spot on: in the context of a Markovian model, this entropy gives the average information content of the symbol that follows an "a".

If the source weren't Markovian and this were a general statement, the expression would be considerably more complex (though it should always be bounded by this entropy figure).

As a footnote, recall that maximum entropy corresponds to pure randomness, and the lower the entropy, the more order and the less randomness a process or random variable (or distribution) has. So if you have extra information that makes something less random, the entropy will be lower.

This is the intuitive reason for the bound, and keeping it in mind is extremely useful when working with entropy identities, as well as when solving practical problems (e.g., engineering problems that deal with some maximal noise component).
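To make that bound concrete, here is a small sketch comparing the marginal (unconditional) entropy of a chain with its conditional entropy rate; the transition probabilities are made up for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up order-1 binary chain: P[s] is the distribution of the next
# symbol given that the current symbol is s.
P = {"a": {"a": 0.9, "b": 0.1},
     "b": {"a": 0.3, "b": 0.7}}

# Stationary distribution of a two-state chain: with p = P(b|a) and
# q = P(a|b), pi_a = q/(p+q) and pi_b = p/(p+q).
p, q = P["a"]["b"], P["b"]["a"]
pi = {"a": q / (p + q), "b": p / (p + q)}

marginal = entropy(pi.values())                        # ignores the Markov structure
rate = sum(pi[s] * entropy(P[s].values()) for s in P)  # uses the extra information
print(f"marginal = {marginal:.4f} bits, rate = {rate:.4f} bits")  # rate <= marginal
```

For these made-up numbers, knowing the previous symbol lowers the per-symbol entropy from about 0.81 bits to about 0.57 bits.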
Drao92
#3
Nov8-12, 11:25 AM
P: 67
Sorry for the late reply. Can you tell me if this is correct?
The transition matrix (row i gives the probabilities of the next symbol when the current symbol is i) is:
[0.2 0.8]
[1 0]
If the total entropy is H(0) + H(1) = e,
would the quantity of information generated after a zero be
e*0.2 + e*0.8?
Because e is the mean information generated per symbol, and 0.2 and 0.8 are the probabilities of generating a 0 or a 1 after a "zero".
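For concreteness, here is what the per-state entropies and the textbook entropy rate (the per-state entropies weighted by the stationary distribution) come out to for this matrix; this is just a Python sketch of the standard definitions, not necessarily the quantity I was computing above:

```python
import math

def entropy(row):
    """Shannon entropy in bits of one row of the transition matrix."""
    return sum(-p * math.log2(p) for p in row if 0 < p < 1)

# Transition matrix from above: row i is the distribution of the next
# symbol given that the current symbol is i.
P = [[0.2, 0.8],   # after a 0: P(0|0) = 0.2, P(1|0) = 0.8
     [1.0, 0.0]]   # after a 1: the next symbol is always 0

H0, H1 = entropy(P[0]), entropy(P[1])  # ~0.7219 bits and 0 bits

# Stationary distribution: solves pi1 = 0.8*pi0 with pi0 + pi1 = 1.
pi0 = 1 / 1.8
pi1 = 0.8 * pi0

# Textbook entropy rate: each state's entropy weighted by how often
# that state occurs, rather than an unweighted sum.
rate = pi0 * H0 + pi1 * H1
print(f"H(0) = {H0:.4f}, H(1) = {H1:.4f}, rate = {rate:.4f} bits/symbol")
```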

