Conditional Probability - Markov chain

Apteronotus
Hi,

I was reading about Markov chains and came across the following statement:

"The conditional distribution p(x_n|x_{n-1}) will be specified by a set of K-1 parameters for each of the K states of x_{n-1} giving a total of K(K-1) parameters."

In the above we have assumed that the observations are discrete variables having K states.

I understand that x_{n-1} can have K states, but why K-1 parameters for each state? And what are those parameters?

Thanks,
 
For each fixed state of x_{n-1}, the set of values p(x_n|x_{n-1}) consists of K probabilities, and each of these numbers is a parameter — the parameters are just the transition probabilities themselves. Since those K probabilities must sum to 1, you only have to specify K-1 of them; the last one is then determined as 1 minus the sum of the others.
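To make the counting concrete, here is a minimal NumPy sketch (the choice K = 4 is just for illustration): each row of the transition matrix is parameterized by K-1 free numbers, the last entry is recovered by normalization, and the total number of free parameters comes out to K(K-1).

```python
import numpy as np

K = 4  # number of discrete states (example choice)
rng = np.random.default_rng(42)

# K-1 free parameters per row: the probabilities that x_n takes
# values 0..K-2, given each of the K possible states of x_{n-1}.
free = rng.dirichlet(np.ones(K), size=K)[:, :K - 1]  # drop the last column

# The "last" probability in each row is fixed by normalization.
last = 1.0 - free.sum(axis=1, keepdims=True)
P = np.hstack([free, last])  # full K x K transition matrix

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution
print(free.size)  # K*(K-1) = 12 free parameters
```

Only the entries of `free` need to be specified; the column `last` carries no extra information, which is exactly why the count is K(K-1) rather than K².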
 
Brilliant!

Thank you.
 