## Markov Chains and absorption probabilities

A single-celled organism contains N particles, some of which are of type A, the others of
type B. The cell is said to be in state i, where 0 <= i <= N, if it contains exactly i particles
of type A. Daughter cells are formed by cell division, but first each particle replicates itself;
the daughter cell inherits N particles chosen at random from the 2i particles of type A
and the 2N - 2i particles of type B in the parent cell.

Find the absorption probabilities and expected times to absorption for the case N = 3.

So far I have that the absorbing states are i = 0 and i = 3, but I have no idea where to go from there.

Mentor

For N=3, you can calculate the transition matrix manually. Many entries are 0, and some others follow from symmetry, so you just need 2 interesting entries.
How do I calculate the entries, though? That's where I'm stuck at the moment. I know the rows for starting in states 0 and 3, but have no clue about 1 or 2. Once I know that, the rest of the question becomes fairly trivial. Could you push me in the right direction?

Mentor


i=1 leads to AABBBB in the cell before splitting. If you randomly pick 3 of them, what is the probability of getting 0 (or 1, 2, 3) copies of A?
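The counting the Mentor describes here is sampling without replacement, so the number of A's drawn follows a hypergeometric distribution. A minimal sketch enumerating those probabilities for AABBBB (2 A's and 4 B's, draw 3):

```python
from math import comb

# State i = 1 with N = 3: after replication the parent holds 2 A's and 4 B's.
# Drawing 3 of the 6 particles without replacement is hypergeometric:
#   P(j A's) = C(2, j) * C(4, 3 - j) / C(6, 3)
probs = [comb(2, j) * comb(4, 3 - j) / comb(6, 3) for j in range(4)]
print(probs)  # [0.2, 0.6, 0.2, 0.0]
```

Note that going from state 1 straight to state 3 is impossible (there are only 2 A's to draw from), which is one of the zero entries the Mentor mentions.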

Oh, is that standard binomial? So the probability of going from state 1 to 0 would be (2/3)^3, which is 8/27? Then do the same for the other states? Or am I missing something?
I really don't understand the probabilities of getting to the other states. Do I not also need to consider what the other cell will contain, or is that irrelevant?
I think I finally get it: the probability of 0 A's is equal to (2/3)*(3/5)*(1/2), which is the probability of selecting a B each time. Then follow the same method for 1 A, taking into account whether you chose the A first, second or third? I hope that's right.
Mentor

That is correct.
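The sequential-pick product (4/6 of the particles are B, then 3/5, then 2/4, which simplifies to the 2/3 · 3/5 · 1/2 above) can be checked against the direct counting argument; both give 1/5, not the 8/27 from the earlier binomial guess:

```python
from fractions import Fraction
from math import comb

# Pick a B three times in a row from AABBBB (4 B's among 6 particles).
sequential = Fraction(4, 6) * Fraction(3, 5) * Fraction(2, 4)
# Count directly: choose 3 of the 4 B's, out of all C(6, 3) equally likely draws.
counting = Fraction(comb(4, 3), comb(6, 3))
print(sequential, counting)  # 1/5 1/5
```

The binomial answer would be right only if each of the 3 particles were drawn with replacement; here the draws deplete the pool, hence the hypergeometric counting.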
Thanks for the help.
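Putting the thread's pieces together, here is a sketch of the full N = 3 solution using the standard fundamental-matrix method (states 1 and 2 are transient, 0 and 3 absorbing; the transition probabilities are the hypergeometric counts discussed above):

```python
import numpy as np
from math import comb

N = 3
# From state i the parent holds 2i A's among 2N particles; the daughter
# draws N of them without replacement (hypergeometric).
P = np.array([[comb(2 * i, j) * comb(2 * N - 2 * i, N - j) / comb(2 * N, N)
               for j in range(N + 1)] for i in range(N + 1)])

Q = P[1:3, 1:3]                    # transient -> transient (states 1, 2)
R = P[1:3, [0, 3]]                 # transient -> absorbing (states 0, 3)
F = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix (I - Q)^(-1)

absorb = F @ R                     # absorption probabilities
times = F @ np.ones(2)             # expected number of steps to absorption
print(absorb)
print(times)
```

This gives absorption at state 0 with probability 2/3 from state 1 (and 1/3 from state 2, by the symmetry the Mentor mentioned), and an expected time to absorption of 5 generations from either transient state.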
