# DNA sequence modeled as a 4-faced die

1. Feb 2, 2015

### bowlbase

1. The problem statement, all variables and given/known data
I have a DNA sequence generated by L throws of a 4 faced die with probabilities $\pi_A, \pi_C, \pi_G, \pi_T$. Each probability is unknown. Task: estimate the probability of each side of the die. Hint: use a random variable defined by the sequence that has a binomial distribution then use the likelihood maximization.

2. Relevant equations

3. The attempt at a solution
So, as always with these problems, my first attempt at this is wildly incorrect despite making perfect sense to me. My naive approach would just be to count the number of each A, C, G, T within the sequence and divide by the sequence length to get an approximate probability for that letter at any position within the sequence. But this doesn't use either hint.
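That counting approach is a one-liner to sketch (a minimal illustration; the sequence below is made up):

```python
from collections import Counter

def letter_frequencies(seq):
    """Estimate each letter's probability as its count divided by the length."""
    counts = Counter(seq)
    L = len(seq)
    return {base: counts.get(base, 0) / L for base in "ACGT"}

seq = "ACGTACGGA"  # hypothetical 9-letter sequence: A and G appear 3 times, C twice, T once
print(letter_frequencies(seq))
```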

The other way I would do this is to still treat each letter as a separate random variable, giving 4 binomial distributions. These would give me the probability distribution of particular letters in the sequence. Then for each binomial distribution I would take the derivative, set it to 0, and solve for the probability.

For instance: $F(\pi_A)=\binom{L}{N_A}\pi_A^{N_A}(1-\pi_A)^{L-N_A}$
Taking the log, differentiating with respect to $\pi_A$, and setting to 0:
$0=\frac{N_A}{\pi_A}- \frac{L-N_A}{1-\pi_A}$

I'm about to be late for class, but at first glance this looks like I made a mistake or that my method is wrong here as well. But the idea is that I do this for all four letters.

Thanks for the help.

2. Feb 2, 2015

### Ray Vickson

I liked your first method better, but if you insist on using something like the second method, you should use the appropriate distribution. In this case you have 4 possible outcomes at each "toss", so a binomial would not be appropriate (unless you look at two outcomes only, such as "A" or "not-A", etc.) Much better, I think, would be to use the multinomial distribution (see, e.g., http://en.wikipedia.org/wiki/Multinomial_distribution or http://stattrek.com/probability-distributions/multinomial.aspx ). So, the probability of outcome counts $n_A, n_C, n_G, n_T$ for given total $N = n_A + n_C + n_G + n_T$ is
$$\text{probability} = f(p_A,p_C,p_G,p_T) \equiv \frac{N!}{n_A! n_C! n_G! n_T!} p_A^{n_A} p_C^{n_C} p_G^{n_G} p_T^{n_T}$$
A maximum-likelihood estimator of the $p_i$ would be given by solving the constrained optimization problem
$$\text{maximize}\;\; f(p_A,p_C,p_G,p_T),\\ \text{subject to} \;\; p_A + p_C + p_G + p_T = 1 \;\; \text{and}\;\; p_A, p_C, p_G, p_T \geq 0$$
This could be tackled by the Lagrange multiplier method (provided that we neglect the "$\geq 0$" constraints). I will let you worry about whether or not you get the same final solution as given by your first, simple, method.
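The Lagrange condition for this constrained problem gives $p_i = n_i/N$, which can be checked numerically (a sketch with made-up counts): random points on the simplex never beat it.

```python
import math
import random

def multinom_loglik(ps, counts):
    """Multinomial log-likelihood; the multinomial coefficient is a constant and is dropped."""
    return sum(n * math.log(p) for p, n in zip(ps, counts))

counts = [12, 7, 5, 6]         # hypothetical n_A, n_C, n_G, n_T
N = sum(counts)
mle = [n / N for n in counts]  # candidate maximizer from the Lagrange condition

random.seed(0)
for _ in range(1000):
    raw = [random.random() for _ in counts]
    ps = [r / sum(raw) for r in raw]  # a random point on the probability simplex
    assert multinom_loglik(ps, counts) <= multinom_loglik(mle, counts)
print(mle)
```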

3. Feb 2, 2015

### haruspex

Sounds like a good method to me. Isn't it a lot simpler than handling all four at once?

4. Feb 2, 2015

### Ray Vickson

Yes, provided that the resulting 4 probabilities add to 1; I will let the OP worry about whether or not that will happen.

5. Feb 2, 2015

### bowlbase

My idea was to handle each as either A or not-A, as you mentioned, since I was explicitly given the hint to use the binomial and likelihood maximization. I don't believe we ever discussed the multinomial distribution, but I can see how it works here.

I think that since the problem is really just looking for an estimation that if the sum is not exactly 1, that it will be okay.

6. Feb 2, 2015

### Ray Vickson

I suggest that before deciding this one way or another, you carry out the complete solution for your max. likelihood estimate $\pi_A$ in terms of $N_A$ and $L$. Then, of course, you would estimate $\pi_C, \pi_G, \pi_T$ using the same formula, but with $N_A$ replaced by $N_C, N_G, N_T$.
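(For reference, carrying the OP's first-order condition through to the estimate:)

$$0=\frac{N_A}{\pi_A}-\frac{L-N_A}{1-\pi_A} \;\Longrightarrow\; N_A(1-\pi_A)=(L-N_A)\,\pi_A \;\Longrightarrow\; N_A = L\,\pi_A \;\Longrightarrow\; \hat{\pi}_A=\frac{N_A}{L}$$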

7. Feb 3, 2015

### bowlbase

I finally found time to sit and finish this, sorry it took so long. So, I found the maximum with the method I described initially and got the exact same result as if I had just used the first "simple" method I thought of. It seems my first instinct was correct.

Thanks for the help.

8. Feb 3, 2015

### haruspex

Oh yes, that was always going to be the answer, but I thought the object of the exercise was to derive it from the maximum likelihood method.

9. Feb 3, 2015

### bowlbase

Well, the ML method described in class was to take the binomial distribution's derivative, set it to 0 and solve for the probability. So that is the method I used. Is this not correct?

10. Feb 3, 2015

### haruspex

Yes, it's correct. I was responding to this:

> It seems my first instinct was correct.

Your first instinct was always going to give the right answer. According to the hint, you were to use ML to get the answer, which you have done.

11. Feb 3, 2015

### bowlbase

Oh, okay. I was worried for a second that I had done something else wrong. Thanks!

12. Feb 4, 2015

### Ray Vickson

The MLE from the binomial (applied four times to the four different $p$s) gives the same probabilities as the MLE from the multinomial, applied once to all four $p$s simultaneously. That is a nice fact, because if they had given different results, that would have been a real source of worry.
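This equivalence, and the fact that the four binomial MLEs automatically sum to 1, is easy to check numerically (a sketch; the counts are made up): each per-letter grid maximizer coincides with the closed form $n_i/L$.

```python
import math

counts = {"A": 12, "C": 7, "G": 5, "T": 6}  # hypothetical letter counts
L = sum(counts.values())

def binom_mle_grid(n, L, steps=10000):
    """Maximize the binomial log-likelihood of n-out-of-L over a fine grid in (0, 1)."""
    grid = (i / steps for i in range(1, steps))
    return max(grid, key=lambda p: n * math.log(p) + (L - n) * math.log(1 - p))

for base, n in counts.items():
    # per-letter binomial MLE agrees with the multinomial closed form n / L
    assert abs(binom_mle_grid(n, L) - n / L) < 1e-3

# the four estimates add to exactly 1, since sum(n_i) = L
assert abs(sum(n / L for n in counts.values()) - 1) < 1e-12
print({base: n / L for base, n in counts.items()})
```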