# Double root

r, q are constants. I need to factor this equation so that it has a double root.

$$-\frac{r}{q}u^3+ru^2-\left(\frac{r}{q}+1\right)u+r=0$$

Are there any tricks for this? It's just a nasty equation.

I don't know if this is a wise approach, but:

$$(au+b)(cu+d)^2 = ac^2u^3+(2acd+c^2b)u^2+(ad^2+2bcd)u+bd^2$$

Then

$$ac^2 = -\frac{r}{q}$$

$$2acd+c^2b=r, \qquad bd^2=r$$

$$ad^2+2bcd = -\left(\frac{r}{q}+1\right)$$


Curious3141
Homework Helper
Hmm... I haven't fully explored this approach, but here is something that might simplify your work: if a polynomial $p(x)$ has a repeated (double) root $\alpha$, then $\alpha$ is also a root of the derivative $p'(x)$. That reduces the problem to dealing with a quadratic.
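Spelled out for the cubic in this thread (standard calculus, not part of the original post): with

$$p(u) = -\frac{r}{q}u^3+ru^2-\left(\frac{r}{q}+1\right)u+r,\qquad p'(u)=-\frac{3r}{q}u^2+2ru-\left(\frac{r}{q}+1\right),$$

a double root $a$ must satisfy $p(a)=0$ and $p'(a)=0$. Both conditions are linear in $r$ and $r/q$, so they can be solved for the two parameters in terms of $a$.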

SammyS
Staff Emeritus
Homework Helper
Gold Member
Curious3141 has a very good idea. Before I read his post, I played around with this for a while.

What I came up with is the following:
Let $u = 1/v$. Substitute that for $u$, then multiply by $v^3/r$. That gives:
$\displaystyle v^3-\left(\frac{1}{q}+\frac{1}{r}\right)v^2+v-\frac{1}{q}=0$
Then notice that a cubic with leading coefficient 1 and a repeated root can be written as:
$(v-a)^2(v-b)\quad \to\quad v^3-(2a+b)v^2+(a^2+2ab)v-a^2b$
That's as far as I have taken it. You can try equating coefficients, and/or combining this with Curious3141's suggestion.
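Carrying out the coefficient matching (my own step, not in the thread): comparing $v^3-\left(\frac{1}{q}+\frac{1}{r}\right)v^2+v-\frac{1}{q}$ with $(v-a)^2(v-b)=v^3-(2a+b)v^2+(a^2+2ab)v-a^2b$ gives

$$2a+b=\frac{1}{q}+\frac{1}{r},\qquad a^2+2ab=1,\qquad a^2b=\frac{1}{q},$$

where $a$ is the repeated root in $v$ (so this $a$ is the reciprocal of the double root in $u$).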


Just to verify that I even took the right approach.

The question wanted us to show, using the conditions for a double root, that the curve in $r$-$q$ space is given parametrically by

$$r=\frac{2a^3}{(1+a^2)^2}, \ q=\frac{2a^3}{a^2-1}$$

and the given equations were

$$r\left(1-\frac{u}{q}\right), \ \frac{u}{1+u^2}$$

I set them equal to each other and then re-arranged the equation to set it equal to zero.

That was the right idea though, right?

Which is where I obtained:

$$-\frac{r}{q}u^3+ru^2-\left(\frac{r}{q}+1\right)u+r=0$$
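For completeness, the rearrangement (multiplying both sides by $1+u^2$ and collecting terms) is

$$r\left(1-\frac{u}{q}\right)(1+u^2)=u \quad\Longrightarrow\quad r-\frac{r}{q}u+ru^2-\frac{r}{q}u^3=u \quad\Longrightarrow\quad -\frac{r}{q}u^3+ru^2-\left(\frac{r}{q}+1\right)u+r=0.$$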

SammyS
Staff Emeritus
Homework Helper
Gold Member
What do you mean by:
"... and the given equations were

$\displaystyle r\left(1-\frac{u}{q}\right), \ \frac{u}{1+u^2}$​
Those are not equations: there are no equal signs.

Just call the first one $U$ and the second one $V$; the equation is then $U = V$.

It looks like you are trying to solve Exercise 1 from the book "Mathematical Biology" by Murray. I am also stuck on that same problem; essentially, they are trying to make the reader derive the parametric equations from

$$r\left(1-\frac{u}{q}\right)=\frac{u}{1+u^2}$$

to get the two parametric equations you gave, i.e. $r = \frac{2a^3}{(1+a^2)^2}$, etc.

Does anyone else know how to solve this?
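In case it helps anyone checking their algebra, here is a quick SymPy sketch of Curious3141's derivative trick applied to the cubic (the symbol names are my own choice, and this assumes SymPy is available). It imposes $p(a)=0$ and $p'(a)=0$, solves for $r$ and $q$, and confirms the parametric curve quoted above:

```python
import sympy as sp

# a = the double root; r, q = the model parameters
a, u, r, q = sp.symbols('a u r q')

# The cubic from the thread: -(r/q)u^3 + r u^2 - (r/q + 1)u + r = 0
p = -(r/q)*u**3 + r*u**2 - (r/q + 1)*u + r

# Double root at u = a  =>  p(a) = 0 and p'(a) = 0
conditions = [p.subs(u, a), sp.diff(p, u).subs(u, a)]

# Both conditions are linear in r and r/q, so they solve cleanly for r and q
sol = sp.solve(conditions, [r, q], dict=True)[0]

r_of_a = sp.simplify(sol[r])
q_of_a = sp.simplify(sol[q])

# Compare with the parametric curve quoted in the thread
assert sp.simplify(r_of_a - 2*a**3/(1 + a**2)**2) == 0
assert sp.simplify(q_of_a - 2*a**3/(a**2 - 1)) == 0
print(r_of_a, q_of_a)
```

The same check works on SammyS's reciprocal cubic in $v$; the parameter just comes out as $1/a$ instead of $a$.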