# Homework Help: Reason if polynomials A, B and C exist s.t. they satisfy the following

1. Jun 1, 2014

### powerof

1. The problem statement, all variables and given/known data

Reason or prove whether there exist polynomials A, B and C such that the following is satisfied, where $y=e^{k\cdot \arcsin x}$:

$A\cdot y''+B\cdot y'+C\cdot y=0$

Note that this is high-school level calculus, so it shouldn't be anything too complicated. While I said "prove", "reason" is more accurate, so there's no need for a formal proof.

The exercise is in Spanish, section 3. a) in the attachment.

2. Relevant equations

Relevant equations would be the derivatives but it's pointless to write them all here.

3. The attempt at a solution

I differentiated y to find y' and y'', and then I substituted:

$\left\{\begin{matrix}y=e^{k\cdot \arcsin x} \\ y'=e^{k\cdot \arcsin x}\cdot \frac{k}{\sqrt{1+x^2}}=y\cdot \frac{k}{\sqrt{1+x^2}} \\ y''=e^{k\cdot \arcsin x}\cdot \frac{k^2}{1+x^2}+e^{k\cdot \arcsin x}\cdot \frac{-kx}{\sqrt{(1+x^2)^{3}}}=y\cdot (\frac{k^2}{1+x^2}-\frac{kx}{(1+x^2)\sqrt{1+x^2}})=y\cdot (\frac{k^2\sqrt{1+x^2}-kx}{(1+x^2)\sqrt{1+x^2}})\end{matrix}\right.$

$\left\{\begin{matrix}A\cdot y''+B\cdot y'+C\cdot y=0 \\ A\cdot [y\cdot (\frac{k^2\sqrt{1+x^2}-kx}{(1+x^2)\sqrt{1+x^2}})]+B\cdot [y\cdot \frac{k}{\sqrt{1+x^2}}]+C\cdot y=0 \\ A\cdot({k^2\sqrt{1+x^2}-kx})+B\cdot{k(1+x^2)}+C\cdot (1+x^2)\sqrt{1+x^2}=0\end{matrix}\right.$

Please note that I don't know if doing all the above was the best option. I tried to simplify as much as I could, but perhaps I needn't have done that (i.e., the solution lies elsewhere).

I don't know how to explain why there should exist polynomials such that they satisfy the condition mentioned previously, so please give me some pointers.

Thank you for reading. Have a nice day.

#### Attached Files:

• ###### Matemáticas 09 -10.pdf
Last edited: Jun 1, 2014

2. Jun 1, 2014

### ehild

The derivative of arcsin(x) is $\frac{1}{\sqrt{1-x^2}}$.
Collect the terms containing the square root and those without it. Choose the polynomials A, B, C so that both groups are zero separately.

ehild

Last edited: Jun 1, 2014
3. Jun 1, 2014

### powerof

Thanks, I'll give it a try.

4. Jun 1, 2014

### powerof

With my error corrected, I get:

$\left\{\begin{matrix}A\cdot({k^2\sqrt{1-x^2}+kx})+B\cdot{k(1-x^2)}+C\cdot (1-x^2)\sqrt{1-x^2}=0 \\ Akx+B\cdot{k(1-x^2)}=(-C\cdot (1-x^2)-A{k^2})\sqrt{1-x^2} \\ \sqrt{1-x^2}=\frac{Akx+B\cdot{k(1-x^2)}}{-C\cdot (1-x^2)-A{k^2}}=-\frac{k(Bx^2-Ax-B)}{Cx^2-A{k^2}-C} \end{matrix}\right.$

Is this what you meant? I don't know what to do next.
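[As a sanity check, the corrected substitution can be verified symbolically. The sketch below is not part of the original working; it assumes sympy is available and multiplies $A y'' + B y' + C y$ by $(1-x^2)^{3/2}/y$ to compare against the hand-derived first line above.]

```python
# Sketch: symbolically verify the corrected substitution (assumes sympy).
import sympy as sp

x, k, A, B, C = sp.symbols('x k A B C')
y = sp.exp(k * sp.asin(x))
y1 = sp.diff(y, x)        # equals y * k / sqrt(1 - x^2)
y2 = sp.diff(y, x, 2)

# Multiply A*y'' + B*y' + C*y by (1 - x^2)^(3/2) / y:
expr = sp.simplify((A * y2 + B * y1 + C * y) * (1 - x**2)**sp.Rational(3, 2) / y)

# The hand-derived form from the first line of the system above:
target = A * (k**2 * sp.sqrt(1 - x**2) + k * x) + B * k * (1 - x**2) \
         + C * (1 - x**2) * sp.sqrt(1 - x**2)
print(sp.simplify(expr - target))  # 0 if the derivation is right
```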

5. Jun 1, 2014

### haruspex

I assume a non-trivial solution is required.
If that is to be satisfied for all x by some polynomials A, B and C, what does that suggest about the factor $(-C\cdot (1-x^2)-A{k^2})$?

6. Jun 1, 2014

### ehild

Choose the polynomials so that $Akx+B\cdot{k(1-x^2)}=0$ and $-C\cdot (1-x^2)-A{k^2}=0$

ehild

7. Jun 1, 2014

### powerof

That it has to somehow cancel the square root for there to be integer powers on both sides?

8. Jun 1, 2014

### ehild

Your equation is satisfied if both sides are identically zero.

ehild

9. Jun 1, 2014

### powerof

As I understand it, these two expressions have to be zero: since A, B and C are polynomials, they involve only integer powers of x and cannot "cancel" the square root, so the only option is for both to be zero.

So we have two equations with three unknowns (A, B and C):

$\begin{bmatrix} kx & k(1-x^2) &0 \\ -k^2 & 0 & x^2-1 \end{bmatrix}\begin{bmatrix}A \\ B \\ C\end{bmatrix}=\begin{bmatrix}0 \\ 0 \end{bmatrix}$

There seem to be infinitely many solutions. I'll factor out a k and try to solve with Gaussian elimination:

$\begin{bmatrix} x & 1-x^2 &0 \\ -k & 0 & \frac{x^2-1}{k} \end{bmatrix}\begin{bmatrix}A \\ B \\ C\end{bmatrix}=\begin{bmatrix}0 \\ 0 \end{bmatrix}$

I should study the case k=0 separately (since then I can't factor it out), but I'll ignore it for now because this is getting quite long anyway (writing LaTeX is a bit tiring, but it's a mess if I don't).

$\begin{bmatrix} x & 1-x^2 &0 & 0\\ -k & 0 & \frac{x^2-1}{k} & 0 \end{bmatrix} \mapsto \begin{bmatrix} x & 1-x^2 &0 & 0\\ 0 & k(1-x^2) & \frac{x}{k}(x^2-1) & 0 \end{bmatrix} ^{R_{2}' \rightarrow x\cdot R_{2}+k\cdot R_{1}} \mapsto \begin{bmatrix} kx & 0 & \frac{x}{k}(1-x^2) & 0\\ 0 & k(1-x^2) & \frac{x}{k}(x^2-1) & 0 \end{bmatrix}^{R_{1}'\rightarrow k\cdot R_{1}-R_{2}}$

I have to go now, so I'll post to save what I've written so far. I can just "fix" C (treat it as a sort of parameter polynomial) and get A and B as functions of C. Since the system is solvable, this proves that such polynomials exist.

Have I understood this right?
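[For what it's worth, the underdetermined system above can also be handed to a CAS with C treated as the free polynomial. This is a sketch, not part of the original working; it assumes sympy is available.]

```python
# Sketch: solve the two conditions for A and B, with C left free (assumes sympy).
import sympy as sp

x, k, C = sp.symbols('x k C')
A, B = sp.symbols('A B')

eq1 = sp.Eq(k*x*A + k*(1 - x**2)*B, 0)   # terms without the square root
eq2 = sp.Eq(-k**2*A + (x**2 - 1)*C, 0)   # coefficient of sqrt(1 - x^2)
sol = sp.solve([eq1, eq2], [A, B], dict=True)[0]
print(sol)  # A and B come out as polynomial multiples of C/k^2
```

With C chosen as k², the solution reduces to A = x² − 1 and B = x, matching the thread's conclusion.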

10. Jun 1, 2014

### ehild

It looks all right in principle, but I do not follow what you did. It need not be so overcomplicated.
From the second equation, $-C\cdot (1-x^2)-A{k^2}=0 \rightarrow A(x)=-\frac{C(x)}{k^2}(1-x^2)$.
C(x) can be any polynomial; you get A(x) by multiplying it by $-(1-x^2)$ and dividing by the constant $k^2$. For C(x), you can choose the simplest: $C=k^2$. A constant is a polynomial of order 0...
You know A(x) and C(x), and the first equation determines B(x); you can divide that equation by k (assuming it is not zero): $A(x)\,x+B(x)(1-x^2)=0$
Substitute A(x) in terms of C. What do you get for B(x)?

ehild

Last edited: Jun 1, 2014
11. Jun 1, 2014

### powerof

Assuming x is not ±1 and taking $C=k^2$ (which gives $A=x^2-1$), we get $B=x$. Since finding one example proves what we wanted, this exercise is done.
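[For completeness, the resulting triple can be checked directly. A minimal sketch, assuming sympy is available; not part of the original post.]

```python
# Sketch: check that A = x^2 - 1, B = x, C = k^2 satisfy A*y'' + B*y' + C*y = 0.
import sympy as sp

x, k = sp.symbols('x k')
y = sp.exp(k * sp.asin(x))
lhs = (x**2 - 1) * sp.diff(y, x, 2) + x * sp.diff(y, x) + k**2 * y
print(sp.simplify(lhs))  # prints 0
```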

Regarding what I did previously: writing it in matrix form may not have been necessary, but I personally prefer it. R' is the new row, R₁ is the old row 1 and R₂ is the old row 2. The arrow should have been reversed, pointing from the "old" rows to the new one, to show the logical progression (the operations I did) and avoid confusion; writing it that way was a mistake on my part.

Thanks for your help and have a nice day.

12. Jun 1, 2014

### ehild

You are welcome.

ehild