# Solve D.E. by power series

1. Oct 29, 2011

### burgerkin

I've just started learning to solve DEs using power series, and I'm not sure if I did it the right way. Would anybody be kind enough to go through my solution here and see if I did it right? I want to make sure I'm doing the right thing; I'm very bad with series. Thanks in advance!!

And I have another problem: xy' = y^2 + x^2. How do I solve it? I don't know how to deal with the y^2.

Last edited: Oct 29, 2011
2. Oct 30, 2011

### jackmell

That's Bessel's equation of index zero. In general, that's not the way to solve it and you're not taking the derivatives correctly either. If:

$$y(x)=\sum_{n=0}^{\infty}a_n x^n$$

then:

$$y'(x)=\sum_{n=1}^{\infty} n a_n x^{n-1}$$

Also, the equation is singular at x = 0, so the solution you get is valid for x > 0, and any IVP would look like y(x0) = y0, y'(x0) = y1 with x0 > 0.

However, usually for these types of problems we let:

$$y(x)=\sum_{n=0}^{\infty} a_n x^{n+c}$$

Ok, here's my suggestion, and you're not gonna like it: that problem is too tough for you right now. The solution includes log terms, and it's usually placed at the very end of the section on power series, in the group where the indicial equation has equal roots. Tell you what: start at the beginning of the section, work five power-series solutions for ordinary points, five for singular points where the indicial roots differ by a non-integer, two with equal roots, then finally come back and work on this one.
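(Editorial aside: the equal-roots case mentioned here can be seen directly. Assuming the equation really is Bessel's of order zero, $xy''+y'+xy=0$, as in the check further down, substitute $y=x^c$ into the left-hand side:

$$x\cdot c(c-1)x^{c-2}+c\,x^{c-1}+x\cdot x^{c}=c^2 x^{c-1}+x^{c+1}.$$

The lowest-order term vanishes only if $c^2=0$, so the indicial equation has the double root $c=0$; that repeated root is exactly what forces a log term into the second solution, which is why the $\partial/\partial c$ construction appears below.)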

Here's the solution for x > 0 that I came up with, which I believe is equivalent to the power-series representations of the Bessel functions J(x) and Y(x), which are its solutions:

$$y_1(x,c)=x^c+\sum _{n=1}^{\infty } \frac{(-1)^n x^{2n+c}}{\prod _{k=0}^{n-1} (2n-2k+c)^2}$$

$$y_2(x,c)=\frac{\partial}{\partial c} y_1(x,c)$$

Then the general solution is:

$$y(x)=c_1 y_1(x,0)+c_2 y_2(x,0)$$
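(Editorial aside: a quick sanity check, sketched in Python rather than Mathematica. At $c=0$ the product in the denominator is $(2n)^2(2n-2)^2\cdots 2^2=4^n(n!)^2$, so $y_1(x,0)$ is exactly the standard series for $J_0(x)$; a truncated evaluation at $x=1$ reproduces the known value $J_0(1)\approx 0.7651976866$.)

```python
def y1(x, c=0.0, nmax=30):
    """Truncated series x^c + sum_{n>=1} (-1)^n x^(2n+c) / prod_{k=0}^{n-1} (2n-2k+c)^2."""
    total = x ** c
    for n in range(1, nmax + 1):
        denom = 1.0
        for k in range(n):
            denom *= (2 * n - 2 * k + c) ** 2
        total += (-1) ** n * x ** (2 * n + c) / denom
    return total

print(y1(1.0))  # ~0.76519768..., the known value of J0(1)
```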

And here is the check in Mathematica for:

$$xy''+y'+xy=0,\quad y(1)=1,\quad y'(1)=0$$

Not doing this to show off (too much, anyway), but rather to show you what has to be done with problems like this: solve it analytically first, then code and solve it numerically, and compare the numerical solution to the approximate analytic solution. If they agree, there is a good chance the power-series expression you derived is correct.
Code (Text):

Clear[y, y1, ya, yb, yh]
(* initial conditions y(x0) = y0, y'(x0) = y1 *)
y0 = 1;
y1 = 0;
x0 = 1;
(* numerical solution of x y'' + y' + x y == 0 *)
mysol = NDSolve[{x*Derivative[2][y][x] + Derivative[1][y][x] + x*y[x] == 0,
    y[x0] == y0, Derivative[1][y][x0] == y1}, y, {x, x0, 5}];
p0 = Plot[y[x] /. mysol, {x, 1, 5}, PlotStyle -> Blue]
(* truncated Frobenius series yh[x, c] and its c-derivative at c = 0 *)
nmax = 30;
yh[x_, c_] := x^c + Sum[((-1)^n*x^(2*n + c))/Product[(2*n - 2*k + c)^2, {k, 0, n - 1}], {n, 1, nmax}]
ya[x_] := yh[x, 0];
yb[x_] = D[yh[x, c], c] /. c -> 0;
(* fit c1, c2 to the initial conditions, then overlay the two plots *)
they[x_] := c1*ya[x] + c2*yb[x]
thec = N[Solve[{they[x0] == y0, (D[they[x], x] /. x -> x0) == y1}, {c1, c2}]]
p1 = Plot[they[x] /. thec, {x, x0, 5}, PlotStyle -> Red]
Show[{p0, p1}]
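(Editorial aside: a lighter-weight version of the same check, sketched here in Python with my own helper names: plug the truncated $c=0$ series into $xy''+y'+xy$, differentiating term by term, and confirm the residual is essentially zero.)

```python
from math import factorial

# coefficients of the truncated J0 series: a_{2n} = (-1)^n / (4^n (n!)^2)
NMAX = 30
coeffs = {2 * n: (-1) ** n / (4 ** n * factorial(n) ** 2) for n in range(NMAX + 1)}

def eval_poly(c, x, d=0):
    """Evaluate the d-th derivative of sum_k c[k] x^k, term by term."""
    s = 0.0
    for k, a in c.items():
        if k >= d:
            f = 1.0
            for j in range(d):
                f *= k - j
            s += a * f * x ** (k - d)
    return s

def residual(x):
    """Left-hand side x y'' + y' + x y evaluated on the truncated series."""
    return x * eval_poly(coeffs, x, 2) + eval_poly(coeffs, x, 1) + x * eval_poly(coeffs, x, 0)

for x in (0.5, 1.0, 3.0):
    print(x, residual(x))  # all near machine zero
```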

Last edited: Oct 30, 2011
3. Oct 30, 2011

### burgerkin

Thank you so very much Jackmell! I will have to go back and do a good review on power series in order to grasp your input.

The way you represented y and y' is what I read in the textbook as well. However, my prof is teaching us to represent y in a different way. Below I am showing you an example he did in class; it is a simpler problem. I just kind of copied the way he showed us when doing my problem. He did not talk much about singular points and convergence.

4. Oct 30, 2011

### jackmell

Dang it! Sorry, I made a mistake. Your way of differentiating that series is correct. If:

$$y(x)=\sum_{n=0}^{\infty}a_n\frac{x^n}{n!}$$

then:

$$y'(x)=\sum_{n=0}^{\infty} a_{n+1} \frac{x^n}{n!}$$

$$y''(x)=\sum_{n=0}^{\infty} a_{n+2}\frac{x^n}{n!}$$
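(Editorial aside: this shift rule is easy to check numerically. Here is a throwaway Python sketch with an arbitrary coefficient list, purely for illustration: differentiating the truncated series $\sum_n a_n x^n/n!$ agrees with shifting the coefficients.)

```python
from math import factorial

def series_val(a, x):
    """Evaluate the truncated series sum_n a[n] x^n / n!."""
    return sum(a[n] * x ** n / factorial(n) for n in range(len(a)))

a = [1.0, -2.0, 3.0, 0.5, -1.0, 4.0]  # arbitrary coefficients a_0 .. a_5

# derivative via central difference vs. the shifted series sum_n a[n+1] x^n / n!
x, h = 0.7, 1e-6
numeric = (series_val(a, x + h) - series_val(a, x - h)) / (2 * h)
shifted = series_val(a[1:], x)
print(numeric, shifted)  # the two values agree closely
```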

I just didn't understand that notation, but I do now. I looked at your paper some. You're trying to solve an IVP at x = 0, but the DE is singular there. I don't think you're approaching it correctly, and I'm pretty sure the solution is as I've written it, or equivalently in the form of the power series for J(x) and Y(x), but only for x > 0.

Also, that second one you showed me is much, much simpler and one you should study and understand first before working on more complicated ones; then do four more like it, then go to the next sections in the book and do some of those, gradually building up to the harder ones. Bessel's equation would be found in the last part of the power-series section because it's more complicated to solve.

Last edited: Oct 30, 2011
5. Oct 30, 2011

### burgerkin

Thanks Jackmell! The example is simple, so I thought I understood it, but the first problem I showed you is in my homework, so I just have to try it. There are a couple more homework problems which seem even more complicated because there are sin x and cos x in them... hehe, lucky me!

Can you help me a little with this next problem:
xy' = y^2 + x^2
I need to solve it by power series as well, but how can I deal with the y^2? How do I square the expansion of a series? I rarely see this kind of problem with higher powers of y in it.

Last edited: Oct 30, 2011
6. Oct 30, 2011

### jackmell

Dang, dude. You've got some hard problems. But don't be afraid to multiply those power series together. For:

$$xy'=y^2+x^2$$

so it's:

$$x\sum_{n=1}^{\infty} n a_n x^{n-1}=\left(\sum_{j=0}^{\infty} a_j x^j\right)\left(\sum_{k=0}^{\infty} a_k x^k\right)+x^2$$

or use the expressions you were using for the sums (if applicable) and multiply them. Look up "Cauchy product" to get the right form of those double sums; of course, the recurrence relation for the a_n will then involve sums itself. Same deal for the sin(x) and cos(x).
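(Editorial aside: a Cauchy product is just a discrete convolution of the coefficient lists. Here is a minimal Python sketch, with my own helper name, plus the classic check that squaring the geometric series $\sum x^n$ gives coefficients $n+1$.)

```python
def cauchy_product(a, b):
    """c_m = sum_{k=0}^{m} a[k] * b[m-k]: coefficients of the product of two power series."""
    n = min(len(a), len(b))
    return [sum(a[k] * b[m - k] for k in range(m + 1)) for m in range(n)]

ones = [1] * 6  # 1/(1-x) = 1 + x + x^2 + ...
print(cauchy_product(ones, ones))  # [1, 2, 3, 4, 5, 6], i.e. 1/(1-x)^2 = sum (n+1) x^n
```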

Determining the region of convergence, however, is more complicated for double sums. I don't know how to do that.

Here's one where I used double series and had a sine in it but it's all in Mathematica code: