Series in general, whether Taylor's series or Fourier series, give us a way of representing complicated functions as sums of simpler functions, and then working with just the simpler individual terms of the series.
Here's a simple example of their use.
Solve y'= y with y(0)= 1.
Okay, we know that y(0)= 1 and, since y'= y, y'(0)= y(0)= 1. Differentiating both sides of the equation, y"= y', so y"(0)= y'(0)= 1. Differentiating again, y"'= y", so y"'(0)= y"(0)= 1. It should be easy to see that this leads to the nth derivative of y at x= 0 being equal to 1 for all n. That's only at x= 0, not at other x, but now we can say, from the Taylor's series, that y(x)= y(0)+ y'(0)x+ (1/2)y"(0)x^2+ (1/6)y"'(0)x^3+ ...= 1+ x+ (1/2)x^2+ (1/6)x^3+ ... and we now know y as a series expansion. If we happened to have found the Taylor's series for e^x earlier and recognized that this is the same thing, then we know that y(x)= e^x; but even if we don't recognize it, we have a solution nonetheless.
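If you want to see that series actually converging to something, here is a small Python sketch (my own check, not part of the argument above): it sums the first several terms of 1+ x+ (1/2)x^2+ ... and compares the result with e^x. The number of terms is just an example choice.

```python
import math

def taylor_solution(x, terms=15):
    """Partial sum of the series 1 + x + x^2/2! + ... found above."""
    return sum(x**n / math.factorial(n) for n in range(terms))

# Compare the partial sums against exp(x) at a few sample points.
for x in (0.0, 0.5, 1.0, 2.0):
    print(x, taylor_solution(x), math.exp(x))
```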
Another important application is to extend functions to other "number systems" like the complex numbers or even matrices.
I know how to add and multiply complex numbers and matrices but what would e^x be for x a complex number or a matrix?
Well, I know that the Maclaurin series for e^x is 1+ x+ (1/2)x^2+ (1/6)x^3+ ...+ (1/n!)x^n+ ... and that only involves multiplication and addition!
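Since the series really does need only multiplication and addition, we can take it as the definition of e^A for a square matrix A. Here is a rough Python sketch of that idea (my own illustration; the matrix A and the number of terms are just example choices), with scipy.linalg.expm used only as an independent cross-check:

```python
import numpy as np
from scipy.linalg import expm

def exp_series(A, terms=30):
    """Sum the Maclaurin series I + A + A^2/2! + ... term by term."""
    result = np.zeros_like(A, dtype=float)
    term = np.eye(A.shape[0])       # A^0 / 0! = I
    for n in range(terms):
        result += term
        term = term @ A / (n + 1)   # next term: A^(n+1) / (n+1)!
    return result

A = np.array([[0.0, 1.0],           # a hypothetical example matrix
              [-1.0, 0.0]])
print(exp_series(A))
print(expm(A))                      # should agree to many digits
```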
e^{bi}= 1+ bi+ (1/2)(bi)^2+ (1/6)(bi)^3+ ...+ (1/n!)(bi)^n+ ...
Now, it is not too hard to see that i^2= -1, i^3= -i, i^4= 1 and then it starts over again: the powers of i are: 1, i, -1, -i, 1, i, -1, -i, 1, ...
So
e^{bi}= 1+ bi- (1/2)b^2- (1/6)ib^3+ ...
Or, separating real and imaginary parts,
e^{bi}= 1- (1/2!)b^2+ (1/4!)b^4- (1/6!)b^6+ ...+ i(b- (1/3!)b^3+ (1/5!)b^5- (1/7!)b^7+ ...)
And, having studied series, we recognize those two series as being the power series for cosine and sine at x= 0. Thus:
e^{bi}= cos(b)+ i sin(b).
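As a quick sanity check (my own, assuming nothing beyond the series itself), here is a short Python sketch that sums the Maclaurin series at x= bi, using only complex multiplication and addition, and compares the result against cos(b)+ i sin(b):

```python
import cmath, math

def exp_series(z, terms=30):
    """Partial sum of the Maclaurin series for e^z at a complex z."""
    return sum(z**n / math.factorial(n) for n in range(terms))

b = 1.2                              # an arbitrary sample value
lhs = exp_series(b * 1j)             # the series summed at x = bi
rhs = complex(math.cos(b), math.sin(b))
print(lhs, rhs, cmath.exp(b * 1j))   # all three agree
```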
Looks pretty useful to me!