Differentials, Taylor series, and function notation

  • Thread starter PFuser1232
  • #1
"Expanding the taylor series for ##f(x)##.." (See picture) is this a typo? Aren't we expanding ##f(x + \Delta x)##?
Also, when we evaluate ##f(x)## (the coefficients in the expansion), are we assuming ##\Delta x = 0## by setting ##x + \Delta x## (the argument of the function) equal to ##x##? Or are we just "replacing" ##x + \Delta x## by ##x##? I'm confused because usually, when a function is evaluated at a particular number, we just set the variable inside the argument of the function equal to that number; but when we're evaluating at "some other variable", for instance finding ##f(x^2 + 1)## given ##f(x) = x^5##, we're not implying that ##x^2 + 1 = x##.
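For concreteness, one way to spell out that substitution (an illustration using the same ##f##, not taken from the attached picture): with ##f(x) = x^5##, evaluating at ##x^2 + 1## just means replacing the argument wholesale, giving ##f(x^2 + 1) = (x^2 + 1)^5##, and likewise ##f(x + \Delta x) = (x + \Delta x)^5##; no equation between the new argument and ##x## is being asserted in the process.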
 

Answers and Replies

  • #2
OldEngr63
Gold Member
You are getting wrapped up in semantics. The whole thing is about expanding the function ##f## in a series form. Whether this is ##f## evaluated at ##x##, i.e. ##f(x)##, or ##f## evaluated at ##x + \Delta x##, i.e. ##f(x + \Delta x)##, is not anything to split hairs over. The way it was written in the image is the standard way to write this, and in fact, it is the correct way as well.

What it says is that the function, evaluated at ##x + \Delta x##, is equal to the function evaluated at ##x##, plus the additional terms shown. This is true.

Let us change notations just a bit. Let a and b denote specific numerical values fairly close together. Then the expansion says
##f(b) = f(a) + f'(a)(b - a) + \frac{1}{2} f''(a)(b - a)^2 + \dots##
Does that make sense to you?
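As a quick numerical sanity check of that truncated expansion, here is a sketch in Python (the choice ##f(x) = \sin x## and the points ##a = 1.0##, ##b = 1.1## are arbitrary picks for illustration, not from the thread):

```python
import math

def f(x):
    return math.sin(x)       # any smooth function works; sin is an arbitrary choice

def f1(x):
    return math.cos(x)       # first derivative of sin

def f2(x):
    return -math.sin(x)      # second derivative of sin

a, b = 1.0, 1.1              # two nearby points, matching the a and b above

exact = f(b)
taylor2 = f(a) + f1(a) * (b - a) + 0.5 * f2(a) * (b - a) ** 2

print(f"exact f(b):       {exact:.8f}")
print(f"2nd-order Taylor: {taylor2:.8f}")
print(f"difference:       {abs(exact - taylor2):.2e}")  # small because b - a is small
```

The difference shrinks rapidly as ##b## moves toward ##a##, which is why keeping only the first few terms is useful.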
 
  • #3
I think you may be a bit unclear on the Taylor series itself and how it is being used.

The Taylor series gives an approximation to a function ##f(x)## near a chosen point, say ##x = a##. It basically gives a way to model the function in a neighborhood of ##a## by saying it is the sum of ##f(a)##, plus ##f'(a)(x - a)##, plus ##\frac{f''(a)}{2!}(x - a)^2##, plus ##\frac{f'''(a)}{3!}(x - a)^3##, and so on.

Now, at ##x = a##, all of the terms after the first vanish, so ##f(x)## at ##x = a## is just ##f(a)##.

The benefit of a Taylor series is often to ask: if we know ##f(a)##, what is ##f(a + \delta)##, approximately, for small ##\delta##? To do this we evaluate the extra terms, each of which involves a power of ##\delta##, and add them to ##f(a)##.

It might be easiest to think of an example. Imagine ##f(x) = x^2##. We know ##f(2) = 4##. We want to know ##f(x)## a small ##\delta## above 2, say at 2.0001, so ##\delta = 0.0001##. The first two terms tell us to estimate ##f(2.0001)## as ##f(2) + 0.0001 \cdot f'(2)##. Now, ##f'(x) = 2x## for all ##x##, and it is the slope of ##f(x)## at ##x##. So, you see, we are taking ##f(2)## and adjusting it by ##\delta## times the slope right there at ##x = 2##, which gives ##4 + 0.0001 \cdot 4 = 4.0004##, already very close to the exact value ##2.0001^2 = 4.00040001##. More terms get more precise, but it is just adding more corrections.
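Here is the same calculation as a minimal Python sketch (the function and helper names are my own, chosen to mirror the example above):

```python
def f(x):
    return x ** 2                # the example function

def f_prime(x):
    return 2 * x                 # its derivative, i.e. the slope at x

a = 2.0                          # the point where f is known: f(2) = 4
delta = 0.0001                   # the small step above a

first_order = f(a) + delta * f_prime(a)              # f(a) + delta * f'(a)
second_order = first_order + 0.5 * (delta ** 2) * 2  # f''(x) = 2 for f(x) = x^2
exact = f(a + delta)

print(first_order)    # about 4.0004
print(second_order)   # about 4.00040001
print(exact)          # about 4.00040001 (up to floating-point rounding)
```

Because ##x^2## is a quadratic, the second-order correction already recovers the exact value; for a general function it would just be a better approximation.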
 
