Apologies if this isn't quite the right forum to post this in; I was unsure whether it belonged here or in the calculus forum.
Something that has always bothered me since first learning calculus is how to interpret dx: what does it actually "mean"? I understand that it doesn't make sense to treat it as an infinitesimal change in x in a rigorous sense, since the idea of an infinitesimal cannot be made rigorous (at least in standard analysis), but can any part of this notion be retained?
I read in Spivak's book that we cannot consider quantities such as df in the classical infinitesimal sense, but that we can overcome this by promoting df to a function of infinitesimal changes along particular directions, i.e. a function of tangent vectors. In this sense the linear map df: T_{p}M\rightarrow\mathbb{R} encodes all information about the first-order change in f along any particular direction at p. Likewise, we can consider the coordinate differentials dx^{i}: T_{p}M\rightarrow\mathbb{R} as encoding the first-order changes in the coordinate functions x^{i} along particular directions. I have paraphrased what is written in the book and tried to reformulate it in a way that I can understand; is what I've put correct?
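To see how this plays out concretely, here is a small numerical sketch of the idea (the function f and the point p are my own illustrative choices, not from Spivak): df at p is the linear map v \mapsto \nabla f(p)\cdot v on tangent vectors, and the coordinate differentials dx^{i} simply pick out the components of v, so that df = \sum_i (\partial f/\partial x^{i}) dx^{i}.

```python
import numpy as np

# Illustrative example (my own choice): f(x, y) = x^2 * y on R^2.
# At a point p, the differential df_p is the linear map v -> grad f(p) . v,
# i.e. the best linear approximation to the change in f along direction v.

def f(p):
    x, y = p
    return x**2 * y

def grad_f(p):
    x, y = p
    return np.array([2 * x * y, x**2])

def df(p, v):
    """df_p(v): the differential of f at p applied to the tangent vector v."""
    return grad_f(p) @ v

def dx(i, v):
    """The coordinate differential dx^i just reads off the i-th component of v."""
    return v[i]

p = np.array([1.0, 2.0])
v = np.array([0.3, -0.1])

# Check that df_p(v) = sum_i (df/dx^i)(p) * dx^i(v), i.e. df = f_x dx + f_y dy:
lhs = df(p, v)
rhs = sum(grad_f(p)[i] * dx(i, v) for i in range(2))
print(lhs, rhs)  # both equal 4*0.3 + 1*(-0.1) = 1.1
```

The point of the sketch is only that df_p is an honest linear function of the direction v, with the dx^{i} as its building blocks; no infinitesimals appear anywhere.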
Also, is there a way to formulate the idea of a differential in elementary calculus (without resorting to non-standard analysis)? Is the following correct? One can consider the rate of change of a function f at a point x_{0}, namely f'(x_{0}), which is the slope of the tangent line to f at that point. From this we can construct a new function df that depends on the point x_{0} and on a change \Delta x in its argument, defined by

df(x_{0},\Delta x)=f'(x_{0})\Delta x.

Thus the (finite) change \Delta f in the function near the point x_{0} can be expressed as

\Delta f=f'(x_{0})\Delta x+\varepsilon=df+\varepsilon,

where \varepsilon is an error term satisfying \varepsilon/\Delta x\rightarrow 0 as \Delta x\rightarrow 0. Noting that dx(x_{0},\Delta x)=\Delta x, we then have

\Delta f=f'(x_{0})dx+\varepsilon=df+\varepsilon\quad\Rightarrow\quad df=f'(x_{0})dx,

where df=f'(x_{0})dx represents a (finite) change along the tangent line to f at the point x_{0}. I'm unsure how to proceed from here, though!