## Main Question or Discussion Point

In elementary calculus (and often in courses beyond) we are taught that the differential of a function, ##df##, quantifies an *infinitesimal* change in that function. However, the notion of an *infinitesimal* is not well-defined in standard analysis: one cannot define it in terms of a limit, and a number that is *smaller than any other positive real number* simply does not exist. Clearly the definition $$df=\lim_{\Delta x\rightarrow 0}\Delta f =f'(x)dx$$ makes no sense, since in the case where ##f(x)=x## we would have $$dx=\lim_{\Delta x\rightarrow 0}\Delta x =0.$$

All of this leaves me confused about how to interpret expressions such as $$df=f'(x)dx.$$ Should it be seen simply as a definition, quantifying the first-order (linear) change in a function about a point ##x##? That is, as a new function depending both on ##x## and on a finite change in ##x##, ##\Delta x##: $$df(x,\Delta x):=f'(x)\Delta x.$$ One can then interpret ##dx## as $$dx:=dx(x,\Delta x)= \Delta x,$$ such that $$\Delta f=f'(x)dx+\varepsilon =df +\varepsilon$$ (in which ##\varepsilon## quantifies the error between this linear change in ##f## and the actual change in ##f##, with ##\lim_{\Delta x\rightarrow 0}\varepsilon/\Delta x =0##).

I feel that there must be some sort of rigorous treatment of the notion of differentials, since manipulations of this kind are used all the time, at least in physics!
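As a quick numeric sanity check of this "first-order change" reading (my own sketch, not part of the original question), take ##f(x)=x^2## at ##x=1##. Then ##\Delta f = 2x\,\Delta x + (\Delta x)^2##, so the linear part is ##df = f'(x)\Delta x## and the error ##\varepsilon = (\Delta x)^2## vanishes faster than ##\Delta x##:

```python
# Illustrative example: f(x) = x^2 at x = 1. The actual change is
# Delta f = 2x*dx + dx^2, the linear part is df = f'(x)*dx, and the
# error eps = dx^2 satisfies eps/dx -> 0 as dx -> 0.

def f(x):
    return x**2

def fprime(x):
    return 2 * x

x = 1.0
for dx in (0.1, 0.01, 0.001):
    delta_f = f(x + dx) - f(x)      # actual change in f
    df = fprime(x) * dx             # linear (first-order) part
    eps = delta_f - df              # error term
    print(dx, eps, eps / dx)        # eps/dx shrinks like dx itself
```

The printed ratio ##\varepsilon/\Delta x## drops by a factor of ten at each step, which is exactly the statement that ##df## captures the change in ##f## to first order.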

I've had some exposure to differential geometry, in which one has differential forms, in particular ##1##-forms, which notationally *"look like"* differentials, for example $$\omega =df.$$ But as I understand it these are defined as *linear maps*: members of the dual space to some vector space ##V##, which act on elements of ##V## and map them to real numbers. Furthermore, the basis ##1##-forms are suggestively written as what in elementary calculus one would interpret as an *infinitesimal change in* ##x##, i.e. ##dx##. But again, this is simply symbolic notation, since the basis ##1##-forms simply span the dual space and are themselves linear maps which act on elements of ##V##.

I've heard people say that differential forms make the notion of the differential of a function mathematically rigorous; however, I can't seem to reconcile how this is the case, since at best they specify the *direction* in which the differential change in a function occurs, via $$df(v)=v(f)$$ (since ##v(f)## is the directional derivative of the function ##f## along the vector ##v##).

If someone could enlighten me on this subject I'd really appreciate it.
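To make ##df(v)=v(f)## concrete (again my own example, with hypothetical helper names): for ##f(x,y)=x^2 y## one has ##df = 2xy\,dx + x^2\,dy##, so applying ##df## at a point to a vector ##v=(a,b)## should reproduce the directional derivative of ##f## along ##v##:

```python
# Check that the 1-form df applied to a vector v equals the directional
# derivative v(f). Example: f(x, y) = x^2 * y, so df = 2xy dx + x^2 dy,
# and df_p(v) = 2*x*y*a + x^2*b for v = (a, b) at p = (x, y).

def f(x, y):
    return x**2 * y

def df_at(x, y, v):
    """The 1-form df of f(x,y) = x^2*y, evaluated at (x, y), applied to v."""
    a, b = v
    return 2 * x * y * a + x**2 * b

def directional_derivative(g, x, y, v, h=1e-6):
    """Central finite-difference estimate of v(g) at (x, y)."""
    a, b = v
    return (g(x + h * a, y + h * b) - g(x - h * a, y - h * b)) / (2 * h)

p, v = (1.0, 2.0), (3.0, 4.0)
print(df_at(*p, v))                       # 2*1*2*3 + 1*4 = 16.0
print(directional_derivative(f, *p, v))   # numerically close to 16.0
```

The two numbers agree (up to finite-difference error), which is the sense in which the ##1##-form ##df## packages *all* directional derivatives of ##f## at a point into a single linear map on tangent vectors.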