So this is a somewhat random question. I'm asking it not because I've ever seen it as any sort of homework problem, but because I just can't seem to find a good explanation for it.

Long ago I first learned about taking limits and, eventually, calculus. And of course in working with calculus we first meet our beloved differentials of x and y, "dx" and "dy" respectively. We learn that each is some infinitesimally small change in the x or y variable, and that to find a derivative with respect to some variable, we compare an infinitesimally small change in the function against a comparably small change in our chosen variable.
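Just to be concrete about what I mean, I have in mind the standard limit definition, where the derivative is the limit of a ratio of finite changes:

```latex
\frac{dy}{dx} \;=\; \lim_{\Delta x \to 0} \frac{\Delta y}{\Delta x}
\;=\; \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}
```

So before the limit is taken, Δy/Δx really is a fraction; my confusion is about what dy and dx are *after* the limit.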

It was also mentioned to me somewhere along the way that "dy/dx isn't technically a fraction"... and yet time after time we'll "multiply both sides of the equation by dx or dt" to help solve a differential equation, set up an integral, etc.
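The kind of manipulation I mean is the usual separation-of-variables move, where the dx and dy get split apart and moved across the equals sign as if they were ordinary quantities:

```latex
\frac{dy}{dx} = g(x)\,h(y)
\quad\Longrightarrow\quad
\frac{dy}{h(y)} = g(x)\,dx
\quad\Longrightarrow\quad
\int \frac{dy}{h(y)} = \int g(x)\,dx
```

This works every time in practice, which is exactly what puzzles me if dy/dx "isn't a fraction."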

So I suppose my question is... can we really treat these differentials as "just really tiny numbers" that can be multiplied, divided, etc. to our liking? If I'm remembering correctly, Einstein once confessed to treating derivatives like fractions and later realizing it was a mistake. But if it is a mistake, why do we time and time again "play around with them" as if a derivative were a fraction?

Perhaps I just got lost somewhere along the way in the definitions, or perhaps it's a fuzzy topic not usually covered thoroughly? It just irks me to no end!

Thanks in advance for any illumination or clarification anyone might have.

- Brom

**Physics Forums | Science Articles, Homework Help, Discussion**


# How do we technically work with differentials?
