How do we technically work with differentials?

  • Context: Undergrad
  • Thread starter: Brom
  • Tags: Differentials, Work
SUMMARY

This discussion centers on the conceptual understanding of differentials in calculus, specifically the notation "dx" and "dy." Participants explore the nature of differentials as infinitesimally small changes in variables and the common practice of manipulating these symbols as if they were fractions, despite the standard caveat that a derivative is not literally a ratio. The conversation highlights the confusion surrounding the treatment of derivatives and differentials, referencing Einstein's reported acknowledgment of the pitfalls of this approach. Ultimately, the discussion seeks clarity on the mathematical legitimacy of treating differentials as numbers in practical calculus work.

PREREQUISITES
  • Understanding of basic calculus concepts, including limits and derivatives.
  • Familiarity with differential notation, specifically "dx" and "dy."
  • Knowledge of integral calculus and its applications.
  • Awareness of historical perspectives on calculus, including Einstein's views.
NEXT STEPS
  • Research the formal definitions of differentials in calculus.
  • Explore the concept of infinitesimals in non-standard analysis.
  • Study the implications of treating derivatives as fractions in calculus.
  • Read about the historical context of calculus and notable mathematicians' perspectives.
USEFUL FOR

Students of calculus, educators teaching differential calculus, and mathematicians interested in the foundational concepts of differentials and their applications in mathematical analysis.

Brom
This is a somewhat random question: I'm asking it not because I've ever seen it as a homework problem or the like, but because I just can't seem to find a good explanation for it.

Long ago I first learned about taking limits and, eventually, calculus. And of course in working with calculus we first meet our beloved differentials of x and y, written "dx" and "dy" respectively. We learn that each is some infinitesimally small change in its variable, and that to find a derivative with respect to some variable, we compare an infinitesimally small change in the function against a comparably small change in our chosen variable.
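In symbols, that's just the familiar limit definition (writing ##\Delta x## for the small change), included here only to pin down what I mean:

$$\frac{dy}{dx} = f'(x) = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}.$$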

It was also mentioned to me somewhere along the way that "dy/dx isn't technically a fraction"... and yet time after time we "multiply both sides of the equation by dx or dt" to help solve a differential equation, set up an integral, and so on.
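For concreteness, here's the kind of manipulation I mean: the standard separation-of-variables step, written the way it's usually taught:

$$\frac{dy}{dx} = ky \;\Longrightarrow\; \frac{dy}{y} = k\,dx \;\Longrightarrow\; \int \frac{dy}{y} = \int k\,dx \;\Longrightarrow\; \ln|y| = kx + C,$$

so that ##y = Ae^{kx}##. The middle step moves dx across the equals sign as if dy/dx were an ordinary fraction, and yet it produces the right answer.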

So I suppose my question is... can we really treat these differentials as "just really tiny numbers" that can be multiplied, divided, etc. to our liking? If I'm remembering correctly, Einstein once confessed to treating derivatives like fractions and later realizing it was a mistake. But if it is a mistake, why do we time and time again "play around with them" as if a derivative were a fraction?

Perhaps I just got lost somewhere along the way in the definitions, or perhaps it's a fuzzy topic not usually covered thoroughly? It just irks me to no end!

Thanks in advance for any illumination or clarification anyone might have.

- Brom
 
Ah, thanks for the link. I might follow up by looking at the one book in there that was referenced.
 
