When can I apply the idea of differentials?

In summary: classically, differentials are small deviations of the variables, and their ratio gives the best linear approximation to how one variable changes with the other. In that sense a derivative can essentially always be regarded as a ratio of differentials, even though it is formally defined as a limit. In modern mathematics, a differential is a linear 1-form that maps tangent vectors to real numbers.
  • #1
LucasGB
My calculus book says sometimes derivatives can be regarded as the ratio of differentials, and sometimes they can't. Apparently, there's a similar rule for integrals. When can I think of derivatives and integrals as operations with differentials? And when can't I?
 
  • #2
LucasGB said:
My calculus book says sometimes derivatives can be regarded as the ratio of differentials, and sometimes they can't. Apparently, there's a similar rule for integrals. When can I think of derivatives and integrals as operations with differentials? And when can't I?
What exactly does your book say? What are its exact words? While, strictly speaking, derivatives are NOT ratios, I can't think of a case in which they cannot be regarded as ratios of differentials.
 
  • #3
LucasGB said:
My calculus book says sometimes derivatives can be regarded as the ratio of differentials, and sometimes they can't. Apparently, there's a similar rule for integrals. When can I think of derivatives and integrals as operations with differentials? And when can't I?

The slope of a tangent line can be thought of as the deviation in y, dy, divided by a given deviation in x, dx. That is a ratio.
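A minimal sketch of that picture in symbols (assuming, purely for illustration, a differentiable function y = f(x) and a base point x_0, neither of which is named in the post):

\[
y \approx f(x_0) + f'(x_0)\,(x - x_0), \qquad dy := f'(x_0)\,dx \quad\Rightarrow\quad \frac{dy}{dx} = f'(x_0).
\]

So the slope of the tangent line is literally the ratio of the two deviations dy and dx.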
 
  • #4
HallsofIvy said:
What exactly does your book say? What are its exact words? While, strictly speaking, derivatives are NOT ratios, I can't think of a case in which they cannot be regarded as ratios of differentials.

Actually, now that I think about it, aren't all derivatives the ratio of differentials? I say this based on the definition of differentials:

dx = (dx/dy)dy, where (dx/dy) is the derivative of x with respect to y. Therefore:

dx/dy = (dx/dy). What do you think?
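Spelled out a bit more (assuming, for illustration, that x is a differentiable function of y, say x = g(y), which the post leaves implicit):

\[
dx := g'(y)\,dy = \frac{dx}{dy}\,dy \quad\Rightarrow\quad \frac{dx}{dy} = g'(y),
\]

so dividing the differential dx by the differential dy does recover the derivative. The statement is consistent rather than circular, because dx is defined through the derivative in the first place.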

PS. 1: I think this whole differential business is one big mess, because every author seems to be afraid to address the subject in a clear way. The consequence is that students like me have to go to great lengths to understand whether differentials are, or are not, a legitimate mathematical concept that can be employed without fear.

PS. 2: Thank you all for your replies.
 
  • #5
LucasGB said:
Actually, now that I think about it, aren't all derivatives the ratio of differentials? I say this based on the definition of differentials:

dx = (dx/dy)dy, where (dx/dy) is the derivative of x with respect to y. Therefore:

dx/dy = (dx/dy). What do you think?

PS. 1: I think this whole differential business is one big mess, because every author seems to be afraid to address the subject in a clear way. The consequence is that students like me have to go to great lengths to understand whether differentials are, or are not, a legitimate mathematical concept that can be employed without fear.

PS. 2: Thank you all for your replies.
Originally, differentials were considered to be small deviations of the variables, and their ratio was an approximation to the derivative. The question was: given a very small deviation in one variable, what is the corresponding deviation in the other, provided these deviations are small, where "small" means small enough to give a good linear approximation, much like a regression.

So, for instance, if x^2 - y = 0, then at the point (1,1) we have 2dx - dy = 0, meaning the small deviation in y is approximately twice the small deviation in x, where "approximately" means that this gives the best linear relationship for small enough deviations. I always think of differentials as small deviations, and in physics that is how they are thought of.
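A quick numerical check of that linear approximation (an illustrative computation, not from the post above): take the curve y = x^2 and move from (1,1) by dx = 0.01. Then

\[
dy = 2x\,dx = 2(1)(0.01) = 0.02, \qquad \text{exact change: } (1.01)^2 - 1^2 = 0.0201,
\]

so the differential predicts the deviation in y to within 0.0001, and the agreement improves as dx shrinks.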

In modern mathematics, differentials are thought of as linear 1-forms that map tangent vectors to real numbers. This is a formalism that can obscure the fundamental intuition.
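In coordinates, a standard sketch of that definition reads as follows (here f is a smooth function and v a tangent vector, names chosen only for illustration):

\[
df = \frac{\partial f}{\partial x}\,dx + \frac{\partial f}{\partial y}\,dy, \qquad df(v) = v(f) \in \mathbb{R},
\]

so df takes a tangent vector v and returns the number giving the rate of change of f in the direction of v, which is the "small deviation" picture made precise.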
 

1. When is the concept of differentials applicable in science?

The concept of differentials is applicable in science when dealing with quantities that are continuously changing, such as in physics, chemistry, and engineering. It is also commonly used in mathematical modeling and data analysis.

2. How do differentials help in scientific research?

Differentials provide a way to measure and analyze small changes in a quantity, which can be crucial in understanding complex systems and making accurate predictions. They also allow for more precise calculations and help to simplify complicated mathematical equations.

3. Can differentials be used in real-world applications?

Yes, differentials are used in many real-world applications, such as in designing and optimizing machinery, predicting weather patterns, and studying population dynamics. They are also commonly used in fields like economics, biology, and medicine.

4. Are there any limitations to using differentials in science?

While differentials are a powerful tool in scientific research, they do have some limitations. They are only applicable to continuously changing quantities, and may not be accurate in cases where there are sudden, discontinuous changes. Additionally, the accuracy of the results depends on the accuracy of the initial measurements.

5. How can I learn more about applying differentials in science?

There are many resources available for learning about differentials and their applications in science. You can start by studying calculus, which is the branch of mathematics that deals with differentials and their properties. Additionally, there are many books, online courses, and tutorials that specifically focus on the application of differentials in various scientific fields.
