Trivial question on differentiating logarithms

In summary: differentiating ##-\log B## does give a negative term ##-\frac{\delta B}{B}##; in error analysis, however, the error contributions are combined (worst case, or in quadrature) so that the sign drops out.
  • #1
JC2000
Homework Statement
Given ##Z = A/B##. Prove that ##\frac{\delta Z}{Z} = \frac{\delta A}{A} \pm \frac{\delta B}{B}##
Relevant Equations
##Z=\frac{A}{B}\implies \log(Z)=\log(A)-\log(B)## making
##\frac{\Delta Z}Z=\frac{\Delta A}A+\frac{\Delta B}B##
My Question :
Shouldn't differentiating ##-\log B## give ##\frac{-\delta B}{B}##?
(Note : A, B and Z are variables not constants)

By extension for ##Z=A^a \,B^b\, C^c## where ##c## is negative, should ##\frac{\Delta Z}Z=|a|\frac{\Delta A}A+|b|\frac{\Delta B}B-|c|\frac{\Delta C}C##?
 
  • #2
JC2000 said:
Shouldn't differentiating ##-\log B## give ##-\frac{\delta B}{B}##?
Correct. In analysis (which some call calculus): increase ##B## and ##A/B## decreases.

In error analysis however, we work with squares and the sign disappears.

So: what is your context? And why this particular notation: first ##\delta##, then ##\Delta##?
 
  • #3
Yes, I was trying to understand error analysis; my book states rules like these as 'rules for combination of errors' (addition, multiplication and exponents).

BvU said:
In error analysis however, we work with squares and the sign disappears.

Can you explain that further? Why work with squares?
Thanks for your time...
 
  • #4
JC2000 said:
My Question :
Shouldn't differentiating ##-\log B## give ##\frac{-\delta B}{B}##?
Not quite, if by "differentiating" you mean "Find the derivative of -log(B)". In this case, ##\frac d {dB}\left(-\log(B)\right) = -\frac 1 B##
If you're asking about the differential of -log(B) -- d(-log(B)) -- then that would be ##d(-\log(B)) = -\frac {dB} B##

I'm assuming that log here means natural log, or ln.
JC2000 said:
(Note : A, B and Z are variables not constants)

By extension for ##Z=A^a \,B^b\, C^c## where ##c## is negative, should ##\frac{\Delta Z}Z=|a|\frac{\Delta A}A+|b|\frac{\Delta B}B-|c|\frac{\Delta C}C##?
 
  • #5
Mark44 said:
Not quite, if by "differentiating" you mean "Find the derivative of -log(B)". In this case, ##\frac d {dB}\left(-\log(B)\right) = -\frac 1 B##
If you're asking about the differential of -log(B) -- d(-log(B)) -- then that would be ##d(-\log(B)) = -\frac {dB} B##

I'm assuming that log here means natural log, or ln.

Seems like I have not understood some fundamentals here! I thought finding the derivative of a function was the same as finding the differential?

Yes, as using a base-10 log would unnecessarily introduce constant factors, though they would cancel out in the overall equation(?).
 
  • #6
JC2000 said:
Seems like I have not understood some fundamentals here! I thought finding the derivative of a function was the same as finding the differential?
No, these are different processes. If y = f(x), then ##\frac {dy}{dx} = f'(x)##, but ##dy = f'(x)dx##.

For example, if ##y = x^2##, then ##\frac {dy}{dx} = 2x##, and ##dy = 2x~dx##.
Other notation for ##\frac {dy}{dx}## is y'.
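If it helps, here's a small sketch of the distinction in Python (using SymPy, which is my own choice here, not something from the thread):

```python
# Contrast the derivative of y = x**2 with its differential dy = 2x dx.
import sympy as sp

x, dx = sp.symbols('x dx')
y = x**2

dydx = sp.diff(y, x)      # the derivative: a function of x
dy = dydx * dx            # the differential: derivative times the increment dx

print(dydx)                        # 2*x
print(dy.subs({x: 3, dx: 0.01}))   # 0.0600..., the estimated change near x = 3
print(3.01**2 - 3**2)              # 0.0601, the exact change, for comparison
```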
 
  • #7
So there is a difference other than simply notation/manipulation of equations?
 
  • #8
JC2000 said:
Can you explain that further? Why work with squares?
It's the way random variables combine. Say you have two independent random variables, X and Y, and you calculate ##Z = X+Y##. This new variable, ##Z##, is also a random variable. It turns out it's the variances, ##\sigma^2##, and not the standard deviations, which add. In other words, you have ##\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2##, not ##\sigma_Z = \sigma_X + \sigma_Y## as you might assume.
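A quick way to see this numerically (a Monte Carlo sketch in Python with NumPy; the sigmas are made-up values):

```python
# Independent X and Y with sigma 3 and 4: the variance of Z = X + Y is
# 3**2 + 4**2 = 25, so sigma_Z is 5, not 3 + 4 = 7.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(0.0, 3.0, size=1_000_000)
Y = rng.normal(0.0, 4.0, size=1_000_000)
Z = X + Y

print(Z.var())   # ~25: variances add
print(Z.std())   # ~5:  standard deviations do not (3 + 4 = 7)
```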
 
  • #9
vela said:
It's the way random variables combine. Say you have two independent random variables, X and Y, and you calculate ##Z = X+Y##. This new variable, ##Z##, is also a random variable. It turns out it's the variances, ##\sigma^2##, and not the standard deviations, which add. In other words, you have ##\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2##, not ##\sigma_Z = \sigma_X + \sigma_Y## as you might assume.

I am at sea with this correlation between standard deviations and the derivatives of variables. Could you recommend a source from where I can learn more? My textbooks have not mentioned this. Thanks!
 
  • #10
JC2000 said:
Yes, I was trying to understand error analysis, my book stated the following as 'rules for combinations of errors'(addition, multiplication and exponents).
Can you explain that further? Why work with squares?
Check here: http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_Propagation_sp13.pdf (or read the whole thing if you are up to it). It has to do with this:

Most commonly, the uncertainty on a quantity is quantified in terms of the standard deviation, ##\sigma##, the positive square root of the variance, ##\sigma^2##.

Adding in quadrature is justified if errors are independent (uncorrelated) and have a gaussian distribution.
The keyword is independent: adding linearly -- as you would do with fully correlated errors -- is a kind of worst-case approach. Compare the expression for ##\Delta AB## with that for ##\Delta A^2##.
 
  • #11
BvU said:
Compare the expression for ##\Delta AB## with that for ##\Delta A^2##.

Do you mean the distinction between the result for ##Z = AB## and ##Z=A^2## (i.e. ##\frac{\delta Z}{Z}=\frac{\delta A}{A}+\frac{\delta B}{B}## as compared to ##\frac{\delta Z}{Z}=\frac{2\,\delta A}{A}##)?

Thanks for the links, I guess I have a lot of reading to do before it would make sense to ask further questions...
 
  • #12
JC2000 said:
Do you mean
Yes. The error in ##A## is 100% correlated with the error in ##A##.

Keep asking. That way we get a chance to freshen up some old brain dust.
 
  • #13
vela said:
It's the way random variables combine.
I'm not sure that the OP's question had to do with random variables. I could be wrong.
JC2000 said:
Given ##Z = A/B##. Prove that ##\frac{\delta Z}{Z} = \frac{\delta A}{A} \pm \frac{\delta B}{B}##
First, to get this character -- ##\Delta## -- uppercase delta, use this TEX: \Delta. This script -- \delta -- renders as lowercase delta, ##\delta##.

Second, there should not be a ##\pm## symbol in the formula you're supposed to prove. This formula should be ##\frac{\Delta Z}{Z} = \frac{\Delta A}{A} - \frac{\Delta B}{B}##

In terms of differentials, this would be ##\frac{dZ} Z = \frac {dA} A - \frac{dB} B##.

##\Delta A## and ##\Delta B## are typically signed quantities; if A increases, then ##\Delta A## will be positive, and if A decreases, then ##\Delta A## will be negative. Similar for B and ##\Delta B##.
JC2000 said:
So there is a difference other than simply notation/manipulation of equations?
Absolutely -- differentials and derivatives are two different things. A differential represents the change in function value produced by an infinitesimally small change in the independent variable. A derivative represents the slope of the tangent line at a particular point.
 
  • Like
Likes JC2000
  • #14
Mark44 said:
Second, there should not be a ##\pm## symbol in the formula you're supposed to prove. This formula should be ##\frac{\Delta Z}{Z} = \frac{\Delta A}{A} - \frac{\Delta B}{B}##
The context is error analysis (combination of errors), my book shows the result for ##Z=AB## to be ##\frac{\Delta Z}{Z}= \frac{\Delta A}{A}+\frac{\Delta B}{B}##.

I was unsure as to why the same result could be derived if ##Z=A/B## (since the log gives ##-\ln B##). Regarding that, it seems 'squares' are used in the case of error analysis and hence the sign remains the same (I am a little lost about this, but I have yet to thoroughly go through the resources other contributors provided that explain it).
Mark44 said:
In terms of differentials, this would be ##\frac{dZ} Z = \frac {dA} A - \frac{dB} B##.
Mark44 said:
Absolutely -- differentials and derivatives are two different things. A differential represents the change in function value produced by an infinitesimally small change in the independent variable. A derivative represents the slope of the tangent line at a particular point.

I see, hence the differential of ##y = -\log B## (say) would be ##d(-\log(B)) = \frac{-dB}{B}##, i.e. the infinitesimal change in the dependent variable ##y## on changing the independent variable ##B## by a very small value. And this is represented as:
##dy = f'(x)dx##

While finding the derivative of the function ##-\log B## gives ##\frac{-1}{B}##, i.e. ##\frac{dy}{dx} = f'(x)##. (?)

I think I understand the distinction between differential and derivative now. Thanks!
 
  • #15
JC2000 said:
I was unsure as to why the same result could be derived if ##Z=A/B## (since the log gives ##-\ln B##). Regarding that, it seems 'squares' are used in the case of error analysis and hence the sign remains the same (I am a little lost about this, but I have yet to thoroughly go through the resources other contributors provided that explain it).
You can get either one (for either Z = AB or Z = A/B) by using logarithmic differentiation. This technique involves taking the log of both sides, then taking the differential of both sides. It doesn't have anything to do with squares, but differentials are involved in error analysis.

Here's how log differentiation works for Z = AB:
Take the log of both sides (I'm using ##\ln## or ##\log_e##)
##\ln Z = \ln (AB) = \ln A + \ln B##
Take differentials: ##\frac 1 Z ~dZ = \frac 1 A~dA + \frac 1 B~dB##, or
##\frac {dZ}Z = \frac {dA}A + \frac{dB}B##

If ##\Delta Z, \Delta A,## and ##\Delta B## are reasonably small, we can write this equation with ##\Delta## quantities instead of dZ, dA, and dB.
Exactly the same technique can be used to show that if Z = A/B, then ##\frac {dZ}Z = \frac {dA}A - \frac{dB}B##.
There's no magic involved.
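For a numeric sanity check (my own sketch in Python, with arbitrary small changes):

```python
# For small signed changes dA and dB:
#   Z = A*B  gives dZ/Z ≈ dA/A + dB/B
#   Z = A/B  gives dZ/Z ≈ dA/A - dB/B
A, B = 100.0, 120.0
dA, dB = 0.5, -0.6   # dA/A = +0.005, dB/B = -0.005

Z = A * B
dZ = (A + dA) * (B + dB) - Z
print(dZ / Z, dA / A + dB / B)   # both ~0.000

Z = A / B
dZ = (A + dA) / (B + dB) - Z
print(dZ / Z, dA / A - dB / B)   # both ~0.010
```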
JC2000 said:
While finding the derivative of the function ##-\log B## gives ##\frac{-1}{B}##, i.e. ##\frac{dy}{dx} = f'(x)##. (?)
Yes, assuming you're talking about the natural log function, ##\ln##.
As a complete mathematical thought, ##\frac d {dB}\left(-\log B\right) = \frac {-1} B##.
 
  • #16
Precisely what I thought, but:
BvU said:
In error analysis however, we work with squares and the sign disappears.

And the fact that my book makes no mention of this (hence I am assuming that the formula with + applies to both cases...)
 
  • #17
JC2000 said:
And the fact that my book makes no mention of this (hence I am assuming that the formula with + applies to both cases...)
No. See my post #15. The signs are different for Z = AB and Z = A/B.
 
  • #18
Mark44 said:
No. See my post #15. The signs are different for Z = AB and Z = A/B.

But #8 and #2 seem to suggest otherwise (though I am clear about your line of reasoning).
 
  • #19
JC2000 said:
But #8 and #2 seem to suggest otherwise (though I am clear about your line of reasoning).
No, they don't. In post #2, BvU misunderstood what you were asking about.
BvU said:
In error analysis however, we work with squares and the sign disappears.
Your two problems have nothing to do with squares.

In post #8, vela also misunderstood, thinking you were asking about random variables, which from what I've seen in the thread so far, aren't related to what you are asking about.

I believe that the notation you used, with capital letters (which are associated with probability and random variables) and the lower-case Greek letter delta (##\delta##), confused some of the responders.

So...
If ##z = ab##, then ##\frac {dz}z = \frac {da}a + \frac {db}b##,
and if ##z = a/b##, then ##\frac {dz}z = \frac {da}a - \frac {db}b##.
Period.
 
  • #20
I guess for ##Z=A/B## the maximum relative error could be the expression with the plus sign?
Since in the problem (find the percentage error in ##x##, where ##x = \frac{a^3 b^3}{c\sqrt d}## and the percentage errors in a, b, c and d are 2%, 1%, 3% and 4%; the answer is ##\pm 14\%##), the correct solution is obtained by using the expression without the negative signs. I suppose I confused the two different results. Sorry for the confusion, and thank you for bearing with me.
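For what it's worth, checking that arithmetic in Python (the variable names are mine):

```python
# Max relative error for x = a**3 * b**3 / (c * sqrt(d)):
# each factor contributes |exponent| times its percentage error.
errors = {'a': 2.0, 'b': 1.0, 'c': 3.0, 'd': 4.0}   # percentage errors
exponents = {'a': 3.0, 'b': 3.0, 'c': 1.0, 'd': 0.5}

print(sum(exponents[v] * errors[v] for v in errors))   # 14.0, i.e. ±14%
```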
 
  • #21
JC2000 said:
I am at sea with this correlation between standard deviations and the derivatives of variables. Could you recommend a source from where I can learn more? My textbooks have not mentioned this. Thanks!
I found this online: http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_Propagation_sp13.pdf
 
  • #22
vela said:
I found this online: http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_Propagation_sp13.pdf

Yes, I am in the midst of going through that link (suggested in #10).

Also, is this the case?
Mark44 said:
In post #8, vela also misunderstood, thinking you were asking about random variables, which from what I've seen in the thread so far, aren't related to what you are asking about.
 
  • #23
JC2000 said:
Also, is this the case?
No, I was addressing the question about why BvU referred to squares of the uncertainty. The PDF explains we don't typically just add uncertainties due to the probabilistic nature of error.
 
  • #24
Take logs of both sides of your equation, then take the differentials. For the max. rel. error, they are adding the absolute values of the individual error terms.

For example, if ##z = \frac {a^2\sqrt b}{c^3}##, then
##\ln z = 2\ln a + \frac 1 2 \ln b - 3\ln c##
Taking differentials:
##\frac{dz}{z} = 2\frac{da}{a} + \frac 1 2 \frac{db}{b} - 3\frac{dc}{c}##

If the percentage errors of a, b, and c are 1%, 2%, and 3%, then the max. rel. error would be 2|.01| + .5|.02| + 3|.03| = ±.12 or ±12%.

I think I'm doing this right...
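Here's a direct numeric check of that ±12% (a sketch; the base values of a, b, c are arbitrary):

```python
# Perturb a, b, c in the worst-case directions for z = a**2 * sqrt(b) / c**3
# (a and b up by 1% and 2%, c down by 3%, since c sits in the denominator).
def z(a, b, c):
    return a**2 * b**0.5 / c**3

a, b, c = 10.0, 4.0, 2.0
z0 = z(a, b, c)
z_worst = z(a * 1.01, b * 1.02, c * 0.97)

print((z_worst - z0) / z0)   # ~0.129, close to the first-order estimate 0.12
```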
 
  • #25
Mark44 said:
Take logs of both sides of your equation, then take the differentials. For the max. rel. error, they are adding the absolute values of the individual error terms.

For example, if ##z = \frac {a^2\sqrt b}{c^3}##, then
##\ln z = 2\ln a + \frac 1 2 \ln b - 3\ln c##
Taking differentials:
##\frac{dz}{z} = 2\frac{da}{a} + \frac 1 2 \frac{db}{b} - 3\frac{dc}{c}##

If the percentage errors of a, b, and c are 1%, 2%, and 3%, then the max. rel. error would be 2|.01| + .5|.02| + 3|.03| = ±.12 or ±12%.

I think I'm doing this right...

Yes, but I guess they seem to be adding the last terms and not subtracting...
 
  • #26
BvU said:
In error analysis however, we work with squares and the sign disappears.
Mark44 said:
No, they don't. In post #2, BvU misunderstood what you were asking about.

So which is it now, I wonder...
 
  • #27
An off-topic digression was here. I should have read the initial topic carefully. Sorry.
 
  • #28
We have a few issues happily mixing into a very confusing thread:
  • differentiation
  • propagation of errors linearly
  • propagation of errors in quadrature
  • [edit] and now wrobel on logarithms ... oh boy
Mark is into differentiation and makes no mistakes (except perhaps slightly overlooking the error analysis context, as hinted at in #2 and confirmed in #3).

Your book adds errors linearly, while vela and I are in the quadrature camp -- so we slightly missed noticing the different approach in your book (as is obvious from #14).

My motto is 'try a simple example'.

To avoid sign issues I multiply ##A \pm \Delta A## and ##B \pm \Delta B##. Result: ##AB##.

Worst case down: ##(A - \Delta A)(B - \Delta B) = AB - B\Delta A - A\Delta B## to first order in ##\Delta## (i.e. assuming ##\Delta A\,\Delta B \ll## the other terms).

Worst case up: ##(A + \Delta A)(B + \Delta B) = AB + B\Delta A + A\Delta B##.

So worst case ##{\Delta (AB)\over AB} = {\Delta A\over A} + {\Delta B\over B}## -- as in your book, I hope. Check this in two ways (see the sketch after the list):
  1. multiply ##100 \pm 5## and ##120 \pm 12## (or something similar) in Excel or on a calculator
  2. on a piece of paper, draw a rectangle of 100 x 120 and the lines at 95, 105 and 108, 132
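In place of Excel, a minimal sketch in Python of item 1:

```python
# Worst-case bounds for (100 ± 5) * (120 ± 12), compared with the
# first-order rule Delta(AB)/(AB) = DeltaA/A + DeltaB/B.
A, dA = 100.0, 5.0
B, dB = 120.0, 12.0

Z = A * B                              # 12000
print(((A + dA) * (B + dB) - Z) / Z)   # 0.155 (includes the small dA*dB/(AB) term)
print((Z - (A - dA) * (B - dB)) / Z)   # 0.145
print(dA / A + dB / B)                 # 0.15, the first-order worst case
```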
Worst case is pessimistic: if the errors are independent, the probability that both are way off is smaller. The probability distributions have to be combined, and that is where the squares come in.

You can imagine the probability distribution around the 100 as vertical grayish shading, darkest at 100 and a bit lighter further away. Likewise horizontally at 120.
The probability distribution of the product then becomes a Gaussian hat around 100 x 120.

Hmm, googling some pictures might be useful here. No time now, work. Let's first synchronize at this point -- all clear and all agreed?
 

1. What is the basic concept behind differentiating logarithms?

The basic concept behind differentiating logarithms is to find the rate of change of a logarithmic function with respect to its input variable. This is done using the fact that the derivative of ln(x) is 1/x, together with the rules of differentiation such as the chain rule.

2. How do you differentiate a logarithmic function?

To differentiate a logarithmic function, use the fact that the derivative of ln(x) is 1/x together with the chain rule: the derivative of ln(u(x)) is u'(x)/u(x). It often helps to simplify first using logarithm properties, such as ln(ab) = ln(a) + ln(b).

3. Can you provide an example of differentiating a logarithmic function?

Sure, let's say we have the function f(x) = ln(2x). First, use the product property of logarithms to write f(x) = ln(2) + ln(x). Since ln(2) is a constant, its derivative is 0, so f'(x) = 0 + 1/x = 1/x. The chain rule gives the same answer directly: f'(x) = (1/(2x)) · 2 = 1/x.
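A one-line check of this example (assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
print(sp.diff(sp.ln(2*x), x))   # 1/x
```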

4. What is the importance of differentiating logarithms in science?

Differentiating logarithms is important in science because it allows us to analyze and understand the behavior of logarithmic functions, which are commonly used in many scientific fields. It also helps us to find the rate of change of these functions, which is crucial in many applications such as physics, chemistry, and biology.

5. Are there any special cases when differentiating logarithmic functions?

Yes, there are a few special cases when differentiating logarithmic functions. One example is when the argument of the logarithm is a constant, in which case the function is constant and its derivative is 0. Another point is that ln(x) is only defined for x > 0, so its derivative, 1/x, applies only there. Additionally, if the function contains a variable in an exponent, we use the logarithmic differentiation method to find its derivative.
