
Trivial question on differentiating logarithms

JC2000
Problem Statement
Given ##Z = A/B##. Prove that ##\frac{\delta Z}{Z} = \frac{\delta A}{A} \pm \frac{\delta B}{B}##
Relevant Equations
##Z=\frac{A}{B}\implies \log(Z)=\log(A)-\log(B)## making
##\frac{\Delta Z}Z=\frac{\Delta A}A+\frac{\Delta B}B##
My question:
Shouldn't differentiating ##-\log B## give ##\frac{-\delta B}{B}##?
(Note: A, B and Z are variables, not constants.)

By extension for ##Z=A^a \,B^b\, C^c## where ##c## is negative, should ##\frac{\Delta Z}Z=|a|\frac{\Delta A}A+|b|\frac{\Delta B}B-|c|\frac{\Delta C}C##?
 

BvU

Science Advisor
Homework Helper
Shouldn't differentiating ##-\log B\ \ ## give ##\ \ -\frac{\delta B}{B}\ ##?
Correct. In analysis (which some call calculus): increase ##B## and ##A/B## decreases.

In error analysis however, we work with squares and the sign disappears.

So: what is your context? And why this particular notation: first ##\delta##, then ##\Delta##?
 
JC2000
Yes, I was trying to understand error analysis; my book stated the following as 'rules for combinations of errors' (addition, multiplication, and exponents).

In error analysis however, we work with squares and the sign disappears.
Can you explain that further? Why work with squares?
Thanks for your time...
 
Mark44
My question:
Shouldn't differentiating ##-\log B## give ##\frac{-\delta B}{B}##?
Not quite, if by "differentiating" you mean "Find the derivative of -log(B)". In this case, ##\frac d {dB}\left(-\log(B)\right) = -\frac 1 B##
If you're asking about the differential of -log(B) -- d(-log(B)) -- then that would be ##d(-\log(B)) = -\frac {dB} B##

I'm assuming that log here means the natural log, ln.
JC2000 said:
(Note : A, B and Z are variables not constants)

By extension for ##Z=A^a \,B^b\, C^c## where ##c## is negative, should ##\frac{\Delta Z}Z=|a|\frac{\Delta A}A+|b|\frac{\Delta B}B-|c|\frac{\Delta C}C##?
 
JC2000
Not quite, if by "differentiating" you mean "Find the derivative of -log(B)". In this case, ##\frac d {dB}\left(-\log(B)\right) = -\frac 1 B##
If you're asking about the differential of -log(B) -- d(-log(B)) -- then that would be ##d(-\log(B)) = -\frac {dB} B##

I'm assuming that log here means the natural log, ln.
Seems like I have not understood some fundamentals here! I thought taking the derivative of a function was the same as finding the differential?

Yes, since using log base 10 would unnecessarily introduce constants, though they would cancel out in the overall equation(?).
 
Mark44
Seems like I have not understood some fundamentals here! I thought taking the derivative of a function was the same as finding the differential?
No, these are different processes. If y = f(x), then ##\frac {dy}{dx} = f'(x)##, but ##dy = f'(x)dx##.

For example, if ##y = x^2##, then ##\frac {dy}{dx} = 2x##, and ##dy = 2x~dx##.
Other notation for ##\frac {dy}{dx}## is y'.
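A quick numeric sketch of the distinction (my own example, not from the thread): for ##y = x^2##, the derivative is the slope at a point, while the differential ##dy = f'(x)\,dx## approximates the actual change in ##y## for a small change in ##x##.

```python
def f(x):
    return x ** 2

def f_prime(x):
    # derivative: slope of the tangent line at x
    return 2 * x

x, dx = 3.0, 0.001
dy = f_prime(x) * dx           # differential: dy = f'(x) dx
actual = f(x + dx) - f(x)      # actual change in y

print(f_prime(x))  # 6.0 -- slope at x = 3
print(dy)          # 0.006
print(actual)      # ~0.006001 -- the differential is a close approximation
```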
 
JC2000
So there is a difference other than simply notation/manipulation of equations?
 

vela

Staff Emeritus
Science Advisor
Homework Helper
Education Advisor
Can you explain that further? Why work with squares?
It's the way random variables combine. Say you have two independent random variables, X and Y, and you calculate ##Z = X+Y##. This new variable, ##Z##, is also a random variable. It turns out it's the variances, ##\sigma^2##, and not the standard deviations, that add. In other words, you have ##\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2##, not ##\sigma_Z = \sigma_X + \sigma_Y## as you might assume.
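A small simulation of this (my own sketch, not from the thread): sample two independent gaussians and check that the standard deviation of the sum is ##\sqrt{\sigma_X^2 + \sigma_Y^2}##, not ##\sigma_X + \sigma_Y##.

```python
import random

random.seed(0)
N = 100_000

# two independent gaussian random variables
X = [random.gauss(0, 3) for _ in range(N)]  # sigma_X = 3
Y = [random.gauss(0, 4) for _ in range(N)]  # sigma_Y = 4
Z = [x + y for x, y in zip(X, Y)]           # Z = X + Y

def std(values):
    # population standard deviation
    m = sum(values) / len(values)
    return (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5

print(std(Z))                 # ~5.0, i.e. sqrt(3^2 + 4^2) -- not 3 + 4 = 7
print((3**2 + 4**2) ** 0.5)   # 5.0
```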
 
JC2000
It's the way random variables combine. Say you have two independent random variables, X and Y, and you calculate ##Z = X+Y##. This new variable, ##Z##, is also a random variable. It turns out it's the variances, ##\sigma^2##, and not the standard deviations, that add. In other words, you have ##\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2##, not ##\sigma_Z = \sigma_X + \sigma_Y## as you might assume.
I am at sea with this correlation between standard deviations and the derivatives of variables. Could you recommend a source from where I can learn more? My textbooks have not mentioned this. Thanks!
 

BvU

Yes, I was trying to understand error analysis, my book stated the following as 'rules for combinations of errors'(addition, multiplication and exponents).
Can you explain that further? Why work with squares?
Check here (or read the whole thing if you are up to it). Has to do with
Most commonly, the uncertainty on a quantity is quantified in terms of the standard deviation, σ, the positive square root of variance, σ2
Adding errors in quadrature is justified if errors are independent (uncorrelated) and have a gaussian distribution.
The keyword is independent: adding linearly -- as you would do with fully correlated errors -- is a kind of worst-case approach. Compare the expression for ##\Delta AB## with that for ##\Delta A^2##.
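A small sketch of that contrast (my own, not from the thread): for independent errors, adding relative errors in quadrature gives a smaller total than the worst-case linear sum.

```python
# relative errors on A and B, e.g. for Z = A*B or Z = A/B
rel_errs = [0.02, 0.03]

# worst case: errors fully correlated, add linearly
linear = sum(rel_errs)

# independent gaussian errors: add in quadrature
quadrature = sum(e ** 2 for e in rel_errs) ** 0.5

print(linear)      # 0.05
print(quadrature)  # ~0.036 -- smaller than the worst case
```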
 
JC2000
Compare the expression for ##\Delta AB## with that for ##\Delta A^2##.
Do you mean the distinction between the result for ##Z = AB## and ##Z=A^2## (i.e. ##\frac{\delta Z}{Z}=\frac{\delta A}{A}+\frac{\delta B}{B}## as compared to ##\frac{\delta Z}{Z}=\frac{2\delta A}{A}##)?

Thanks for the links, I guess I have a lot of reading to do before it would make sense to ask further questions...
 

BvU

Do you mean
Yes. The error in ##A## is 100% correlated with the error in ##A##.

Keep asking. That way we get a chance to freshen up some old brain dust :biggrin: .
 
Mark44
It's the way random variables combine.
I'm not sure that the OP's question had to do with random variables. I could be wrong.
JC2000 said:
Given ##Z = A/B##. Prove that ##\frac{\delta Z}{Z} = \frac{\delta A}{A} \pm \frac{\delta B}{B}##
First, to get this character -- ##\Delta## -- uppercase delta, use this TEX: \Delta. This script -- \delta -- renders as lowercase delta, ##\delta##.

Second, there should not be a ##\pm## symbol in the formula you're supposed to prove. This formula should be ##\frac{\Delta Z}{Z} = \frac{\Delta A}{A} - \frac{\Delta B}{B}##

In terms of differentials, this would be ##\frac{dZ} Z = \frac {dA} A - \frac{dB} B##.

##\Delta A## and ##\Delta B## are typically signed quantities; if A increases, then ##\Delta A## will be positive, and if A decreases, then ##\Delta A## will be negative. Similar for B and ##\Delta B##.


So there is a difference other than simply notation/ manipulation of equations?
Absolutely -- differentials and derivatives are two different things. A differential represents the change in function value produced by an infinitesimally small change in the independent variable. A derivative represents the slope of the tangent line at a particular point.
 
JC2000
Second, there should not be a ##\pm## symbol in the formula you're supposed to prove. This formula should be ##\frac{\Delta Z}{Z} = \frac{\Delta A}{A} - \frac{\Delta B}{B}##
The context is error analysis (combination of errors), my book shows the result for ##Z=AB## to be ##\frac{\Delta Z}{Z}= \frac{\Delta A}{A}+\frac{\Delta B}{B}##.

I was unsure as to why the same result could be derived if ##Z=A/B## (since the ##\ln B## term enters with a minus sign). It seems 'squares' are used in error analysis, hence the sign remains the same (I am a little lost about this, but I have yet to thoroughly go through the resources provided by other contributors that explain this).
Mark44 said:
In terms of differentials, this would be ##\frac{dZ} Z = \frac {dA} A - \frac{dB} B##.
Absolutely -- differentials and derivatives are two different things. A differential represents the change in function value produced by an infinitesimally small change in the independent variable. A derivative represents the slope of the tangent line at a particular point.
I see; hence the differential of ##y = -\log B## (say) would be ##d(-\log(B)) = \frac{-dB}{B}##, i.e. the infinitesimal change in the dependent variable ##y## on changing the independent variable ##B## by a very small value. And this is represented as:
##dy = f'(x)dx##

While finding the derivative of the function ##-\log B## gives ##\frac{-1}{B}##, i.e. ##\frac{dy}{dx} = f'(x)##. (?)

I think I understand the distinction between differential and derivative now. Thanks!
 
Mark44
I was unsure as to why the same result could be derived if ##Z=A/B## (since the ##\ln B## term enters with a minus sign). It seems 'squares' are used in error analysis, hence the sign remains the same (I am a little lost about this, but I have yet to thoroughly go through the resources provided by other contributors that explain this).
You can get either one (for either Z = AB or Z = A/B) by using logarithmic differentiation. This technique involves taking the log of both sides, then taking the differential of both sides. It doesn't have anything to do with squares, but differentials are involved in error analysis.

Here's how log differentiation works for Z = AB:
Take the log of both sides (I'm using ##\ln## or ##\log_e##)
##\ln Z = \ln (AB) = \ln A + \ln B##
Take differentials: ##\frac 1 Z ~dZ = \frac 1 A~dA + \frac 1 B~dB##, or
##\frac {dZ}Z = \frac {dA}A + \frac{dB}B##

If ##\Delta Z, \Delta A,## and ##\Delta B## are reasonably small, we can write this equation with ##\Delta## quantities instead of dZ, dA, and dB.
Exactly the same technique can be used to show that if Z = A/B, then ##\frac {dZ}Z = \frac {dA}A - \frac{dB}B##.
There's no magic involved.
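A quick numeric check of those two formulas (my own sketch, not from the thread): perturb A and B slightly and compare the actual relative change in Z against the linearized estimate.

```python
A, B = 5.0, 2.0
dA, dB = 0.001, 0.002

# Z = A*B: the relative changes add
Z1 = A * B
dZ1 = (A + dA) * (B + dB) - Z1
print(dZ1 / Z1, dA / A + dB / B)   # both ~0.0012

# Z = A/B: the relative change of B enters with a minus sign
Z2 = A / B
dZ2 = (A + dA) / (B + dB) - Z2
print(dZ2 / Z2, dA / A - dB / B)   # both ~-0.0008
```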
JC2000 said:
While finding the derivative of the function ##-\log B## gives ##\frac{-1}{B}##, i.e. ##\frac{dy}{dx} = f'(x)##. (?)
Yes, assuming you're talking about the natural log function, ##\ln##.
As a complete mathematical thought, ##\frac d {dB}\left(-\log B\right) = \frac {-1} B##.
 
JC2000
Precisely what I thought, but :
In error analysis however, we work with squares and the sign disappears.
And the fact that my book makes no mention of this (hence I am assuming that the formula with + applies to both cases...)
 
Mark44
And the fact that my book makes no mention of this (hence I am assuming that the formula with + applies to both cases...)
No. See my post #15. The signs are different for Z = AB and Z = A/B.
 
JC2000
No. See my post #15. The signs are different for Z = AB and Z = A/B.
But #8 and #2 seem to suggest otherwise (though I am clear about your line of reasoning).
 
Mark44
But #8 and #2 seem to suggest otherwise (though I am clear about your line of reasoning).
No, they don't. In post #2, BvU misunderstood what you were asking about.
In error analysis however, we work with squares and the sign disappears.
Your two problems have nothing to do with squares.

In post #8, vela also misunderstood, thinking you were asking about random variables, which from what I've seen in the thread so far, aren't related to what you are asking about.

I believe that the notation you used, with capital letters (which are associated with probability and random variables) and lower-case Greek letter delta (##\delta##) confused some of the responders.

So...
If z = ab, then ##\frac {dz}z = \frac {da}a + \frac {db}b##,
and if z = a/b, then ##\frac {dz}z = \frac {da}a - \frac {db}b##.
Period.
 
JC2000
I guess for ##Z=A/B## the maximum relative error could be the expression with the plus sign?
Since in the problem (find the percentage error in ##x##, where ##x = \frac{a^3 b^3}{c\sqrt d}## and the percentage errors in a, b, c and d are 2%, 1%, 3% and 4%; the answer is ##\pm 14\%##), the correct solution can be obtained by using the expression without the negative signs. I suppose I confused the two different results. Sorry for the confusion and thank you for bearing with me.
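That arithmetic, sketched in code (exponents read off ##x = a^3 b^3/(c\sqrt d)##, percentage errors from the problem):

```python
# max relative error: sum of |exponent| * (relative error) for each factor
# x = a^3 * b^3 / (c * sqrt(d))  ->  exponents 3, 3, -1, -1/2
terms = [(3, 0.02), (3, 0.01), (-1, 0.03), (-0.5, 0.04)]

max_rel_err = sum(abs(p) * err for p, err in terms)
print(max_rel_err)  # ~0.14, i.e. +/- 14%
```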
 

vela

Also, is this the case?
No, I was addressing the question about why BvU referred to squares of the uncertainty. The PDF explains we don't typically just add uncertainties due to the probabilistic nature of error.
 
Mark44
Take logs of both sides of your equation, then take the differentials. For the max. rel. error, they are adding the absolute values of the individual error terms.

For example, if ##z = \frac {a^2\sqrt b}{c^3}##, then
##\ln z = 2\ln a + \frac 1 2 \ln b - 3\ln c##
Taking differentials:
##\frac {dz}z = \frac{2da}a + \frac {.5db} b - \frac {3 dc}c##

If the percentage errors of a, b, and c are 1%, 2%, and 3%, then the max. rel. error would be 2|.01| + .5|.02| + 3|.03| = ±.12 or ±12%.

I think I'm doing this right...
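As a cross-check of that example (my own sketch): push worst-case errors directly through ##z = \frac{a^2\sqrt b}{c^3}## (a and b high, c low, since c enters with a minus sign) and compare with the linearized sum.

```python
def z(a, b, c):
    return a ** 2 * b ** 0.5 / c ** 3

a, b, c = 2.0, 4.0, 3.0
z0 = z(a, b, c)

# worst case: a up 1%, b up 2%, c down 3%
z_worst = z(a * 1.01, b * 1.02, c * 0.97)
rel = (z_worst - z0) / z0

print(rel)                                 # ~0.129 -- exact worst-case change
print(2 * 0.01 + 0.5 * 0.02 + 3 * 0.03)   # ~0.12  -- linearized estimate
```

The two agree to first order; the gap is the higher-order terms the differential drops.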
 
JC2000
Take logs of both sides of your equation, then take the differentials. For the max. rel. error, they are adding the absolute values of the individual error terms.

For example, if ##z = \frac {a^2\sqrt b}{c^3}##, then
##\ln z = 2\ln a + \frac 1 2 \ln b - 3\ln c##
Taking differentials:
##\frac {dz}z = \frac{2da}a + \frac {.5db} b - \frac {3 dc}c##

If the percentage errors of a, b, and c are 1%, 2%, and 3%, then the max. rel. error would be 2|.01| + .5|.02| + 3|.03| = ±.12 or ±12%.

I think I'm doing this right...
Yes, but I guess they seem to be adding the last terms and not subtracting...
 
