Understanding the Derivative Rule for Inverses

bang
Homework Statement


Assume that the Derivative Rule for Inverses holds. Given that f(x) = x + f(x), and g(t) = f⁻¹(t), which of the following is equivalent to g'(t)?
a. g'(t) = 1 + t²
b. g'(t) = 1 + t⁴
c. g'(t) = 1 + g(x)
d. g'(t) = 1 / (1 + t⁴)

Homework Equations





The Attempt at a Solution


This question came up on my recent calc final, and my friends and I can't agree on the answer. I answered C, while most of my friends answered D, arguing that the reciprocal makes it correct. Can somebody with more knowledge explain this to me? Thank you!
 
bang said:
Given that f(x) = x + f(x), and g(t) = f⁻¹(t), which of the following is equivalent to g'(t)? ... I answered C, while most of my friends answered D.

I don't think any functions satisfy f(x)=x+f(x). Can you correct the statement?
 
That was the function given to us on the test, as best as I can remember. It might have been something like f(x) = x + f(x)³, but it was definitely of the form f(x) = x + f(x).
 
bang said:
That was the function given to us on the test, as best as I can remember. It might have been something like f(x) = x + f(x)³, but it was definitely of the form f(x) = x + f(x).

f(x) = x + f(x) would mean f(x) - f(x) = x, i.e. 0 = x, which holds only at a single point. It can't be an identity defining the function f(x), so no function satisfies it. Can you check with your classmates and figure out what the real question was?
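For reference, the rule being assumed is g'(t) = 1 / f'(g(t)), where g = f⁻¹. Here is a quick numeric sanity check in Python. Since the exam's actual f is unclear, f(x) = x³ + x is an assumed stand-in (it is strictly increasing, so its inverse exists); the point is only to illustrate the rule, not to recover the exam question.

```python
# Numeric sanity check of the Derivative Rule for Inverses:
#   g'(t) = 1 / f'(g(t)),  where g = f^{-1}.
# f(x) = x^3 + x is an ASSUMED stand-in for the unknown exam function.

def f(x):
    return x**3 + x

def f_prime(x):
    return 3 * x**2 + 1

def g(t, lo=-10.0, hi=10.0, tol=1e-12):
    """Invert f on [lo, hi] by bisection (valid because f is increasing)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < t:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def g_prime_numeric(t, h=1e-6):
    """Central difference quotient for g'(t)."""
    return (g(t + h) - g(t - h)) / (2 * h)

t = 2.5
lhs = g_prime_numeric(t)       # g'(t) computed directly
rhs = 1 / f_prime(g(t))        # g'(t) from the inverse rule
print(lhs, rhs)                # the two values agree closely
```

Note that the rule produces 1 / f'(g(t)), with g(t) inside f', not t itself. That distinction is exactly what separates answer choices like c and d once the real f is known.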
 