Problem:

Assume that f has a derivative everywhere. Set g(x)=xf(x). Using the definition of the derivative, show that g has a derivative and that g'(x)=f(x)+xf'(x).

What I know:

I know the definition of the derivative is f'(x) = lim_{h→0} [f(x+h) - f(x)]/h. I don't know how to apply it here, though. I tried just plugging it straight in, like

g'(x)=[x(f(x+h))-xf(x)]/h

I pulled the x out, and what was left was [f(x+h) - f(x)]/h (the definition itself), so I assumed the whole thing was xf'(x). I don't think that's right, though.
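For reference, here is a sketch of how the difference quotient for g can be split, assuming only that f is differentiable (and hence continuous). The key point is that g(x+h) = (x+h)f(x+h), not x·f(x+h):

```latex
\begin{align*}
\frac{g(x+h)-g(x)}{h}
  &= \frac{(x+h)\,f(x+h) - x\,f(x)}{h} \\
  &= \frac{x\,[f(x+h) - f(x)] + h\,f(x+h)}{h} \\
  &= x \cdot \frac{f(x+h) - f(x)}{h} + f(x+h).
\end{align*}
```

As h → 0, the first term tends to x f'(x) by the definition of f'(x), and f(x+h) → f(x) by continuity, giving g'(x) = f(x) + x f'(x).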

Any help would be appreciated. Thanks in advance! :]
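As a numerical sanity check (not a proof), the claimed identity g'(x) = f(x) + xf'(x) can be compared against a finite-difference estimate of g'. The choice f = sin is an arbitrary example here, picked because its derivative is known exactly:

```python
import math

def f(x):
    return math.sin(x)        # sample differentiable f (arbitrary choice)

def f_prime(x):
    return math.cos(x)        # its known derivative

def g(x):
    return x * f(x)           # g(x) = x f(x), as in the problem

def central_diff(func, x, h=1e-6):
    # symmetric difference quotient approximating func'(x)
    return (func(x + h) - func(x - h)) / (2 * h)

x = 1.3
numeric = central_diff(g, x)          # finite-difference estimate of g'(x)
exact = f(x) + x * f_prime(x)         # the claimed closed form
print(abs(numeric - exact))           # close to zero
```

The two values agree to within the accuracy of the difference quotient, which is what the limit argument predicts.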

**Physics Forums - The Fusion of Science and Community**


# Given g(x)=xf(x), show g'(x)=f(x)+xf'(x).

