Problem:

Assume that f has a derivative everywhere. Set g(x)=xf(x). Using the definition of the derivative, show that g has a derivative and that g'(x)=f(x)+xf'(x).

What I know:

I know the definition of the derivative is f'(x) = lim as h→0 of [f(x+h)-f(x)]/h. I don't know how to plug it in and solve it, though. I tried just plugging it straight in, like

g'(x)=[x(f(x+h))-xf(x)]/h

I pulled the x out, and the rest was [f(x+h)-f(x)]/h (the actual definition), so I assumed the whole thing was xf'(x). I don't think that's right, though.
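A hint at where the attempt goes wrong: since g(x) = xf(x), the numerator of the difference quotient should be g(x+h) - g(x) = (x+h)f(x+h) - xf(x), not x f(x+h) - x f(x). A standard way to split that numerator (this rearrangement is not in the original post, just one common approach) is:

```latex
\frac{g(x+h)-g(x)}{h}
  = \frac{(x+h)f(x+h) - x f(x)}{h}
  = x\,\frac{f(x+h)-f(x)}{h} + f(x+h).
```

As h → 0, the first term tends to x f'(x) by the definition of f'(x), and f(x+h) → f(x) because differentiability implies continuity, giving g'(x) = f(x) + x f'(x).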

Any help would be greatly appreciated. Thanks in advance :]
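As a sanity check on the claimed formula (not a proof), one can compare a finite-difference estimate of g'(x) against f(x) + x f'(x) for a concrete choice of f. The function f(x) = sin(x) below is a hypothetical example, chosen only because its derivative cos(x) is known exactly:

```python
import math

# Hypothetical example function: f(x) = sin(x), with known derivative cos(x).
def f(x):
    return math.sin(x)

def f_prime(x):
    return math.cos(x)

# g(x) = x * f(x), as in the problem statement.
def g(x):
    return x * f(x)

def g_prime_numeric(x, h=1e-6):
    # Central-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

x = 1.3
claimed = f(x) + x * f_prime(x)   # the formula to be shown
numeric = g_prime_numeric(x)      # the derivative from the difference quotient
print(abs(claimed - numeric) < 1e-6)
```

The two values agree to well within the tolerance, which is consistent with (but does not prove) g'(x) = f(x) + xf'(x).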

**Physics Forums - The Fusion of Science and Community**


# Given g(x)=xf(x), show g'(x)=f(x)+xf'(x).


