# Proving using calculus without trig identity

1. Feb 23, 2012

### kebabs

Please, I really need help with this homework question.

Prove, without using a trig identity, that F'(x) = 0 for

$F(x)=A\sin^2(Bx+C)+A\cos^2(Bx+C)$

2. Feb 23, 2012

### SteveL27

You're not supposed to use the obvious identity that simplifies this? I suppose you could just use the derivatives of sin and cos along with the chain rule to directly compute the derivative. But eventually you'll need to simplify using some trig identity.

3. Feb 23, 2012

### kebabs

I can't use a trig identity to solve it.

4. Feb 23, 2012

### kebabs

I mean, I'm not allowed to.

5. Feb 23, 2012

### SammyS

Staff Emeritus
What is F'(x) if $F(x)=A\sin^2(Bx+C)+A\cos^2(Bx+C)\,?$

6. Feb 23, 2012

### eumyang

Are you sure? I was able to get F'(x) = 0 by using the chain rule, and yet I didn't use any trig identity.

7. Feb 26, 2012

### kebabs

8. Feb 26, 2012

### Ansatz7

It's really simple - just use the chain rule to take the derivative d/dx of the whole expression. No trig identities or other tricks are necessary. Are you familiar with the chain rule?
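For the record, here is a sketch of the chain-rule computation the posters above are describing: differentiating each squared term produces two pieces that cancel directly, with no identity needed.

```latex
\begin{align*}
F(x)  &= A\sin^2(Bx+C) + A\cos^2(Bx+C) \\
F'(x) &= 2A\sin(Bx+C)\cos(Bx+C)\cdot B
       + 2A\cos(Bx+C)\bigl(-\sin(Bx+C)\bigr)\cdot B \\
      &= 2AB\sin(Bx+C)\cos(Bx+C) - 2AB\sin(Bx+C)\cos(Bx+C) \\
      &= 0
\end{align*}
```

The key step is that $\frac{d}{dx}\sin^2(u) = 2\sin(u)\cos(u)\,u'$ and $\frac{d}{dx}\cos^2(u) = -2\cos(u)\sin(u)\,u'$ are exact negatives of each other, so the sum vanishes term by term.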

9. Feb 26, 2012

### Staff: Mentor

This is not permitted at Physics Forums - don't even ask.