# Homework Help: Prove the Trigonometric Identities

1. Nov 23, 2008

### Gaara09

1. The problem statement, all variables and given/known data

Prove: $$\frac{1-\cos x}{\sin x}=\frac{\sin x}{1+\cos x}$$

$$\frac{1-\cos x}{1+\cos x}=\tan^2\left(\frac{x}{2}\right)$$

2. Relevant equations

3. The attempt at a solution
I have no idea how to solve it.

2. Nov 23, 2008

### Дьявол

If you don't have any idea how to prove it, start by working on both sides (for the first identity).
The first step should look like: $$(1-\cos x)(1+\cos x)=\sin^2 x$$
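Spelled out (my own elaboration of the step above), cross-multiplying gives:

$$\frac{1-\cos x}{\sin x}=\frac{\sin x}{1+\cos x}\iff(1-\cos x)(1+\cos x)=\sin^2 x\iff 1-\cos^2 x=\sin^2 x$$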

3. Nov 23, 2008

### Gaara09

Nice, I got it now: $$\sin^2 x + \cos^2 x = 1$$

What about the second one?

4. Nov 23, 2008

### Mentallic

I can only help you with the first one, since I have yet to learn about half angles, mainly the $$\tan^2\left(\frac{x}{2}\right)$$.

OK, so we need to prove $$\frac{1-\cos x}{\sin x}=\frac{\sin x}{1+\cos x}$$
For questions like these, it is a good habit to manipulate only one side of the equation, and it is necessary if you want full marks.
Let's take the left-hand side then:

$$LHS=\frac{1-\cos x}{\sin x}$$
OK, so we need to somehow convert the denominator from sine to cosine.
You would have learnt the trigonometric identity $$\sin^2 x+\cos^2 x=1$$
Now let's multiply both the numerator and denominator by $$\sin x$$:
$$LHS=\frac{\sin x(1-\cos x)}{\sin^2 x}$$

The denominator can be converted by a simple rearrangement of that trig identity:
$$\sin^2 x=1-\cos^2 x$$

From here it is quite simple, so I will let you take over.
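As a quick numerical sanity check (my own addition, not part of the proof), one can compare both sides of the first identity at a few sample angles:

```python
import math

# Compare both sides of (1 - cos x)/sin x = sin x/(1 + cos x)
# at a few angles where sin x != 0.
for x in [0.3, 1.0, 2.0, -1.2]:
    lhs = (1 - math.cos(x)) / math.sin(x)
    rhs = math.sin(x) / (1 + math.cos(x))
    assert math.isclose(lhs, rhs, rel_tol=1e-12)

print("first identity holds at sampled angles")
```

This only spot-checks the identity, of course; the algebraic proof above is what the question asks for.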

5. Nov 23, 2008

### icystrike

We are proving, not solving.

6. Nov 23, 2008

### Дьявол

My method is absolutely correct.
Since $$\sin^2 x+\cos^2 x=1$$,
we can transform it into $$1-\cos^2 x+\cos^2 x=1$$,
so $$1=1$$, which is correct.
Maybe Mentallic's method is better, since it uses one side (the LHS) to reach the other (the RHS).
For the second trigonometric identity, use the half-angle identities $$\sin^2\left(\frac{x}{2}\right)=\frac{1-\cos x}{2}$$ and $$\cos^2\left(\frac{x}{2}\right)=\frac{1+\cos x}{2}$$:
$$\frac{1-\cos x}{1+\cos x}=\frac{\frac{1-\cos x}{2}}{\frac{1+\cos x}{2}}=\frac{\sin^2(x/2)}{\cos^2(x/2)}=\tan^2\left(\frac{x}{2}\right)$$
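The half-angle identity can be spot-checked numerically as well (again my own addition, not part of the proof):

```python
import math

# Compare both sides of (1 - cos x)/(1 + cos x) = tan^2(x/2)
# at a few angles where cos x != -1.
for x in [0.3, 1.0, 2.0, -1.2]:
    lhs = (1 - math.cos(x)) / (1 + math.cos(x))
    rhs = math.tan(x / 2) ** 2
    assert math.isclose(lhs, rhs, rel_tol=1e-12)

print("second identity holds at sampled angles")
```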

7. Nov 23, 2008

### Gaara09

Дьявол Thank you so much :)

8. Nov 23, 2008

### Gaara09

Something outside of trigonometry:

I have g(x):

I need (gog)x, but how can I do it? I'll have two radicals.

9. Nov 23, 2008

### Chaos2009

Is that supposed to be $$g(x) = \sqrt[3]{x + 1}$$?

If so, I don't really see the problem with having two radicals.

10. Nov 23, 2008

### Mentallic

lol at the picture! :rofl:

Can I just ask what (gog)x is?

11. Nov 24, 2008

### Дьявол

I think he means composition:
$$(g \circ g)(x)$$
If $$g(x) = \sqrt[3]{x + 1}$$, then $$g(g(x))$$ would be $$g(\sqrt[3]{x + 1})=\sqrt[3]{\sqrt[3]{x+1}+ 1}$$
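If that definition is right, the composition is easy to check numerically; a minimal sketch in Python (the name `g` follows the thread, everything else is my own illustration):

```python
import math

def g(x):
    # Assumed definition from the thread: cube root of (x + 1).
    # Valid here for x >= -1, since ** (1/3) on a negative base
    # would return a complex number in Python.
    return (x + 1) ** (1 / 3)

def g_of_g(x):
    # (g o g)(x) = g(g(x)); nothing stops one radical
    # from sitting inside another.
    return g(g(x))

# g(7) = 8 ** (1/3) = 2, so g(g(7)) should be close to 3 ** (1/3)
assert math.isclose(g_of_g(7), 3 ** (1 / 3), rel_tol=1e-9)
print(g_of_g(7))
```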

12. Nov 24, 2008

### Chaos2009

It sounded to me like he was just worried about having a radical sign inside a radical sign in his composite function, but I don't think there really is a problem with that in this case. It should be fine the way Дьявол wrote it.