Integration: completing the square and inverse trig functions

JOhnJDC

Homework Statement



Find \int \frac{(x+2)\,dx}{\sqrt{3+2x-x^2}}

Homework Equations



\int \frac{du}{\sqrt{a^2-u^2}} = \sin^{-1}\frac{u}{a}

The Attempt at a Solution



I began by completing the square:
3+2x-x^2 = 4 - (x^2-2x+1) = 4 - (x-1)^2

So, 4-(x-1)^2 = a^2-u^2, with a=2 and u=x-1.
Further, since x=u+1, we have dx=du and (x+2)=(u+3).

Substituting, I got:

\int \frac{(x+2)\,dx}{\sqrt{3+2x-x^2}} = \int \frac{(u+3)\,du}{\sqrt{a^2-u^2}}

= \int \frac{u\,du}{\sqrt{a^2-u^2}} + 3\int \frac{du}{\sqrt{a^2-u^2}}

This is where I get confused. I know that 3\int \frac{du}{\sqrt{a^2-u^2}} = 3\sin^{-1}\frac{u}{a}

But, according to my book, \int \frac{u\,du}{\sqrt{a^2-u^2}} = -\sqrt{a^2-u^2}, and I don't understand why.

Can someone offer an explanation? Thanks.
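For reference, here is the completing-the-square step written out in full (an added sketch, not part of the original post):

3 + 2x - x^2 = -(x^2 - 2x) + 3 = -(x^2 - 2x + 1) + 1 + 3 = 4 - (x-1)^2

which is where the a=2, u=x-1 identification above comes from.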
 
JOhnJDC said:

This is where I get confused. I know that 3\int \frac{du}{\sqrt{a^2-u^2}} = 3\sin^{-1}\frac{u}{a}

But, according to my book, \int \frac{u\,du}{\sqrt{a^2-u^2}} = -\sqrt{a^2-u^2}, and I don't understand why.

Can someone offer an explanation? Thanks.


Using the substitution

\cos \theta = \frac{u}{a}
 
annoymage said:
Using the substitution

\cos \theta = \frac{u}{a}

Are you saying that I should substitute u = a\sin\theta to obtain:

\int \frac{u\,du}{\sqrt{a^2-u^2}} = \int \frac{(a\sin\theta)(a\cos\theta)\,d\theta}{a\cos\theta} = a\cos\theta

So, u = a\cos\theta and \cos\theta = \frac{u}{a}, but \frac{u}{a} doesn't equal -\sqrt{a^2-u^2}.

What am I missing?
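As an aside (an added sketch, not part of the original replies): carrying the sine substitution through carefully shows where the sign comes from. With u = a\sin\theta, we have du = a\cos\theta\,d\theta and \sqrt{a^2-u^2} = a\cos\theta, so

\int \frac{u\,du}{\sqrt{a^2-u^2}} = a\int \sin\theta\,d\theta = -a\cos\theta = -\sqrt{a^2-u^2}

since \cos\theta = \frac{\sqrt{a^2-u^2}}{a}.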
 
JOhnJDC said:

This is where I get confused. I know that 3\int \frac{du}{\sqrt{a^2-u^2}} = 3\sin^{-1}\frac{u}{a}

But, according to my book, \int \frac{u\,du}{\sqrt{a^2-u^2}} = -\sqrt{a^2-u^2}, and I don't understand why.

Can someone offer an explanation? Thanks.


See the difference:
3\int \frac{du}{\sqrt{a^2-u^2}} = 3\sin^{-1}\frac{u}{a}

and

\int \frac{u\,du}{\sqrt{a^2-u^2}} = -\sqrt{a^2-u^2}

To obtain the second one, use the substitution v = a^2-u^2.
 
Just take the derivative of both answers and you'll see why.
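For concreteness (an added check, not in the original post), differentiating the book's answer recovers the integrand:

\frac{d}{du}\left[-\sqrt{a^2-u^2}\right] = -\frac{-2u}{2\sqrt{a^2-u^2}} = \frac{u}{\sqrt{a^2-u^2}}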
 
I see it now. Thanks for your help.

\int (a^2-u^2)^{-1/2}\,u\,du

Substitute v = a^2-u^2, so dv = -2u\,du and u\,du = -\frac{dv}{2}.

= -\frac{1}{2}\int v^{-1/2}\,dv = -v^{1/2} = -\sqrt{a^2-u^2}
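Putting the pieces together for the original problem (an added summary, using a = 2 and u = x-1 from the posts above):

\int \frac{(x+2)\,dx}{\sqrt{3+2x-x^2}} = -\sqrt{4-u^2} + 3\sin^{-1}\frac{u}{2} + C = -\sqrt{3+2x-x^2} + 3\sin^{-1}\left(\frac{x-1}{2}\right) + C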
 