What is the issue with the last term in the conversion from v(t) to y(t)?

Thread starter: Shackleford
For #3, when going from v(t) to y(t), I wasn't sure what to do with the
constant C. When you get to y(t), the last term is C / t^2. When you put in the
initial condition y(0), you get the indeterminate expression C / 0.

http://i111.photobucket.com/albums/n149/camarolt4z28/3.jpg
 
Shackleford said:
For #3, when going from v(t) to y(t), I wasn't sure what to do with the
constant C. When you get to y(t), the last term is C / t^2. When you put in the
initial condition y(0), you get the indeterminate expression C / 0.

http://i111.photobucket.com/albums/n149/camarolt4z28/3.jpg

There isn't much you can do. However, C/0 is not indeterminate! It's simply undefined. "Indeterminate" only has meaning when the expression is inside the argument of a limit.
 
You have y' = a(t)y + f(t) and assert that the integrating factor is ##e^{\int a(t)\,dt}##. That is incorrect. The formula is for a d.e. of the form y' + a(t)y = f(t), so you have the sign wrong. The equation y' = -(2/t)y + t - 1 is equivalent to y' + (2/t)y = t - 1. The integrating factor is
$$e^{\int (2/t)\,dt} = e^{2\ln|t|} = t^2.$$
Of course, you are still going to have a problem at t = 0, because one of the coefficients of your d.e. is not defined at t = 0.
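Carrying that integrating factor through the standard steps (a sketch, using HallsofIvy's setup) shows exactly where the C / t^2 term in y(t) comes from. Multiplying the standard form by ##t^2## makes the left side an exact derivative:
$$t^2 y' + 2t\,y = \left(t^2 y\right)' = t^2(t - 1) = t^3 - t^2,$$
and integrating both sides gives
$$t^2 y = \frac{t^4}{4} - \frac{t^3}{3} + C \quad\Longrightarrow\quad y(t) = \frac{t^2}{4} - \frac{t}{3} + \frac{C}{t^2},$$
which is the C / t^2 term Shackleford asked about.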
 
HallsofIvy said:
You have y' = a(t)y + f(t) and assert that the integrating factor is ##e^{\int a(t)\,dt}##. That is incorrect. The formula is for a d.e. of the form y' + a(t)y = f(t), so you have the sign wrong. The equation y' = -(2/t)y + t - 1 is equivalent to y' + (2/t)y = t - 1. The integrating factor is
$$e^{\int (2/t)\,dt} = e^{2\ln|t|} = t^2.$$
Of course, you are still going to have a problem at t = 0, because one of the coefficients of your d.e. is not defined at t = 0.

Hm. I thought it didn't have to be in standard form.

How do you get around the undefined expression?
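The integrating-factor computation above can be checked symbolically. This sketch uses SymPy (my choice of tool, not something from the thread) to confirm that the integrating factor for y' + (2/t)y = t - 1 is t^2, and that the general solution really does carry a C / t^2 term:

```python
# Symbolic check of the integrating-factor argument (illustrative sketch).
import sympy as sp

t = sp.symbols('t', positive=True)   # positive=True sidesteps the t = 0 issue
y = sp.Function('y')

# The ODE in standard form: y' + (2/t) y = t - 1
ode = sp.Eq(y(t).diff(t) + (2/t)*y(t), t - 1)

# Integrating factor mu = exp(∫ 2/t dt) = exp(2 ln t) = t^2 for t > 0
mu = sp.exp(sp.integrate(2/t, t))
print(sp.simplify(mu))               # simplifies to t**2

# dsolve's general solution contains the arbitrary constant divided by t^2
sol = sp.dsolve(ode, y(t))
print(sol)
```

Note that SymPy only returns a clean `t**2` factor here because `t` is declared positive; on a domain containing t = 0 the solution breaks down, just as HallsofIvy points out.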
 