lavster
Homework Statement
solve for u:
[tex]u_t=G + \mu(u_{rr}+\frac{1}{r}u_r)[/tex]
with boundary conditions u = 0 at r = a and [tex]u_r=0[/tex] at r = 0,
where G is a constant, u is a function of r only, and u_r denotes the derivative of u with respect to r, etc.
Homework Equations
the solution is:
[tex]u=\frac{G a^2}{4\mu}\left(1-\frac{r^2}{a^2}\right)[/tex]
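As a sanity check (a sketch only, assuming the constant written G_0 in the quoted answer is the same G as in the problem statement), the given solution can be verified symbolically, e.g. with sympy:

```python
import sympy as sp

r, a, G, mu = sp.symbols('r a G mu', positive=True)

# Quoted steady-state solution (with G_0 taken to be the constant G)
u = G * a**2 / (4 * mu) * (1 - r**2 / a**2)

# Residual of the steady equation: G + mu*(u_rr + u_r/r) should vanish
residual = G + mu * (sp.diff(u, r, 2) + sp.diff(u, r) / r)
print(sp.simplify(residual))                  # 0

# Boundary conditions: u(a) = 0 and u_r(0) = 0
print(sp.simplify(u.subs(r, a)))              # 0
print(sp.diff(u, r).subs(r, 0))               # 0
```

So the quoted expression does satisfy both the equation and the boundary conditions.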
The Attempt at a Solution
u is independent of t so [tex]u_t=0[/tex].
It is an inhomogeneous differential equation, so I thought you'd solve [tex]u_{rr}+\frac{1}{r}u_r=0[/tex] first.
I then thought you'd let [tex]p=u_r[/tex] to get [tex]p_r+\frac{1}{r}p=0[/tex], and then use separation of variables to get p = r + c, where c is a constant (initially I got logs, but I took the exponential).
Then I converted back to u: [tex]u_r=r+c[/tex] gives [tex]u=\frac{r^2}{2}+rc+d[/tex]. This looks wrong, and I have no idea how to introduce the [tex]\frac{G}{\mu}[/tex] term.
Any help will be much appreciated!