# Heat equation integral - Fourier Series coefficient is zero

1. Apr 19, 2017

### dumbdumNotSmart

1. The problem statement, all variables and given/known data
We have a metallic bar that is thermally insulated from its surroundings. It has a temperature of 0 ºC. At t=0 two thermal sources are applied at either end, the first at -10 ºC and the second at 10 ºC. Find the equation for the temperature along the bar, T(x,t), as a function of position and time.

2. Relevant equations
$$T(x,t)= u_l(x) + u(x,t)$$
$$u_l(x) = -10 + \frac{20x}{L}$$
$$u(x,t)=\sum_n \left( a_n \cos(kx)+b_n \sin(kx) \right) e^{-(kc)^2 t}$$
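As a quick sanity check of the decomposition (my own addition, taking L = 1 since the thread never fixes it): the steady-state piece $u_l$ alone carries the boundary values, so $u = T - u_l$ must vanish at both ends.

```python
# Check (not from the thread, L = 1 assumed): the steady-state part
# u_l(x) = -10 + 20x/L hits the two source temperatures at the ends,
# so the transient part u = T - u_l satisfies u(0,t) = u(L,t) = 0.
L = 1.0
u_l = lambda x: -10.0 + x * 20.0 / L
print(u_l(0.0), u_l(L))  # -10.0 10.0, matching the two sources
```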

3. The attempt at a solution
So this is a heat equation problem and we have the variables we need. If we assume the ends of the bar are heated to the source temperatures instantly, then we get $a_n=0$ by evaluating T at x=0, t=0. We then evaluate T at x=L, t=0, which gives $k=\pi n /L$.

The problem arises when I go looking for $b_n$, because the integral comes out to zero! As I understand it, to find $b_n$ I have to integrate $\sin(kx)$ times the temperature function at t=0 over x, and that function is zero at every point except the ends! Naturally I'd try converting to kelvin, but it doesn't make sense to me that I'd have to do that for it to work. Why would a negative temperature break physics for the heat equation?

Edit: I'm almost sure I'm setting up the integral for $b_n$ wrong. I never really understood which function I was supposed to put inside the integral when doing the heat equation.

Last edited: Apr 19, 2017
2. Apr 19, 2017

### Orodruin

Staff Emeritus
No, it isn't. You are expanding u in the series, not T.
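For what it's worth, here is a numerical sketch of that point (my own check, assuming L = 1; neither L nor c is fixed in the thread). The coefficients $b_n$ come from the initial data of $u$, not of $T$, and $u(x,0) = T(x,0) - u_l(x) = 10 - 20x/L$ is not identically zero:

```python
import numpy as np

# Sketch (assumptions: L = 1; t = 0 so the exponential factor is 1).
# The series expands u(x,t) = T(x,t) - u_l(x), so the initial data
# fed into the b_n integral is u(x,0) = 0 - (-10 + 20x/L) = 10 - 20x/L.
L = 1.0
x = np.linspace(0.0, L, 2001)
u0 = 10.0 - 20.0 * x / L  # u(x,0): NOT zero in the interior

def b_n(n):
    # b_n = (2/L) * integral_0^L u(x,0) sin(n pi x / L) dx, midpoint rule
    xm = (x[:-1] + x[1:]) / 2
    dx = np.diff(x)
    return (2.0 / L) * np.sum((10.0 - 20.0 * xm / L)
                              * np.sin(n * np.pi * xm / L) * dx)

# Partial sum of the series at t = 0 should reproduce u(x,0)
N = 200
u_series = sum(b_n(n) * np.sin(n * np.pi * x / L) for n in range(1, N + 1))

# Agreement holds in the interior; Gibbs oscillations sit near the
# endpoints, where u(x,0) jumps to the boundary value 0.
interior = (x > 0.1) & (x < 0.9)
print(np.max(np.abs(u_series[interior] - u0[interior])))  # small
```

Analytically (with L = 1) the integral gives $b_n = 20(1+(-1)^n)/(n\pi)$, so the odd coefficients vanish and the even ones are $40/(n\pi)$, which the numerical check reproduces.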