A cube of Al (0.1 m x 0.1 m x 0.1 m, mass = 0.27 kg) is placed on a large thermal mass maintained at a constant 40 C (313 K). The cube is separated from the thermal mass by a 0.01 m thick glass plate (0.1 m x 0.1 m). If the Al block is initially at 10 C, how long does it take to reach a temperature of 20 C?
(Assume that no heat is transferred to the environment and that the thermal mass of the glass plate can be ignored.) Hint: Determine the rate at which heat is transferred for a given temperature of the Al cube, then determine the rate at which the temperature of the Al cube is changing. The total time is obtained by integrating your equation between the initial and final temperatures.
The Attempt at a Solution
I worked with my professor and thought I understood it, but now I am stuck. -_- Here is what I have done so far:
dQ/dt = mc dT/dt = kA[T_hot - T_al(t)]/L (T is temperature, t is time, L is the thickness of the glass plate)
mcL dT = kA[T_hot - T_al(t) ] dt
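To sanity-check the energy balance above, I also threw together a quick numerical integration (a rough sketch: I'm assuming k here is the glass's conductivity, about 0.8 W/(m*K) from a table, since that isn't given in the problem):

```python
# Numerically integrate mc dT/dt = kA (T_hot - T)/L with a simple Euler step.
m, c = 0.27, 910.0       # mass (kg) and specific heat of Al (J/(kg*K))
k = 0.8                  # assumed conductivity of glass, W/(m*K) -- not given in the problem
A, L = 0.1 * 0.1, 0.01   # plate area (m^2) and thickness (m)
T_hot, T, T_final = 40.0, 10.0, 20.0  # temperatures in C (only differences matter here)

t, dt = 0.0, 1e-3
while T < T_final:
    dTdt = k * A * (T_hot - T) / (L * m * c)  # rate of temperature change, K/s
    T += dTdt * dt
    t += dt

print(f"time to reach 20 C: {t:.1f} s")  # roughly 124-125 s with k = 0.8
```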
mcL(20-10) = kA*T_hot*t_20 - kA*integral[T_al(t) dt] (the integral runs from 0 to t_20, where t_20 is the time I am solving for)
10*0.27*910*0.01/(0.1*0.1*k) = 313*t_20 - integral[T_al(t) dt] (using L = 0.01 m for the glass thickness; I originally plugged in k = 205 for aluminum, but since the heat flows through the glass, k should be the glass's conductivity. T_al also has to be in Kelvin to match the 313.)
and here I am stuck, as I don't know what to do with the last integral on the right side. I think I need an equation relating T_al(t) to t, but I cannot figure it out. I know that when t = 0, T_al = 10 C, and when t = t_20, T_al = 20 C.
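Update: I think the trick might be to divide both sides by (T_hot - T_al) before integrating, instead of splitting the integral. That makes the equation separable. Trying that (again assuming k is the glass's conductivity, ~0.8 W/(m*K), a value I looked up rather than one given in the problem):

```python
import math

m, c = 0.27, 910.0     # Al mass (kg), specific heat (J/(kg*K))
k = 0.8                # assumed glass conductivity, W/(m*K) -- not from the problem statement
A, L = 0.1 * 0.1, 0.01 # plate area (m^2), thickness (m)
T_hot, T_i, T_f = 40.0, 10.0, 20.0

# Separating variables:  mc dT/(T_hot - T) = (kA/L) dt
# integrates to:         t = (m c L)/(k A) * ln[(T_hot - T_i)/(T_hot - T_f)]
t = m * c * L / (k * A) * math.log((T_hot - T_i) / (T_hot - T_f))
print(f"t = {t:.1f} s")  # about 124.5 s
```

This matches the numeric estimate, so I think the separable form is the right way to handle the integral.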
**If anyone noticed that the work kept getting changed, it was because I am still working on the problem while awaiting a reply and wanted to keep the post updated. Sorry for any confusion/inconvenience.