- TL;DR Summary: Heat transfer through a two-material system from a light source of known power.

I have been given the task of modeling the heat transfer from a light source of known power into a system of two connected materials, and I must find the temperature change of the bottom surface. The two materials are initially in thermal equilibrium with the surroundings.

My first thought was to use the lumped capacitance approach, reasoning that the external resistance to convection is much larger than the internal resistance to conduction (i.e., the Biot number is small). I also assumed that heat lost to the surroundings by radiation or convection is negligible.
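As a quick sanity check on the lumped-capacitance assumption, the Biot number can be computed for each layer. The sketch below uses placeholder values (the convection coefficient, conductivity, and characteristic length are all assumed, not given in the problem):

```python
# Biot-number check for the lumped-capacitance assumption.
# All values are illustrative placeholders; substitute actual properties.
h = 10.0      # convection coefficient, W/(m^2 K) (assumed, natural convection)
k = 1.0       # thermal conductivity of the layer, W/(m K) (assumed)
L_c = 0.005   # characteristic length V/A_s, m (assumed 5 mm)

Bi = h * L_c / k
print(f"Bi = {Bi:.3f}")
# The lumped approach is usually taken as valid for Bi < 0.1.
```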

The fraction of light absorbed by the top material will be determined experimentally, so the heat flux into the top surface would be the incident light intensity (W/cm^2) times the absorbed fraction. Multiplying this flux by the illuminated area and by the time the light is on gives the total heat input in joules; setting that equal to rho*V*c*(Tf - Ti) gives Tf, the new temperature of the top layer.
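That energy balance can be sketched as follows. Every number here is an illustrative placeholder; the intensity, absorbed fraction, geometry, and material properties would come from the actual experiment:

```python
# Top-layer energy balance: Q = I*alpha*A*t = rho*V*c*(T_f - T_i).
# All numbers are assumed placeholders, not measured values.
I = 1.0        # incident light intensity, W/cm^2 (assumed)
alpha = 0.8    # fraction of light absorbed (to be measured)
A = 4.0        # illuminated area, cm^2 (assumed)
t = 60.0       # time the light is on, s (assumed)

rho = 2.7      # density, g/cm^3 (assumed)
V = 2.0        # top-layer volume, cm^3 (assumed)
c = 0.90       # specific heat, J/(g K) (assumed)
T_i = 20.0     # initial temperature, deg C

Q = I * alpha * A * t          # total absorbed energy, J
T_f = T_i + Q / (rho * V * c)  # solve Q = rho*V*c*(T_f - T_i) for T_f
print(f"Q = {Q:.1f} J, T_f = {T_f:.1f} C")
```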

I was then thinking of using the new temperature difference between the top and bottom layers to get the conductive heat flow into the bottom layer, and then essentially repeating the same process to get the new final temperature of the bottom layer. (When integrating the conduction heat flow, I am unsure whether I should use the same time as before, i.e., the time the light was on.)
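One way to sidestep the question of which time interval to use is to treat each layer as a lumped node and march the two coupled energy balances forward in time together, so both temperatures evolve simultaneously while the light is on. A minimal explicit-Euler sketch, with every parameter assumed for illustration (including the conduction resistance between the layers):

```python
# Two-node lumped model: the top layer absorbs the light and conducts heat
# to the bottom layer through a conduction resistance R_cond.
# Explicit Euler in time; all numbers are assumed placeholders.
q_abs = 3.2          # absorbed power I*alpha*A, W (assumed)
C_top = 4.86         # heat capacity rho*V*c of top layer, J/K (assumed)
C_bot = 4.86         # heat capacity of bottom layer, J/K (assumed)
R_cond = 2.0         # conduction resistance between layers, K/W (assumed)

T_top = T_bot = 20.0 # both layers start in equilibrium, deg C
dt, t_on = 0.01, 60.0

for _ in range(int(round(t_on / dt))):
    q_cond = (T_top - T_bot) / R_cond   # heat flow into bottom layer, W
    T_top += (q_abs - q_cond) * dt / C_top
    T_bot += q_cond * dt / C_bot

print(f"T_top = {T_top:.1f} C, T_bot = {T_bot:.1f} C")
```

With small enough time steps this conserves energy by construction: each step deposits q_abs*dt into the pair of nodes, and conduction only moves energy between them.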

However, this feels too simple, and I suspect I am overlooking something. Is there a more accurate way to go about this? Are my assumptions valid?

