# Skin temperature increase due to radiation absorption

Hello,

I am trying to figure out how much human skin temperature would increase when the skin is illuminated by radiation of a given intensity (W/m^2). We can assume the skin's emissivity and absorptivity are both equal to 1. For instance, imagine skin illuminated by the sun (I = 1000 W/m^2) or by another radiation source.
We know the initial temperature of the skin and the intensity of the incident radiation.

What equation would I use?

Thanks!

256bits
Gold Member
What equation would I use?
You have posted this in the Intermediate Level.
Are you familiar with the Stefan-Boltzmann law?
The Stefan–Boltzmann law states that the power emitted per unit area of the surface of a black body is directly proportional to the fourth power of its absolute temperature.
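As a quick illustration of the law (a sketch, not from the thread; the 306 K skin temperature is an assumed typical value of about 33 °C):

```python
# Stefan-Boltzmann law: power emitted per unit area of a black body
#   P/A = sigma * T^4
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

T_skin = 306.0  # assumed skin surface temperature (~33 C), in kelvin
flux = SIGMA * T_skin**4  # emitted power per unit area, W/m^2
print(f"Emitted flux at {T_skin} K: {flux:.0f} W/m^2")
```

This comes out to roughly 500 W/m^2, i.e. on the same order as the solar intensity in the question, which is why the skin's own emission can't be neglected here.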

OmCheeto
Gold Member
Hello,

I am trying to figure out how much human skin temperature would increase when the skin is illuminated by radiation of a given intensity (W/m^2). We can assume the skin's emissivity and absorptivity are both equal to 1. For instance, imagine skin illuminated by the sun (I = 1000 W/m^2) or by another radiation source.
We know the initial temperature of the skin and the intensity of the incident radiation.

What equation would I use?

Thanks!
Interesting question.
I worked out a similar problem for a black rock about 6 months ago. Unfortunately, I can't remember how I did it, nor do I know whether my answer was correct.

I would recommend looking at the wiki entry on "Black-body radiation", subsection "Human body emission".
It may not give you the answer, but it has a couple of equations that will get you started:
Pnet = Pemit - Pabsorb
and
Pnet = Aσε(T⁴ − T0⁴)

A is body surface area
T is body surface temperature
ε is body emissivity
T0 is the ambient temperature
σ is the Stefan–Boltzmann constant
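Setting Pnet = 0 in the equations above gives the steady state: the absorbed flux (the incident radiation plus what the surroundings radiate in) equals the emitted flux, and the equilibrium surface temperature can be solved for directly. A minimal sketch, assuming ε = 1, the 1000 W/m² figure from the question, and an assumed ambient/initial temperature of 306 K; all heat transfer is purely radiative:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

I = 1000.0   # incident intensity from the question, W/m^2
T0 = 306.0   # assumed ambient / initial skin temperature, K (~33 C)

# Steady state per unit area, with emissivity = absorptivity = 1:
#   sigma * T^4 = I + sigma * T0^4
# Solve for the equilibrium temperature T:
T_eq = ((I + SIGMA * T0**4) / SIGMA) ** 0.25
print(f"Idealized equilibrium temperature: {T_eq:.1f} K "
      f"({T_eq - 273.15:.1f} C)")
```

This radiative-only balance gives around 400 K, which badly overestimates what real skin would reach; as noted below, skin also sheds heat by convection, evaporation, and blood flow, none of which this sketch includes.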

Of course, your problem is a bit more complicated, as rocks don't sweat.

hilbert2