I Skin temperature increase due to radiation absorption

1. Feb 9, 2017

fog37

Hello,

I am trying to figure out how much human skin temperature would increase when the skin is illuminated by radiation of a certain intensity (W/m^2). We can assume that the skin has an emissivity and absorptivity both equal to 1. For instance, imagine the skin illuminated by the sun (I = 1000 W/m^2) or by another radiation source.
We know the initial temperature of the skin and the intensity of the incident radiation.

What equation would I use?

Thanks!

2. Feb 9, 2017

256bits

You have posted at the Intermediate Level.
Are you familiar with the Stefan-Boltzmann law?

3. Feb 9, 2017

OmCheeto

Interesting question.
I worked out a similar problem for a black rock about 6 months ago. Unfortunately, I can't remember how I did it, nor do I know whether or not my answer was correct.

I would recommend looking at the wiki entry on "Black-body radiation", subsection "Human body emission".
It may not give you the answer, but it has a couple of equations that will get you started:
Pnet = Pemit - Pabsorb
and
Pnet = Aσε(T⁴ − T0⁴)

A is body surface area
T is body surface temperature
ε is body emissivity
T0 is the ambient temperature
σ is the Stefan–Boltzmann constant

Of course, your problem is a bit more complicated, as rocks don't sweat.
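To make the balance above concrete: a minimal sketch, assuming emissivity = absorptivity = 1 and a purely radiative steady state (no sweating, convection, or blood flow), solves σε(T⁴ − T0⁴) = I for the equilibrium surface temperature T. The function name and the choice of 306 K (~33 °C) as the initial skin/ambient-equilibrium temperature are illustrative assumptions, not values from the thread.

```python
# Purely radiative steady state: absorbed intensity is re-emitted, so
#   sigma * eps * (T^4 - T0^4) = I   =>   T = (T0^4 + I/(sigma*eps))^(1/4)
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(intensity, t_ambient, emissivity=1.0):
    """Equilibrium surface temperature (K) for absorbed intensity in W/m^2."""
    return (intensity / (SIGMA * emissivity) + t_ambient**4) ** 0.25

# Example: skin starting near 306 K (~33 C) under I = 1000 W/m^2.
t_eq = equilibrium_temp(1000.0, 306.0)
```

Running this gives an equilibrium well above 100 °C, which mostly shows that a radiation-only balance badly overestimates skin heating; the cooling mechanisms the later replies mention dominate in practice.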

4. Feb 13, 2017

hilbert2

The heat that the skin absorbs is continuously carried away by blood flow through the capillaries in the skin, so this can't be calculated as a simple radiative-transfer problem.
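One way to sketch this point numerically: add a linear heat-removal term to the radiative balance and solve I = εσ(T⁴ − T0⁴) + h_eff(T − T0) for T by bisection. Here h_eff is an assumed lumped coefficient (W/m²/K) standing in for blood perfusion plus convection; its value of 100 is purely illustrative, not a physiological measurement.

```python
# Steady state with an assumed linear loss term for perfusion/convection:
#   I = eps*sigma*(T^4 - T0^4) + h_eff*(T - T0)
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def steady_temp(intensity, t_ambient, h_eff=100.0, emissivity=1.0):
    """Solve the energy balance for T (K) by bisection on [T0, T0 + 200]."""
    def net_loss_minus_input(t):
        radiative = emissivity * SIGMA * (t**4 - t_ambient**4)
        perfusion = h_eff * (t - t_ambient)
        return radiative + perfusion - intensity

    lo, hi = t_ambient, t_ambient + 200.0
    for _ in range(100):  # bisection: 100 halvings is far past float precision
        mid = 0.5 * (lo + hi)
        if net_loss_minus_input(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Example: 1000 W/m^2 on skin near 306 K with the assumed h_eff = 100.
t = steady_temp(1000.0, 306.0)
```

With these assumed numbers the temperature rise is only around 10 K rather than the ~100 K a radiation-only balance predicts, illustrating why the heat carried off by blood flow cannot be neglected.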