jaumzaum

Starting from Planck's law in the Wien approximation (valid here, since $hf \gg k_B T$ across the visible band), the power reaching the pupils is:

$$ P=\int_{F_{min}}^{F_{max}} A f^3 e^{-B f} \,df $$

$$ P = \left[ -A e^{-B f} \left( \frac{f^3}{B} + \frac{3 f^2}{B^2} + \frac{6 f}{B^3} + \frac{6}{B^4} \right) \right]_{F_{min}}^{F_{max}} $$

Where:

$$ A = \frac{2 \epsilon h S \Omega}{c^2} = 7.06 \cdot 10^{-55} \text{ (SI units)} $$

$$ B = \frac{h}{k_B T} = 1.55 \cdot 10^{-13} \text{ (SI units)} $$

Where:

- $P$ = power emitted in the visible spectrum that reaches the pupils
- $\epsilon$ = emissivity of the human body (taken as 0.96)
- $h$ = Planck constant
- $c$ = speed of light
- $k_B$ = Boltzmann constant
- $f$ = frequency
- $T$ = temperature of the human body (taken as 36 °C ≈ 309 K)
- $F_{min}$ = minimum visible frequency, assumed to be 400 THz
- $F_{max}$ = maximum visible frequency, assumed to be 800 THz
- $S$ = surface area of the human body (assumed to be 2 m²)
- $\Omega$ = solid angle subtended at a distance of 2 m by our two pupils (assumed to be 4 mm each), treating the body as a point source

Integrating, I get:

$$ P = 3.1 \cdot 10^{-25} \text{ W} $$
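As a numerical sanity check, here is a short Python sketch that rebuilds $A$ and $B$ from the physical constants and evaluates the antiderivative at both limits. I assume the 4 mm figure is the pupil radius, since that choice reproduces the quoted value of $A$:

```python
import math

h  = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J/K
c  = 2.99792458e8     # speed of light, m/s

eps   = 0.96          # emissivity of the human body
S     = 2.0           # surface area, m^2
T     = 309.15        # 36 degrees C in kelvin
# two pupils of 4 mm radius, at a distance of 2 m
omega = 2 * math.pi * 0.004**2 / 2.0**2

A = 2 * eps * h * S * omega / c**2
B = h / (kB * T)

def antiderivative(f):
    """Closed-form antiderivative of A * f^3 * exp(-B f)."""
    return -A * math.exp(-B * f) * (f**3 / B + 3 * f**2 / B**2
                                    + 6 * f / B**3 + 6 / B**4)

f_min, f_max = 400e12, 800e12
P = antiderivative(f_max) - antiderivative(f_min)
print(f"A = {A:.2e}, B = {B:.2e}, P = {P:.2e} W")
```

This lands within rounding of the result above; the upper limit contributes essentially nothing because $e^{-Bf}$ has already collapsed by 800 THz.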

If we take the average visible photon at 600 THz, with energy $hf \approx 4 \cdot 10^{-19}$ J, the photon rate is:

$$ n = 7,8 \cdot 10^{-7} photons/second $$

With this calculation we find that an observer at a distance of 2 meters from an average person will receive about one visible photon every two weeks (very few indeed).

But these are only the perceived photons (the ones that arrive at the pupils). If we instead take the solid angle to be $4\pi$, the total visible radiation emitted by a human body comes out to about 0.4 photons/second, which is roughly 33,600 photons a day.
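These rates can be reproduced in a few lines. I take $P \approx 3.1 \cdot 10^{-25}$ W into the pupils, the value consistent with the $7.8 \cdot 10^{-7}$ photons/s figure:

```python
import math

h = 6.62607015e-34
P = 3.1e-25                        # visible power into the pupils, W
E = h * 600e12                     # ~4e-19 J per average visible photon

n = P / E                          # photons/s reaching the pupils
wait_days = 1 / n / 86400          # mean wait between photons, days
omega = 2 * math.pi * 0.004**2 / 4.0            # two 4 mm pupils at 2 m
per_day_total = n * (4 * math.pi / omega) * 86400  # emitted over the full sphere
print(f"{n:.1e} photons/s at the eye, one every {wait_days:.1f} days, "
      f"{per_day_total:.0f} photons/day emitted in total")
```

The ratio $4\pi/\Omega \approx 5 \cdot 10^5$ is what scales the tiny pupil flux up to the tens of thousands of photons emitted per day.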

OK, this is more interesting. The real question is: how many photons per second can we actually see?

In this experiment:

https://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html

They find that in a completely dark room, 60% of subjects report seeing something when a flash at 510 nm carrying about 90 photons arrives at the eye. It's estimated that only 10% of those (about 9 photons) reach the retina. So we need 90 photons at the eye within each synapse period, which is about 1 ms given the two-way round trip of the signal. If we substitute that into the calculation (widening the pupils to 8 mm in complete darkness, and setting the emissivity to 1), we find that we could see objects at a temperature of about 225 °C.
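Under the same Wien-approximation model, one can bisect for the temperature at which the flux into two dark-adapted pupils reaches 90 photons per millisecond. A sketch, again taking 8 mm as the pupil radius and keeping the 2 m distance:

```python
import math

h, kB, c = 6.62607015e-34, 1.380649e-23, 2.99792458e8
S = 2.0                                  # emitting area, m^2
omega = 2 * math.pi * 0.008**2 / 4.0     # two dark-adapted 8 mm pupils at 2 m
E = h * 600e12                           # energy of an average visible photon

def photons_per_ms(T):
    """Visible photons reaching the pupils per millisecond at temperature T (K)."""
    A = 2 * h * S * omega / c**2         # emissivity taken as 1
    B = h / (kB * T)
    def F(f):
        return -A * math.exp(-B * f) * (f**3/B + 3*f**2/B**2 + 6*f/B**3 + 6/B**4)
    P = F(800e12) - F(400e12)
    return P / E * 1e-3

# bisect for the temperature giving the 90-photon threshold per ~1 ms flash
lo, hi = 300.0, 1500.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if photons_per_ms(mid) < 90:
        lo = mid
    else:
        hi = mid
print(f"visibility threshold: about {mid - 273.15:.0f} degrees C")
```

The bisection reproduces a threshold in the low 200s of °C, matching the figure quoted above.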

But this does not seem to be true; in fact, it is far from true. The Draper point is defined as the temperature at which objects begin to glow visibly, and it is about 525 °C.

One might think this just reflects slack in the assumptions (pupil size, emissivity, synapse time, and so on), but I could accept a 50 °C discrepancy, not a 300 °C one. If we plug 525 °C into the calculation, we find 317 million visible photons entering the eye each millisecond, about 3 million times more photons than we needed: an error of some 300,000,000%.
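The same model evaluated at the Draper point confirms the enormous surplus (same assumed geometry as above: 8 mm pupil radius, 2 m distance, emissivity 1):

```python
import math

h, kB, c = 6.62607015e-34, 1.380649e-23, 2.99792458e8
S = 2.0                                        # emitting area, m^2
omega = 2 * math.pi * 0.008**2 / 4.0           # two 8 mm pupils at 2 m
A = 2 * h * S * omega / c**2                   # emissivity 1
B = h / (kB * (525 + 273.15))                  # Draper point, 525 degrees C

def F(f):
    return -A * math.exp(-B * f) * (f**3/B + 3*f**2/B**2 + 6*f/B**3 + 6/B**4)

P = F(800e12) - F(400e12)                      # visible power into the pupils
n_per_ms = P / (h * 600e12) * 1e-3             # photons per millisecond
print(f"{n_per_ms:.2e} photons/ms, {n_per_ms/90:.1e} times the 90-photon threshold")
```

So the arithmetic itself is internally consistent; the puzzle is why the physiological threshold and the Draper point disagree so badly.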

I can't see why the Draper point is so much higher than the temperature I calculated. Can someone help me?