What Should the Detector Temperature Be for Measuring Black Body Radiation?

bull0sees

Homework Statement


Suppose you are inside a black body radiation cavity which is at temperature T. Your job is to measure the radiation field in the frequency interval from 10^14 to 89 x 10^14 Hz. You have a detector to do the job. What should the temperature of the detector (T') be?
Explain your answer.
Possible options are T' > T, T' = T, T' < T, T' = 0 K, or is the temperature of the detector irrelevant to the measurement?



Homework Equations
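Presumably the relevant relation here is Planck's law for the spectral energy density of the cavity radiation,

$$ u(\nu, T)\,d\nu = \frac{8\pi h\nu^{3}}{c^{3}}\,\frac{1}{e^{h\nu/kT} - 1}\,d\nu , $$

to be considered over the band from 10^14 to 89 x 10^14 Hz.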





The Attempt at a Solution

 


You post very interesting questions; however, I'm no help with this question either.
 


thank you
 


Does anybody have an answer for this? I've been racking my brain all night but can't come up with a solution :(
 


Shouldn't T' = T? Because if the temperature of the detector were higher than that of the cavity, it would radiate and affect the result; likewise, if it were colder, it would absorb radiation from the cavity to come up to equilibrium...
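Here is a rough numerical sketch of that equilibrium argument, assuming a detector whose absorptivity equals its emissivity (Kirchhoff's law), so that the net radiative exchange in the measurement band is proportional to the integral of u(ν, T) − u(ν, T') over the band. The cavity temperature of 3000 K and the function names below are made up for illustration; only the sign of the result matters.

[CODE]
import numpy as np
from scipy.integrate import quad

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck_u(nu, T):
    """Planck spectral energy density u(nu, T)."""
    return (8.0 * np.pi * h * nu**3 / c**3) / np.expm1(h * nu / (k * T))

def band_energy(T, nu_lo=1e14, nu_hi=89e14):
    """Thermal energy density integrated over the measurement band."""
    val, _ = quad(planck_u, nu_lo, nu_hi, args=(T,), limit=200)
    return val

T_cavity = 3000.0  # hypothetical cavity temperature, K
for T_det in (2000.0, 3000.0, 4000.0):
    # Net exchange ~ (what the detector absorbs from the cavity field at T)
    # minus (what it re-emits at its own temperature T').
    net = band_energy(T_cavity) - band_energy(T_det)
    print(f"T' = {T_det:6.0f} K   net exchange (arb. units): {net:+.3e}")
[/CODE]

The difference changes sign at T' = T: for T' < T the detector is a net absorber, for T' > T a net emitter, and only at T' = T is there no net exchange, which is the equilibrium condition described in the post above.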
 