- #1
randa177
Does anybody have any idea how to solve this problem:
The extinction of light due to its passage through a partially opaque medium is given by Beer’s Law (known by most chemistry students and many physics students, I hope!):
I = Io e^(-T)
where (Io) is the intensity of the light incident on the medium and (I) is the intensity after exiting the medium. (T) is the “optical depth” of the medium.
Suppose you were observing an O-type star embedded in a molecular cloud, and the optical depth at 7 μm was 1. Everything else being equal (detector technology, etc.), would it be easier to detect the star at 1 μm or 7 μm?
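Just to make the law concrete, here is a minimal sketch (my own illustration, not part of the original question) that plugs an optical depth into Beer's Law to get the fraction of starlight that survives the cloud:

```python
import math

def transmitted_fraction(tau):
    """Fraction of incident intensity surviving a medium of
    optical depth tau, per Beer's Law: I/Io = e^(-tau)."""
    return math.exp(-tau)

# At the quoted optical depth of 1, only about 37% of the
# light gets through:
print(transmitted_fraction(1.0))  # ~0.368
```

To compare the two wavelengths, you would evaluate this with the optical depth appropriate to each band; the question hinges on how the cloud's optical depth varies between 1 μm and 7 μm.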