Brightness temperature in remote sensing

In summary, the brightness temperature measured at the TOA increases with the atmospheric transmittance, since a more transparent atmosphere lets more of the surface-emitted energy reach the remote sensing device. The retrieved ground brightness temperature, on the other hand, is inversely proportional to τ: for a given measured Tb and atmospheric emission Tba, the inversion Tg = (Tb - Tba) / τ divides by the transmittance, so a larger τ yields a smaller Tg.
  • #1
jones123
Hi all,

I don't know if I'm on the right forum to ask this, but maybe somebody knows anything about brightness temperatures measured by remote sensing devices.

In a paper that I read "Atmospheric corrections for retrieving ground brightness temperature at commonly-used passive microwave frequencies" by Han et al. (2017), I found the following formula:

Tb = Tg·τ + Tba

where: Tb is the brightness temperature at the TOA, Tg the brightness temperature at the ground, τ the atmospheric transmittance and Tba the brightness temperature of the atmospheric layers emitting into the direction of the TOA.

This would mean that:
Tg = (Tb - Tba) / τ

Now: can anyone explain to me why the brightness temperature at the TOA, Tb, is positively proportional to the atmospheric transmittance τ, whereas the ground brightness temperature is inversely proportional to it?

Thanks already!
 
  • #2
One way to think about it is this:

Tg is inversely proportional to ##\tau## because attenuation lowers the measured Tb for a given Tg; recovering Tg means dividing the atmosphere-corrected signal (Tb - Tba) by ##\tau##, so for a given measurement a larger transmittance yields a smaller retrieved Tg.

Tb is positively proportional to atmospheric transmittance because the observed brightness temperature increases if there is more transmittance, which allows more radiation from the surface to make it to the remote sensing device.
 
  • #3
Hi there,

I'm not an expert in remote sensing, but I can try to explain this based on my understanding. The brightness temperature at the TOA (top of atmosphere) is the temperature measured by the remote sensing device as it looks down on the Earth's surface. This temperature includes contributions from both the ground and the atmosphere. The ground brightness temperature, on the other hand, is only the temperature of the ground itself without any influence from the atmosphere.

The atmospheric transmittance, τ, refers to the amount of energy (in this case, microwave radiation) that can pass through the atmosphere without being absorbed or scattered. So, when τ is high, more energy can pass through the atmosphere and reach the remote sensing device at the TOA, resulting in a higher Tb. On the other hand, when τ is low, less energy can pass through the atmosphere, so the Tb will be lower.

In the formula you mentioned, Tba represents the emission from the atmospheric layers radiating towards the TOA. Note that Tba actually decreases as τ increases: a more transparent atmosphere absorbs less radiation, and (by Kirchhoff's law) an atmosphere that absorbs less also emits less. The inverse dependence of Tg on τ comes directly from rearranging the formula: Tg = (Tb - Tba) / τ, so for a given measured Tb and atmospheric emission Tba, dividing by a larger τ gives a smaller retrieved Tg.
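To make the 1/τ dependence concrete, here is a minimal Python sketch of the inversion Tg = (Tb - Tba) / τ. The numbers are made up for illustration only and are not taken from the Han et al. paper:

```python
def ground_tb(tb, tba, tau):
    """Invert Tb = Tg * tau + Tba for the ground brightness temperature (K)."""
    return (tb - tba) / tau

# Hypothetical values: Tb = 260 K measured at the TOA, Tba = 30 K
# of upward atmospheric emission. Holding both fixed, a larger
# transmittance yields a smaller retrieved ground temperature:
print(ground_tb(260.0, 30.0, 0.9))  # 230 / 0.9 ≈ 255.6 K
print(ground_tb(260.0, 30.0, 0.7))  # 230 / 0.7 ≈ 328.6 K
```

Of course, in a real retrieval Tb and Tba would not stay fixed as τ changes; this just shows where the 1/τ in the inversion comes from.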

I hope this helps to clarify things a bit. Again, I'm not an expert in this field, so if anyone else has a better explanation or can correct any mistakes I may have made, please feel free to chime in!
 

Related to Brightness temperature in remote sensing

1. What is brightness temperature in remote sensing?

Brightness temperature in remote sensing is a measure of the thermal energy emitted by an object or surface. It is often used to study the Earth's surface and atmosphere using satellite imagery.

2. How is brightness temperature measured?

Brightness temperature is measured by remote sensing instruments, such as radiometers, that detect the amount of thermal energy emitted by an object or surface. This measurement is then converted into a temperature value.

3. What is the difference between brightness temperature and actual temperature?

Brightness temperature is a measure of the thermal energy emitted by an object, while actual temperature is a measure of the average kinetic energy of the particles in an object. Brightness temperature can be affected by factors such as the emissivity of the surface and atmospheric conditions, while actual temperature is a more direct measure of the object's temperature.

4. How is brightness temperature used in remote sensing?

Brightness temperature is used in remote sensing to study the Earth's surface and atmosphere. It can provide information about the temperature of different objects and surfaces, which can be useful in studying weather patterns, land surface properties, and the effects of climate change.

5. Can brightness temperature be used to measure temperature changes over time?

Yes, brightness temperature can be used to measure temperature changes over time. By comparing brightness temperature measurements taken at different times, scientists can track changes in the Earth's surface and atmosphere, such as changes in land surface temperature or the presence of atmospheric moisture. This information can be used to monitor and predict changes in the environment and climate.
