# Brightness Temperature, Thermal Spectrum

This problem is intended to show that astronomers can sometimes be fooled into thinking a cosmic source of radiation is thermal when it is not. Here is the problem:

Consider a cosmic source emitting one hundred solar luminosities, all within 1 MHz of the single frequency 22 GHz, and assume that the source is compact: only 0.001 parsec in radius. Such a source exists in NGC 3079. What crazy temperature would an astronomer infer by matching the spectrum of this source at 22 GHz to a thermal spectrum?

Relevant equations:

Planck function for blackbody radiation (per unit frequency): B_ν = (2hν³/c²) · 1/(e^(hν/kT) − 1)

h = Planck constant
ν = frequency
c = speed of light
k = Boltzmann constant
T = temperature

flux: F = L/(4πr²)

First I solved the Planck function for T. I have all the values I need to plug in except B_ν (the blackbody specific intensity, in erg s-1 cm-2 Hz-1 sr-1), so my idea was to compute the flux and manipulate it until I had B_ν.
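For reference, here is what I get when I solve the Planck function above for T algebraically (my own rearrangement, worth double-checking):

```latex
T = \frac{h\nu}{k \,\ln\!\left(1 + \dfrac{2h\nu^{3}}{c^{2}B_\nu}\right)}
```

Note that the +1 inside the logarithm has to be kept: when 2hν³/(c²B_ν) is much smaller than 1, dropping it means taking the log of a tiny number, which is negative.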

First I calculated the total luminosity: 100 (solar luminosities) = 3.8e35 erg s-1

Then the flux, using r = 0.001 pc = 3.0857e15 cm, which gave: F = 3175.9 erg s-1 cm-2
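As a quick arithmetic check on that step (a minimal sketch using the same values quoted above):

```python
import math

L_sun = 3.8e33          # solar luminosity, erg/s
L = 100 * L_sun         # source luminosity, erg/s
r = 3.0857e15           # 0.001 pc in cm

# inverse-square flux at radius r
F = L / (4 * math.pi * r**2)
print(F)                # ~3.18e3 erg s^-1 cm^-2
```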

Since this is energy per unit time per unit area, and I need energy per unit time per unit area per unit frequency per unit solid angle, I divided the flux above by the frequency (2.2e10 Hz), then divided again by 2π sr (one hemisphere of solid angle). That should give me everything I need.

From calculations B = 2.297e-8 erg s-1 cm-2 Hz-1 sr-1
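The same conversion in code, just reproducing the two divisions described above:

```python
import math

F = 3175.9              # flux from the previous step, erg s^-1 cm^-2
nu = 2.2e10             # 22 GHz, in Hz

# divide by the frequency, then by one hemisphere of solid angle (2*pi sr)
B = F / nu / (2 * math.pi)
print(B)                # ~2.30e-8 erg s^-1 cm^-2 Hz^-1 sr^-1
```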

Then I plugged B and the rest of the values into the solved-for expression to calculate T, but the answer came out negative.
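To see where the sign might be going wrong, here is the inversion of the Planck function evaluated with the B above (a sketch with standard CGS constants I'm assuming, not the exact values used in the attempt):

```python
import math

h = 6.626e-27           # Planck constant, erg s
c = 2.998e10            # speed of light, cm/s
k = 1.381e-16           # Boltzmann constant, erg/K
nu = 2.2e10             # 22 GHz, in Hz
B = 2.297e-8            # specific intensity from above, erg s^-1 cm^-2 Hz^-1 sr^-1

x = 2 * h * nu**3 / (c**2 * B)   # tiny here, of order 1e-8
T = (h * nu / k) / math.log(1 + x)
print(T)                # positive, of order 1e8 K

# forgetting the "+ 1" would give math.log(x) < 0, hence a negative T
```

With the +1 kept inside the logarithm the result is positive, so a negative answer suggests the +1 got dropped somewhere in the algebra or the plugging-in.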

I hope my explanation makes sense; it is hard to explain on here.
I'm not even sure whether my approach is correct, and if it is, I'm not sure whether I calculated B correctly.

Thanks
