I have been reading about climate change recently and have run into a problem understanding the Earth's energy budget. I will try to summarise my question, but am happy to expand if needed. Bear in mind I am just an interested average Joe with little science background and probably a very rudimentary grasp of physics.

The basic point I am grappling with is how the temperature is calculated. I am not sure I understand exactly what is being stated, as various sites explain the calculation in different ways. What I am uncertain of is how an effective temperature for the Earth is calculated. The argument seems to be that if the Earth were a perfect blackbody with no atmosphere, the surface temperature would be around 5 °C. However, due to the Earth's albedo, we adjust this figure to give an effective temperature of about -18 °C. As the Earth's measured surface temperature is 15 °C, there is a difference of 33 °C, which is attributed to the greenhouse effect of the atmosphere.

This seems to derive from a calculation that uses the average solar radiative flux of 1360 W/m². That number is divided by 4 to give an average over the Earth's surface, because the theoretical blackbody is a rotating sphere: the energy it intercepts is equivalent to that falling on a disk of the same diameter, but the sphere has a surface area four times that of the disk. On average, therefore, we have approximately 340 W/m² for the surface of our theoretical blackbody. This is the number used for the top-of-atmosphere (TOA) energy flux shown in many energy-budget diagrams.

I follow that, but it seems to be just a matter of geometry. I don't see why a square metre becomes the effective area of heat transfer, when W/m² is merely a unit of measure. To try to explain: as I understand it, a perfect blackbody absorbs all incident radiation and has no albedo. The critical point to me is whether this absorption occurs at the molecular level, or at a smaller or larger scale.
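To make sure I'm describing the calculation correctly, here it is as arithmetic, as I understand it. I'm assuming the Stefan-Boltzmann relation (emitted flux = σT⁴) and an albedo of about 0.3, since those seem to be what the sites I've read are using:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1360.0              # average solar radiative flux, W/m^2
ALBEDO = 0.3            # roughly the Earth's albedo, as I understand it

# No albedo: the absorbed flux averaged over the sphere is S/4,
# so at equilibrium sigma * T^4 = S / 4.
t_blackbody = (S / 4 / SIGMA) ** 0.25

# With albedo: only (1 - albedo) of the incident flux is absorbed.
t_effective = ((1 - ALBEDO) * S / 4 / SIGMA) ** 0.25

print(round(t_blackbody - 273.15, 1))  # about +5 C
print(round(t_effective - 273.15, 1))  # about -19 C
```

Those two numbers match the "around 5 °C" and "about -18 °C" figures I keep seeing, so I think this is the calculation being made, and the measured 15 °C minus -18 °C gives the 33 °C attributed to the greenhouse effect.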
Imagine the surface of our blackbody sphere. At a point on the equator, with sunlight shining directly at that point, how much radiation can an individual, totally absorbent molecule absorb? How much is available for it to absorb? My guess is that it absorbs as much of the available energy as it needs to 'heat' up and reach thermal equilibrium.

Presuming my guess is right, now consider another molecule at a point very close to the pole. What will be the effect on that molecule, provided the full disk of the sun is visible to it? Bear in mind that this molecule is part of a blackbody with no atmosphere, so there is no atmospheric absorption or scattering, and the body has no albedo. Does our molecule absorb more or less radiation than the one at the equator, and does it heat more or less? Does it reach thermal equilibrium more quickly, more slowly, or at the same rate? Put another way, is there more or less radiation available to our pole-based molecule compared with our equator-based molecule? The answer seems to me to determine how much energy is available to our blackbody and how hot it will be at equilibrium. If for some reason radiative flux is only effective in units of square metres, then the calculation I noted above is correct. But if the flux is effective at the level of individual molecules, then surely it is not?

Another thought experiment illuminates my question in a different way. The greenhouse effect requires back-radiation from greenhouse-gas molecules, so individual molecules can clearly 'heat' and give off energy. An individual CO2 molecule 1 mm above the pole of our theoretical blackbody must logically, if exposed to the full disk of the sun, heat as much and as quickly as a molecule 1 mm above the equator. Why would a molecule on the surface of the blackbody not behave the same way? Clearly, there must be something about radiation and heat transfer that I don't understand.
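To pin down the disk-versus-sphere geometry I described earlier, I tried writing the averaging out as a little sum. This is only a sketch of my own understanding: I'm assuming that a patch of surface at an angle θ from the point directly under the sun receives the full flux reduced by cos(θ) (because the same beam is spread over a larger tilted area), and that the night side receives nothing:

```python
import math

S = 1360.0   # solar flux at the distance of the Earth, W/m^2
n = 200_000  # number of latitude rings for the numerical sum

total_flux = 0.0
total_area = 0.0
for i in range(n):
    theta = (i + 0.5) * math.pi / n       # angle from the subsolar point
    ring_area = math.sin(theta)           # relative area of this ring
    flux = S * max(math.cos(theta), 0.0)  # tilted patches get less; night side gets 0
    total_flux += flux * ring_area
    total_area += ring_area

average = total_flux / total_area
print(round(average, 1))  # close to S / 4 = 340
```

The average comes out at about 340 W/m², i.e. S/4, which is where the factor of 4 seems to come from. But notice that in this picture a patch (or molecule?) near the pole, at θ close to 90°, receives almost nothing, even though the full disk of the sun is visible to it, and that is exactly the part I am struggling to reconcile with the molecule-level view above.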
Can anyone explain simply?