# Does Infrared Light Feel Hotter Than Visible Light?

• p3t3r1
In summary, the sun emits visible light as well as infrared light, but because of the Earth's atmosphere and the reflective properties of objects, the amount of heat felt from the sun's radiation is not solely dependent on its temperature. The feeling of heat is also influenced by how well our skin absorbs or reflects different wavelengths of light, with infrared light being absorbed more readily and therefore feeling hotter. In terms of energy per photon, however, infrared light actually carries less energy than visible light.
p3t3r1
Sorry, kind of stupid question, but just wondering. Thanks.

Visible light has more energy per photon, but the light that comes from the sun is spread out, so it reaches us at a much lower intensity than it would have right at a black-body emitter. If you think that infrared light is hotter, then try putting your hand on a light bulb, a red-hot stove burner, or a white-hot welding rod. Well, actually don't, but note that if you see an object glow red, blue, or white because of thermal emission, then it is visible light you see and not infrared. Of course there will also be infrared light, and the proportion of each frequency will depend on temperature. Better idea: if you think visible light is not hot, then take a trip to the sun.
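The energy-per-photon point is easy to check numerically: E = hf = hc/λ, so halving the wavelength doubles the photon energy. A minimal sketch (the 500 nm and 1000 nm wavelengths are just illustrative picks for visible and near-IR):

```python
# Energy per photon: E = h*c / wavelength
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light in vacuum, m/s

def photon_energy(wavelength_m):
    """Return the energy in joules of one photon of the given wavelength."""
    return h * c / wavelength_m

E_visible = photon_energy(500e-9)    # green visible light
E_infrared = photon_energy(1000e-9)  # near-infrared

print(E_visible / E_infrared)  # the visible photon carries exactly 2x the energy
```

So photon for photon, visible light deposits more energy than infrared; the question is how many photons arrive and how many are actually absorbed.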

The sun closely resembles a blackbody in its emission spectrum with the peak emission in the yellow green area.
The difference in what you feel is primarily due to how many photons get absorbed instead of reflected.
You might have noticed that a black surface (say a car in the parking lot) which doesn't reflect much light is much hotter in sunlight than a white surface which reflects most of the light falling on it.
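The absorbed-versus-reflected point can be put in rough numbers: the power a surface absorbs is (1 − reflectance) × irradiance × area. A sketch, assuming roughly 1000 W/m² of clear-sky sunlight at the surface and illustrative reflectances for dark and white paint:

```python
IRRADIANCE = 1000.0  # W/m^2, rough clear-sky solar irradiance at ground level

def absorbed_power(reflectance, area_m2, irradiance=IRRADIANCE):
    # For an opaque surface, power that is not reflected is absorbed
    # (transmission neglected).
    return (1.0 - reflectance) * irradiance * area_m2

# Illustrative visible-light reflectances: dark paint ~0.05, white paint ~0.8
dark_car = absorbed_power(0.05, 1.0)   # ~950 W absorbed per square metre
white_car = absorbed_power(0.80, 1.0)  # ~200 W absorbed per square metre
print(dark_car, white_car)
```

On these assumed numbers the dark surface soaks up nearly five times the power of the white one, which is why it ends up so much hotter in the parking lot.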

NoTime said:
The sun closely resembles a blackbody in its emission spectrum with the peak emission in the yellow green area.
It depends on which distribution you are considering: if you consider the distribution in wavelength, the maximum corresponds to ~502 nm (green), but if you consider the distribution in frequency, the maximum corresponds to ~884 nm, which is in the infrared.
I took 5778 K as the Sun's surface temperature (Wikipedia).
The topic was also discussed here:
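The two peaks quoted above follow from the two forms of Wien's displacement law: λ_max = b/T for the wavelength distribution and ν_max = b′T for the frequency distribution, with the CODATA values of the two constants. A quick numerical check at T = 5778 K:

```python
T = 5778.0            # K, solar surface temperature
c = 2.99792458e8      # m/s

b_wavelength = 2.897771955e-3  # m*K, Wien constant for the wavelength distribution
b_frequency = 5.878925757e10   # Hz/K, Wien constant for the frequency distribution

lambda_peak = b_wavelength / T       # peak of B_lambda
freq_peak = b_frequency * T          # peak of B_nu
lambda_of_freq_peak = c / freq_peak  # that frequency peak, expressed as a wavelength

print(lambda_peak * 1e9)          # ~501.5 nm: green
print(lambda_of_freq_peak * 1e9)  # ~883 nm: near-infrared
```

Same blackbody, same temperature, yet the two distributions peak at wavelengths almost a factor of two apart.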

Last edited:
Think about the total amount of absorbed energy, which is spread over a range of wavelengths. The sun is nearly a 6000 K blackbody, but walking around on a sunny day doesn't feel like the inside of a 6000 K oven, because a lot of the energy is absorbed by the atmosphere (and also because hardly any of the radiated energy actually falls on the Earth).
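The "hardly any of it falls on Earth" point is pure geometry: Earth's disc intercepts a fraction πR_E²/(4πd²) of the Sun's total output. A quick estimate with R_E ≈ 6371 km and d ≈ 1 AU:

```python
import math

R_earth = 6.371e6  # m, mean Earth radius
d = 1.496e11       # m, mean Earth-Sun distance (1 AU)

# Fraction of the Sun's total radiated power intercepted by Earth's disc:
# the disc area pi*R^2 divided by the full sphere of radius d.
fraction = math.pi * R_earth**2 / (4 * math.pi * d**2)
print(fraction)  # ~4.5e-10: well under a billionth of the Sun's output
```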

Same for a lightbulb filament or the arc in a short-arc bulb: most of the thermal energy is dissipated before reaching you. Even though a filament may reach 3000 K, and an arc around 6000 K, the total radiated power in watts is small.

Also, infrared absorption is qualitatively different from visible and UV absorption: the former excites vibrational modes in a material, while the latter two drive electronic transitions.

lightarrow said:
It depends on which distribution you are considering: if you consider the distribution in wavelength, the maximum corresponds to ~502 nm (green), but if you consider the distribution in frequency, the maximum corresponds to ~884 nm, which is in the infrared.
I took 5778 K as the Sun's surface temperature (Wikipedia).
The topic was also discussed here:
Oh my. It's been forever since I've looked at something like that.
However, since wavelength is just c/f, at a crude level this is just wrong.
The distinction is that the peaks in your formulas represent two different things.
Someone will have to correct me if I'm wrong, but in one case I think you end up with max photon count while the other is total momentum.

Edit: the latter being the normal way of looking at this (green).

Last edited:
IR feels hot because the nerve cells in our skin report large quantities of that frequency of photon as heat. You can get hit with higher doses of mid-IR or microwaves (both just outside the thermal-IR range) and not feel anything. Similarly, our skin is fairly transparent to visible light, and it takes a good deal of it to heat up our skin.
The same answer applies to our eyes seeing visible light, and not thermal IR. You can heat something up, or just produce thermal radiation (as in a CO2 laser), and our eyes will not be able to see it, at least until that something is hot enough to give off visible photons.

Cool, thanks for the great responses.

Jakell said:
Similarly, our skin is fairly transparent to visible light, and it takes a good deal of it to heat up our skin.
So what are you saying? It just heats up our insides? Obviously not much gets through us, or our shadow would be kind of hard to see. I would think how well our skin reflects light would be more important than how well it transmits light in terms of how hot it makes us feel. Moreover, if the skin were really good at transmitting light, then wouldn't we be able to see through it?

Though hopefully you can figure it out from the responses, no one directly addressed the question asked in the OP. The question is wrong: infrared doesn't feel hotter than an equivalent amount (# of photons) of visible radiation.

NoTime said:
Oh my. It's been forever since I've looked at something like that.
However, since wavelength is just c/f, at a crude level this is just wrong.
The distinction is that the peaks in your formulas represent two different things.
Someone will have to correct me if I'm wrong, but in one case I think you end up with max photon count while the other is total momentum.

Edit: the latter being the normal way of looking at this (green).
Integrating in dν (first case) or dλ (second case), you get the total energy density either way; this is the only physically meaningful concept.

Last edited:
russ_watters said:
Though hopefully you can figure it out from the responses, no one directly addressed the question asked in the OP. The question is wrong: infrared doesn't feel hotter than an equivalent amount (# of photons) of visible radiation.

Actually it depends on how close to "black" your body is at different wavelengths. My skin looks fairly pale in visible light, so presumably it reflects some part of the visible light falling upon it. I don't know what the "albedo" of my skin is at infrared wavelengths, but if it's significantly less than at visible wavelengths, then that could explain the observation. I agree with you, however, that no compelling case has so far been made that the original "observation" is even true.

lightarrow said:
Integrating in dν (first case) or dλ (second case), you get the total energy density either way; this is the only physically meaningful concept.

Yeah, one maximizes the energy per incremental wavelength band and the other maximizes the energy per incremental frequency band; that's why they are two different things.

In other words, say you split all the light from λ = 200 nm through to λ = 20,000 nm into many bands of, say, 10 nm each; then the band corresponding to approximately hc/(λkT) ≈ 5 will contain the most energy of all those bands. If on the other hand you split the light into frequency bands of, say, 1 THz per band, then the band corresponding to approximately hf/(kT) ≈ 2.82 will contain the most energy of all those bands.

When you look at it like this, it's easy to see why they are two different things. At the long-wavelength end of the spectrum under consideration (20,000 nm), each 1 THz band corresponds to roughly a 1250 nm wavelength interval, whereas at the short-wavelength end (200 nm) each 1 THz band corresponds to a mere 0.13 nm wavelength interval. That is, each 1 THz frequency band spans about 125 of the 10 nm bands at the long-wavelength end of the spectrum but only about 0.013 of one 10 nm band at the short-wavelength end.

So yes, in each case we are comparing energy per band; however, there is no direct correspondence or consistency in the widths of the bands between the two cases, which is where the discrepancy arises. Actually I'm pretty sure that you (lightarrow) were already aware of this; I'm just explaining it in case anyone else is confused.
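The band-counting argument above can be verified numerically with the Planck law in its two forms, B_λ = (2hc²/λ⁵)/(e^{hc/λkT} − 1) and B_ν = (2hν³/c²)/(e^{hν/kT} − 1). A sketch that splits 200–20,000 nm into 10 nm bands, splits the same spectral range into 1 THz bands, and finds which band holds the most energy in each case:

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K
T = 5778.0           # K, solar surface temperature

def B_lambda(lam):
    """Planck spectral radiance per unit wavelength."""
    return 2*h*c**2 / lam**5 / (math.exp(h*c/(lam*k*T)) - 1)

def B_nu(nu):
    """Planck spectral radiance per unit frequency."""
    return 2*h*nu**3 / c**2 / (math.exp(h*nu/(k*T)) - 1)

# Energy per 10 nm wavelength band over 200-20,000 nm
dlam = 10e-9
lam_centers = [200e-9 + dlam*(i + 0.5) for i in range(1980)]
lam_best = max(lam_centers, key=lambda lam: B_lambda(lam) * dlam)

# Energy per 1 THz frequency band over the same spectral range
dnu = 1e12
nu_lo, nu_hi = c/20000e-9, c/200e-9
n_bands = int((nu_hi - nu_lo) / dnu)
nu_centers = [nu_lo + dnu*(i + 0.5) for i in range(n_bands)]
nu_best = max(nu_centers, key=lambda nu: B_nu(nu) * dnu)

print(lam_best * 1e9)     # ~505 nm: a green band wins the wavelength split
print(c / nu_best * 1e9)  # ~883 nm: the winning frequency band is in the near-IR
```

The same spectrum, sliced two different ways, puts its "biggest" band in the green in one case and in the near-infrared in the other, exactly as described above.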

Last edited:
russ_watters said:
Though hopefully you can figure it out from the responses, no one directly addressed the question asked in the OP. The question is wrong: infrared doesn't feel hotter than an equivalent amount (# of photons) of visible radiation.

No, Jakell did, actually: as he said, the nerves in skin responsible for sensing temperature are more sensitive to IR radiation than to visible light, so indeed IR does *feel* hotter, even given equivalent amounts of power.

uart said:
Actually it depends on how close to "black" your body is at different wavelengths.
This isn't about any specific way a ray of light happens to come to be. All I said is that if the number of photons is equal, the one with the higher frequency has more energy. These photons can come from a radio tower, lasers, reflection, filtration, or, yes, black body radiation. But this is not just about black body radiation.

russ_watters said:
This isn't about any specific way a ray of light happens to come to be. All I said is that if the number of photons is equal, the one with the higher frequency has more energy. These photons can come from a radio tower, lasers, reflection, filtration, or, yes, black body radiation. But this is not just about black body radiation.

I think you may have misread what I said Russ. By "your body" I was literally referring to your body (that is your skin and flesh etc) as the absorber. I was not making any reference at all to the radiation source. If you include the very next sentence after the one you quoted it says,
Actually it depends on how close to "black" your body is at different wavelengths. My skin looks fairly pale in visible light, so presumably it reflects some part of the visible light falling upon it.
That's the nice thing about context, when you read more than one sentence it often makes more sense.

Last edited:

