# Is there any relation between wavelength and brightness?

1. Nov 13, 2012

### tris_d

What makes, for example, blue color appear darker or lighter? Is it just amount of photons (intensity) or is there any relation between brightness and wavelength? Or is there something else that comes into equation as well?

2. Nov 13, 2012

### Simon Bridge

The sensitivity of the optical equipment being used to measure it - i.e. the human eye has evolved to respond strongly to yellows and greens so these seem brighter and more noticeable.
How bright a color appears also depends on the colors around it and the context you are looking at it in.

3. Nov 13, 2012

### Hetware

"Brightness" as you have described it is fairly subjective. We might consider "light blue" to be brighter than "dark blue", but "black light" (really dark blue) has more energy per photon then visible light. That's the quantum physical way to describe it.

Intensity is the amount of energy passing through a specified area in a specified amount of time. In classical electromagnetism, the intensity of light is proportional to the square of the amplitude of the waves. In quantum mechanics it depends on the frequency of the light as well as the number of photons.

One photon of blue light has more energy than one photon of red light, but you can have a red light source which is more intense than a blue light source.
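This comparison can be sketched numerically using the Planck relation E = hc/λ (the photon emission rates below are made-up numbers, purely for illustration):

```python
# Photon energy E = h*c/wavelength: shorter wavelength means more energy per photon.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon in joules."""
    return h * c / wavelength_m

E_blue = photon_energy(450e-9)  # ~450 nm blue photon
E_red = photon_energy(650e-9)   # ~650 nm red photon

# One blue photon carries more energy than one red photon...
assert E_blue > E_red

# ...but a red source emitting enough photons per second is still more intense.
red_rate, blue_rate = 1e19, 1e18  # photons per second (illustrative)
red_power = red_rate * E_red      # watts
blue_power = blue_rate * E_blue
assert red_power > blue_power
```

So intensity depends on both how many photons arrive and how much energy each one carries.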

"Brightness" as we experience it with our eyes depends on how our eyes work. We can only detect a small slice of the electromagnetic spectrum, and we sense different frequencies of light in different ways.

4. Nov 13, 2012

### ZapperZ

Staff Emeritus
Be aware that what you are categorizing as bright appears to be based on what you perceive with your eyes. Do you think this is a good detector?

Zz.

5. Nov 13, 2012

### tris_d

Ok, let me define brightness as the light captured in some photo: we open it in Photoshop, convert it to a grayscale image, and then those pixels will have some value from 0 to 100, where 0 is black, 100 is white, and in between are shades of gray.

So then, with all the photons being the same wavelength, the brightness of each pixel will be proportional to the number of photons that impacted that particular pixel?

I can see how light intensity would be defined by the number of photons, but what does frequency have to do with it?

If frequency defines intensity too, would that mean that one blue photon with higher frequency could produce a brighter pixel than another blue photon with lower frequency?

I thought frequency defines color, that frequency is just another face of the wavelength, and that the two are always in a constant relation.

6. Nov 13, 2012

### ZapperZ

Staff Emeritus
7. Nov 13, 2012

### tris_d

Thank you. If you have some more detailed articles along the same lines, let me know. I want to know all there is to know about brightness, intensity, and whatever else comes into that equation.

8. Nov 13, 2012

### ZapperZ

Staff Emeritus
You could have done a quick search and come up with the same thing.

https://www.physicsforums.com/blog.php?b=3588 [Broken]

Zz.

Last edited by a moderator: May 6, 2017
9. Nov 13, 2012

### Hetware

Turning an image to grayscale is just removing information. If we are talking digital photography, then the detector responds to certain frequency ranges in such a way as to add a certain number of red, green, or blue bits to your image. In some ways traditional photographic emulsion using chemical excitation is more faithful to the original scene.

Still, brightness is in the eye of the beholder, even if it is an electronic eye.

Frequency and color are related, but what you perceive is not frequency. You have three different optical receptors in your retina. A combination of frequencies can cause the same sensation as one frequency. The colors of the rainbow are "honest", frequency-related colors. The colors of your TV screen are "playing tricks on your eyes". They are causing your receptors to fire, due to the amount of red, green or blue sent from each pixel.

I actually spent some time trying to figure out what certain brightnesses indicated in relation to some video shot on 9/11/01 of the World Trade Center Building #2. There was an outflow of brightly glowing molten material which I wanted to determine the temperature of.

What I found is that it's a very tricky problem. Many of the pixels are saturated, which means that the information is truncated. 100% saturated pixels only tell us the minimum intensity. They are silent as to how far above the minimum the original scene was.
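The truncation can be mimicked in a couple of lines (an 8-bit ceiling is assumed here purely for illustration):

```python
# Saturated pixels clip at the sensor's ceiling, destroying any information
# about how far above the ceiling the true signal was.
true_signal = [50, 200, 255, 400, 1000]  # hypothetical incoming intensities
ceiling = 255                            # illustrative 8-bit saturation level

recorded = [min(v, ceiling) for v in true_signal]
print(recorded)  # [50, 200, 255, 255, 255]
# The last three pixels are now indistinguishable, although the originals differed.
```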


Comparing the black body radiation of the surrounding flames (which were not saturated) gave me some means of calibration.

Second Law of Thermodynamics: heat don't flow uphill.

You might be interested in the theory of black-body radiation. It has a lot to do with frequency and "brightness". It is also instructive if you are interested in photography, and someone tells you the Kelvin temperature equivalence of your light source.
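A minimal sketch of the temperature-to-colour connection via Wien's displacement law (constant taken from standard tables):

```python
# Wien's displacement law: the peak wavelength of a black body shifts with temperature.
WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_nm(temperature_k):
    """Wavelength (in nm) at which a black body at the given temperature radiates most."""
    return WIEN_B / temperature_k * 1e9

# A ~5800 K source (roughly the Sun) peaks near 500 nm, in the visible band;
# a ~3200 K tungsten lamp peaks in the near infrared and looks "warmer".
print(round(peak_wavelength_nm(5800)))  # 500
print(round(peak_wavelength_nm(3200)))  # 906
```

This is the sense in which a photographer's "Kelvin temperature" of a light source describes its spectral balance.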

It can be a lot of fun to play with RGB (red, green, blue) combinations, and their anti-combinations (yellow, magenta and cyan).

Last edited: Nov 14, 2012
10. Nov 14, 2012

### Drakkith

Staff Emeritus
Let me use a real world example that I commonly deal with when using my CCD camera for astrophotography. My telescope focuses light down onto the CCD sensor. The charge on each pixel is built up during exposure by photons being absorbed by the pixels. Each pixel can hold a certain number of electrons before reaching "saturation". The CCD's charge amplifier reads the amount of charge during readout and converts that into a voltage signal. This signal, being 16 bits, can take 65,536 different values (0 to 65,535). Some CCD pixels can hold, say, 55,000 electrons, while others can hold around 80,000.

The actual number is generally different for different sensors and doesn't matter that much really. It only matters that these electrons are converted into a 16-bit signal which is then displayed as a certain color or grayscale pixel on your screen. Since we can't detect a difference of 1/65,535 in brightness with our eye, we have to "stretch" the values upon display, which only means that we make our high and low points, i.e. the values that represent black and white, different. So I could bring my white point down to 5,000 if I have a very low light picture, to make it possible to even see anything. If I just left the white point at 65,535 the whole picture would look black.
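The "stretching" described above is just a linear remap of the raw values; a minimal sketch, assuming made-up black and white points (NumPy):

```python
import numpy as np

def stretch(raw, black_point, white_point):
    """Linearly remap raw sensor values so black_point -> 0.0 and white_point -> 1.0,
    clipping anything outside that range."""
    scaled = (raw.astype(float) - black_point) / (white_point - black_point)
    return np.clip(scaled, 0.0, 1.0)

raw = np.array([0, 1000, 2500, 5000, 60000])  # 16-bit ADU values (illustrative)

# With the white point left at 65,535 a faint image is nearly all black:
print(stretch(raw, 0, 65535))  # ~[0, 0.015, 0.038, 0.076, 0.916]
# Bringing the white point down to 5,000 makes the faint detail visible:
print(stretch(raw, 0, 5000))   # [0, 0.2, 0.5, 1.0, 1.0]
```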

So let's say a pixel is read out and the charge is converted into a signal that measures as 65,000. What does this tell us about the actual light itself? Can it tell us the wavelength of the light? No. It can only tell us how many electrons were built up in that pixel.

So how do we get nice color pictures? We use filters and we take multiple exposures. OR we use a single exposure, but every single pixel has its own filter in front of it, either red, blue, or green. This is known as a Bayer array. So, if your software doesn't know you are using a color CCD sensor with a Bayer array, it will simply display your image as a grayscale monochrome image. I personally use a monochrome CCD, which means that my sensor doesn't have a Bayer array. Instead I use a separate filter wheel with RGB or other filters to get my different color frames and then combine them into a color image.
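A toy sketch of the RGGB Bayer layout described above (array size and labels purely illustrative):

```python
import numpy as np

# A 4x4 sensor with an RGGB Bayer mosaic: each pixel records only one channel.
pattern = np.array([['R', 'G'], ['G', 'B']])
mosaic = np.tile(pattern, (2, 2))
print(mosaic)

# Each colour plane is sampled at only a quarter (R, B) or half (G) of the pixels;
# demosaicing software interpolates the missing samples into a full-colour image.
red_fraction = (mosaic == 'R').mean()
green_fraction = (mosaic == 'G').mean()
print(red_fraction, green_fraction)  # 0.25 0.5
```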
Nope. ALL photons of ANY frequency with enough energy to excite an electron will be able to contribute to the pixel's final value. This is why filters are important. We reject the wavelengths that we DON'T want to see. An unfiltered CCD typically has a range of wavelengths that it responds to, with light of around 1,000 nm being the lowest energy capable of exciting an electron, and 300 nm being the highest energy light that it can respond to. Higher energy light is usually absorbed in the small features of the CCD pixels before it can reach the photosensitive layer.
One more thing on frequency. While CCDs have a range of wavelengths they respond to, they do not respond to all of these wavelengths equally well. See page 5 of the following link: http://www.stargazing.net/david/QSI/KAI-2020LongSpec.pdf

The graph in the lower left of page 5 represents the Quantum Efficiency of the chip. The QE is the percentage of light reaching the CCD that ends up converted to photoelectrons. The graph is labeled from 0.0-0.6, with 0.6 being 60% efficiency. As you can see, the graph peaks in the 450 nm range, which is the blue region, for a monochrome CCD. The colored lines represent the 3 different color filters of a Bayer filter that come with the color chips. So even light of the best wavelength for this particular CCD is only being read with 56-57% efficiency, and much of the spectrum is far worse.
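The effect of a wavelength-dependent QE on a pixel value can be sketched like this (the QE table below is made up, only loosely shaped like a typical curve):

```python
# Expected photoelectrons = incident photons * quantum efficiency at that wavelength.
# The QE values here are illustrative, not taken from any real sensor datasheet.
qe_by_wavelength_nm = {350: 0.30, 450: 0.56, 550: 0.50, 650: 0.40, 900: 0.15}

def photoelectrons(n_photons, wavelength_nm):
    """Expected number of photoelectrons for a given photon count and wavelength."""
    return n_photons * qe_by_wavelength_nm[wavelength_nm]

# Equal photon counts at different wavelengths give different pixel values:
print(round(photoelectrons(10000, 450)))  # 5600
print(round(photoelectrons(10000, 900)))  # 1500
```

So two equally intense beams at different wavelengths can record as different "brightnesses" purely because of the detector's response curve.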

Not that this CCD is bad; this is actually a very good efficiency for the price range. For comparison, the average QE for photographic film is well under 10%. A QE is far harder to quantify for the human eye, as vision is not just a mechanical process of detecting light and turning it into a value, but an extremely complicated process involving multiple receptors, the timing of these receptors firing, and dozens if not hundreds of other things. Personally I would venture a guess and say that the QE of normal color vision in the daytime is generally less than 1%. But don't quote me on that.

You are correct.

11. Nov 14, 2012

### sophiecentaur

Frequency and wavelength are related by
c = fλ where c is the wave speed.

A monochromatic wave will excite the three colour receptors in the eye and produce three signal values. The brain interprets this and assigns the combination a name which we call the colour. We are aware of vastly more colours than the ones corresponding to the spectrum. The eye is easily fooled into believing that the result of excitation with several frequencies is the same as a spectral colour. Otherwise colour TV and printing wouldn't work.
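As a quick numerical check of the relation c = fλ:

```python
C = 2.998e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m):
    """Frequency from wavelength via c = f * lambda."""
    return C / wavelength_m

# Green light near 550 nm corresponds to roughly 5.45e14 Hz:
f = frequency_hz(550e-9)
print(f"{f:.3e}")  # 5.451e+14
```

Since c is fixed (in vacuum), specifying either the frequency or the wavelength of a monochromatic wave determines the other.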

12. Nov 14, 2012

### tris_d

Oh man! This is not simple. -- Thank you all, I'm chewing on it.

13. Nov 14, 2012

### Drakkith

Staff Emeritus
Unless you are working simple idealized problems, nothing is ever simple!

14. Nov 14, 2012

### Simon Bridge

I prefer to distinguish "complex" and "difficult" ... the second is subjective though the two terms are often used as synonyms in common language. So: Unless you are working simple idealized problems, nothing is ever easy!

It would be much harder for the eye to respond to each possible wavelength with a unique signal ... I suspect the resulting process would be more complex. Breaking the photon into three parts and doing the reconstruction is a handy way to simplify the process and it works very well despite the odd fudge (like "magenta").

This is certainly more difficult to follow than "color comes from wavelength" but it is no more complex.

The color of light does depend on its wavelength, just like we tell grade-schoolers ... however, the experience of color is subjective and depends on how the eye and brain work together. How we get so much agreement on which color is which is certainly complex. IIRC studying it is a big part of work on the mind-body problem.

15. Nov 14, 2012

### tris_d

And if we swapped eyes, maybe you would see as purple what you previously called green. Well, maybe not to that extent, but would it be theoretically possible that we see different colors and just use the same names to describe them? Is that what you mean when you say "agreement on which color is which"?

16. Nov 14, 2012

### Simon Bridge

It is logically possible that we are giving the same label to different conscious experiences of light with the same physical properties - if I had to bet I'd say it is certain. How would I find out if my experience of what we both label "blue" is the same as, or similar to, yours? Does it actually matter? ... where differences are important is where they mess with communication eg. color-blindness. There are whole industries (eg fashion) with a basis in the subjective experience of colors - what color "goes with" which and so on. Our aesthetic sense is important for our reproductive chances even.

The range of phenomena is a whole area of study in itself and probably won't be resolved without a decent model for how consciousness works.

In physics, the word "color" ("colour") has a specialized use ... like "work" and "force".
It is usually used as a shorthand for the type of light - particularly when it comes from, or close to, the visible spectrum (we don't normally talk about the color of a gamma ray). Photons are properly characterized by their energy or momentum - which are related to a characteristic frequency. As you advance through your course you'll find yourself using "color" less and less in physics ... then you learn about quarks.

Last edited: Nov 14, 2012
17. Nov 14, 2012

### Hetware

For macroscopic (bigger than quantum) effects, you are better off sticking with the wave concept of light (electromagnetic radiation). In that realm, "brightness" can mean what a little girl means when she says a yellow balloon is "brighter" than a blue balloon. That is subjective, and of little value to the physicist, though it may be of extraordinary value to the cognitive neurophysiologist. Let's call it qualitative.

The idea of brightness can also have a more quantitative meaning. That is, how much energy is passing through some (conceptual) surface area in a specified amount of time. That's basically "power", or energy per time.

It boils down to how "hot" the source is. Except some sources can cheat.

Not that abstraction is a bad thing. Physics and mathematics wouldn't exist without it.

Last edited: Nov 14, 2012
18. Nov 15, 2012

### sophiecentaur

How do you intend to "Break a photon into three parts"? Before you use the term 'Photon' you should understand what it represents. It is the smallest amount of energy which can be carried by a given frequency of EM. It is not 'made up' of other photons.
I am aware how much people are attracted to the idea of giving explanations in terms of photons but this is a great example where it is not appropriate and the explanation just falls on its face.
Stick to waves, wavelength, power and all the other classical ideas where they are appropriate here. They are moire than adequate for this sort of discussion. Avoid Photons until you have a proper idea of what they are considered to be by Physicists.
The way that our colour vision (three colour analysis) works is pretty well established and 'personal' interpretations can seriously damage the understanding of newcomers to the subject.

19. Nov 15, 2012

### Simon Bridge

OK - I should signal better when I'm not being technically correct.

I actually didn't need to refer to a particular model for the point I was trying to make.

Well called - I was too focussed on the point I was trying to make and slipped up elsewhere.
I should have talked about the receptor's response to the incoming light (simpler to have the three "signals" than a continuous frequency response) and left it at that. I was trying to convey a sense of the increased simplicity of this method, and it does not help if I use complex and technical-sounding language.

Perhaps you can show me how I could have made the same points better?

---------------

Aside: parametric down conversion of photons is often described by physicists as "splitting a photon in half".
Also see Hübel, H. et al., "Direct generation of photon triplets using cascaded photon-pair sources", Nature 466, 601–603 (29 July 2010)
... for more on how physicists understand "photon splitting". It is just not the process that happens in the eye.

Last edited: Nov 15, 2012
20. Nov 15, 2012

### tris_d

http://www.cv.nrao.edu/course/astr534/Brightness.html

The above is a link to an article that talks about exactly the things I want to know; it's just that some parts of it do not seem to quite fit with what I can read everywhere else. Here are some statements that do not seem to agree:

1.) The number of photons falling on the film per unit area per unit time per unit solid angle does not depend on the distance between the source and the observer.

Are they talking about intensity? Should not the number of photons per unit area per unit time drop off with the square of the distance?

--//--

2.) Thus we distinguish between the brightness of the Sun, which does not depend on distance, and the apparent flux, which does.

Now they say flux depends on the distance, but is flux not the number of photons per unit area per unit time, which they just previously said does not depend on the distance?

--//--

3.) Brightness is independent of distance. Brightness is the same at the source and at the detector.

I guess this is true if the light source is not a point source?

--//--

4.) If a source is unresolved, meaning that it is much smaller in angular size than the point-source response of the eye or telescope observing it, its flux density can be measured but its spectral brightness cannot.

What in the world did they just say here?

Wikipedia says:
http://en.wikipedia.org/wiki/Apparent_brightness
- Note that brightness varies with distance; an extremely bright object may appear quite dim, if it is far away. Brightness varies inversely with the square of the distance.

So when I try to put everything together my interpretation is this: If light source is "resolved", that is when its focused projection has angular size greater than point source, then brightness does NOT fall off with the distance. But when the light source gets so far away that its focused projection covers no more than one pixel on the image, then it becomes "point source" or "unresolved", and then the inverse square law starts to apply in such way that the brightness DOES drop off with the square of any further distance from that point on.
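That reading can be checked numerically: flux obeys the inverse-square law, while surface brightness (flux per unit solid angle) does not, because the source's apparent solid angle also shrinks as 1/d². A sketch for a resolved, uniform source (all numbers illustrative):

```python
import math

luminosity = 1.0  # total power of the source (arbitrary units)
radius = 1.0      # physical radius of the source

for d in (10.0, 20.0, 40.0):
    flux = luminosity / (4 * math.pi * d**2)  # falls off as 1/d^2
    solid_angle = math.pi * (radius / d)**2   # small-angle approximation
    surface_brightness = flux / solid_angle   # independent of d
    print(f"d={d:5.1f}  flux={flux:.6f}  brightness={surface_brightness:.6f}")
# Flux drops by 4x each time d doubles, but surface brightness stays constant --
# until the source becomes unresolved, after which only the flux can be measured.
```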

--//--

5.) If a source is much larger than the point-source response, its spectral brightness at any position on the source can be measured directly, but its flux density must be calculated by integrating the observed spectral brightnesses over the source solid angle.

What is "spectral brightness" and how is it different to just "brightness"?

--//--

6.) The specific intensity or brightness is an intrinsic property of a source, while the flux density of a source also depends on the distance between the source and the observer.

How can intensity and brightness be an intrinsic property of a source? Are intensity and flux not one and the same thing? -- I'd say intensity is a property of the emitted light rather than a property of the light source, and that brightness is a property of an image, rather than a property of either the emitted light or the light source itself.