On Mixing Colors of Light

  • Thread starter: Charles Link
  • Tags: Color, Prism
AI Thread Summary
The discussion explores the principles of color mixing using light, specifically how combining green (550 nm) and red (650 nm) light can create the appearance of yellow (600 nm) light, despite the actual wavelengths remaining unchanged. A thought experiment involving a prism spectrometer illustrates that the perceived yellow light is a result of the mixture rather than a new wavelength. The conversation also touches on the limitations of human color perception and the differences between spectral and perceived colors, emphasizing that the eye does not function as a spectrometer. Additionally, there is a mention of using LED technology in screens to generate colors and the complexities of color vision. Overall, the discussion highlights the nuances of color perception and the physics behind light mixing.
  • #101
sophiecentaur said:
Can you apply your personal theory to explain how we see all those non-spectral colours that lie between the straight portion of the CIE chart and the central White Point? Nothing spectral in that area.
The monochromatic sources of the visible spectrum make a horseshoe around the border of the CIE chart. We can get any point in the interior with a couple of points from this outside ring in the right combination. In the OP, I happened to pick a couple of points that lie on the straight line of the upper right portion of the outer ring, but that was completely by chance. It was only after I wrote the OP, around post 20 and after, that I figured out some of the details of the CIE chart, including the mathematics upon which it is based.

I do not think my inputs fall into the category of "personal theory". I'm simply taking what is there, and explaining it in very simple terms. Cheers. :)

Edit: and note that even broadband sources are represented by a single point on the CIE chart. With TV screens, if they can get 3 sources that have points sort of spread out around the horseshoe of the CIE chart, they can then cover anything inside the triangle that has vertices at these 3 points, so they can cover most of the interior of the CIE color map. They find the combination of a red source, a green source, and a blue source, even with each being somewhat broadband, works very well.
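To make the triangle idea concrete, here is a minimal sketch (Python, with the standard sRGB primary chromaticities used purely as example coordinates) of testing whether a chromaticity point falls inside the triangle spanned by three primaries:

```python
# Sketch: test whether a chromaticity point (x, y) lies inside the gamut
# triangle of three display primaries. The coordinates below are the sRGB
# primaries; any three points on the CIE chart work the same way.

def sign(p, a, b):
    """Signed-area test: which side of edge a->b the point p falls on."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(p, r, g, b):
    """True if chromaticity p is inside (or on an edge of) triangle r, g, b."""
    d1, d2, d3 = sign(p, r, g), sign(p, g, b), sign(p, b, r)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
print(in_gamut((0.3127, 0.3290), R, G, B))  # D65 white point: True
print(in_gamut((0.07, 0.60), R, G, B))      # saturated spectral green: False
```

Points on the horseshoe itself (the spectral colours) mostly fall outside any such triangle, which is the gamut limitation discussed above.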
 
Last edited:
  • #102
Charles Link said:
Edit: and note that even broadband sources are represented by a single point on the CIE chart.
That's the whole point of the system; it can only produce colour matches for colours that lie within the triangle of the chosen primaries, and that actually excludes all the spectral colours in any practical TV system. It excludes a huge area of colours in the region of Cyan to Blue. After passing through the system, all those colours end up in the 'right' direction from White but sit along that side of the triangle. Luckily, the eye just doesn't care too much about those colours, so the error is accepted. That's because evolution 'realised' that such colours are not in our world or, when they are present, the information is not needed.
Charles Link said:
even with each being somewhat broadband,
Exactly. If the analysis were not broadband, with contributions from all spectral content, then most things would be invisible.
Charles Link said:
I do not think my inputs fall into the category of "personal theory".
Sorry. "Theory" was the wrong word. The word I was looking for is "agenda". You seem preoccupied with the spectral content of a 'colour'; you keep bringing that up all the time - you even seem to think I agree with that idea. The eye is not a spectrometer; it's only aware of 'colour'.

Many birds and insects have colours which are not all due to pigments but use interference filtering. They are 'startling' and grab our attention because they are not common.
 
  • #103
sophiecentaur said:
"Theory" was the wrong word. The word I was looking for is "agenda".
I did find it interesting to read about the mathematics of the CIE color chart from the Wiki article, and I thought I had a couple of useful things that could add to their write-up, (such as that the color coordinates (x,y,z) are where the vector ## (X, Y, Z) ## crosses the plane ## x+y+z=1##), but in any case the audience seems to be so limited that I may be just as well off doing calculations for myself in my neighborhood Starbucks. These days I find there are very few people I meet there who even know how to work the Pythagorean theorem for the simple case of 5, 12, and 13. I still think my OP is a post that is worth reading, but the coffee is still good at Starbucks, as well as the people-watching, even if I can't find very many, if any, who have even a little interest in the things my generation learned when we were in school many years ago. Cheers. :)
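A two-line sketch (Python, just for illustration) of that normalization: the chromaticity coordinates are the tristimulus values scaled so that they land on the plane ##x+y+z=1##.

```python
# The chromaticity coordinates (x, y, z) are the tristimulus values
# (X, Y, Z) divided by their sum, so the result always lies on the
# plane x + y + z = 1.

def chromaticity(X, Y, Z):
    s = X + Y + Z
    return (X / s, Y / s, Z / s)

x, y, z = chromaticity(25.0, 25.0, 50.0)  # made-up tristimulus values
print(x, y, z)    # 0.25 0.25 0.5
print(x + y + z)  # 1.0 -- always on the plane
```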
 
  • #104
Charles Link said:
I did find it interesting to read about the mathematics of the CIE color chart from the Wiki article, and I thought I had a couple of useful things that could add to their write-up
According to https://en.wikipedia.org/wiki/Help:Editing anyone can edit a Wikipedia topic. Why don't you concentrate on updating their CIE entry and thereby help a much broader audience?
 
  • #105
renormalize said:
According to https://en.wikipedia.org/wiki/Help:Editing anyone can edit a Wikipedia topic. Why don't you concentrate on updating their CIE entry and thereby help a much broader audience?
Sounds good, but I really don't have a lot of interest in making an anonymous edit. I thought I would have a much wider audience on Physics Forums than I did; we had one a few years ago, but it seems for some reason the search engines aren't steering people to the Physics Forums posts nearly as much as they did previously. My plan is to continue to stick with Physics Forums for better or worse. Cheers. :)
 
  • #106
Charles Link said:
Sounds good, but I really don't have a lot of interest in making an anonymous edit. I thought I would have a much wider audience on Physics Forums
It doesn't have to be anonymous.

Still, it's strange that you would turn your nose up at contributing to the quintessential Library of the Internet, where you will reach countless people who have come there specifically out of interest in the very topic you're writing about.
 
  • Like
Likes renormalize
  • #107
To respond to the above: I was glad I was able to figure out what the CIE chart is all about, but from what I can tell from the limited positive feedback that my explanation received on Physics Forums, it would be somewhat pointless to put much effort into posting it in other places. This forum IMO should be the one where people would have the most interest in hearing what may be another way to look at something. I'm presently retired; I've seen the very competitive world for many years at both the university and in the workplace. The coffee was good again today at Starbucks. Cheers. :)
 
  • #108
DaveC426913 said:
This is actually a different phenomenon completely.

Your brother's night vision comes mostly from activated rod photoreceptors - which do not detect colour. The cone photoreceptors - which do detect colour - are not activated in dim light.
He is seeing in black, white and shades of grey.

His camera does not suffer from this affliction and sees colour at low illuminations no problem.
There is something else going on with digital cameras - they can capture light outside what humans can see, but then display that light as visible light. You can see this if you point your phone camera at a TV remote and push some buttons on the remote. In the phone display, you can see the LEDs light up, but if you look at it directly, nothing is apparent.

I also notice if I am photographing or videoing with my camera, the blues tend to look more purple when viewed back.

I am not sure how much of this phenomenon plays into the Northern Lights effect that you described.
 
  • Like
Likes Charles Link
  • #109
scottdave said:
In the phone display, you can see the LEDs light up, but if you look at it directly, nothing is apparent.
Regular CMOS sensors have an extended response into the IR region. There is a filter placed on top of the camera sensor which brings the sensitivity range more into line with the visual red spectrum. It's a fairly easy job to modify a regular DSLR for astrophotography by removing that filter. You can then put filters of choice in front of the sensor. That gives you more flexibility and selectivity for bringing features out in your images. Astrophotographs seldom show the 'true' colours, but that's OK because your eye is not capable of seeing colours accurately, or at all, out there, because at that light level it's only the rods that work. A solid state sensor can be left to cook for many minutes, and multiple images can be stacked with software. Also, the colour palette that's normally used for astro images is not tied to human sight but chosen to enhance various features which show the presence of elements in the different structures out there.
 
  • Like
Likes scottdave and Gleb1964
  • #110
scottdave said:
the blues tend to look more purple when viewed back.
It's usually possible to adjust colour balance to get rid of that common problem, either to a permanent setting or for individual images.
 
  • #111
I would like to make a couple of further comments regarding the mixing of colors of light. My comments here are made more at the beginner level: someone who already has an in-depth understanding might find my inputs perhaps even somewhat boring.

When mixing red, green, and blue light together, the result in general is white light. In the CIE color map of post 16, I'm a little surprised that I don't see a larger section of white in the middle of the color map, but then again, I think the map of post 16, at least what I get on my computer screen, does not have very accurate colors in at least a couple of places.

To me it is somewhat remarkable that these 3 colors can generate what we see as white light, even as remarkable as red and green being able to generate what we perceive as yellow. This may be obvious, but I will state it in any case: if we put the white light into the prism of the OP, we get the colors of the rainbow. If we combine the colors of the rainbow onto a white sheet of paper, we will see a bright white. If we send this white light into a second prism, we once again get the colors of the rainbow. Thereby, the "white" isn't really a color, i.e. it is not light of a single wavelength, like red, green, yellow, or blue can be.

Once again, my inputs here are basically for the beginner, and also for the sake of completeness. It may be somewhat obvious, but if you don't state the obvious, at times it can become the oblivious. Perhaps a viewer or two will find my latest input here of some benefit.
 
  • #112
Pure spectral colours are placed on the outside line of the CIE color map (excluding the straight purple line connecting blue and red, which is a combination of blue and red). All other colours that are inside the outer line represent a mixture of pure spectral colours. White colour in that respect is no different from the others.
 
Last edited:
  • Like
Likes Charles Link
  • #113
Charles Link said:
Thereby, the "white" isn't really a color, i.e. it is not light of a single wavelength, like red, green, yellow, or blue can be.
"white" is not a specific location on the CIR chart. It is chosen ad-hoc in the specification of the illuminant that's used for the system. But it's right there in the middle of the CIE chart so how can it not be 'a colour'?
Charles Link said:
Thereby, the "white" isn't really a color, i.e. it is not light of a single wavelength, like red, green, yellow, or blue can be.
You really are locked into the idea of colours all being spectral. Is the CIE chart not crammed full of 'colours'? Are the 'colours' sitting on the Red-Blue line not really colours? By your argument you would have to say that you haven't seen a single colour since you woke up this morning, unless you were looking at the output of a spectrometer. Can you quote a passage that agrees with your view? (You would need a quality source.)
 
  • #114
What I was trying to say in the above, and I was trying to address a beginner audience, is that "white" light is not a single wavelength. The concept is simple enough that I think the more advanced should be able to see what I was referring to, and could perhaps say it in a better way, rather than putting a lot of effort into finding a lot of fault with the statement. I did think it was necessary for completeness to mention white light in this thread.

The concept of monochromatic light (single-wavelength sources of some ## \lambda ## within a narrow ## \Delta \lambda ##) is also needed early on for students. I do think I am presenting this concept in a reasonably good way, but @sophiecentaur , you may disagree. Both in college and at the workplace I did a fair amount of spectroscopy work, and for doing any kind of interference, such as from a diffraction grating or a thin-film interference filter, one works with single wavelengths in the calculations, because light of two different wavelengths, except in rare cases, does not interfere. This is also a concept that is taught early on in Optics courses.

Edit: and to say the above in another way, there is no such thing as "monochromatic" white light, where monochromatic means single wavelength. Red, blue, green, yellow, orange, and violet can all come in monochromatic form, but white light cannot. It was shown in the OP that yellow can have a monochromatic form with ## \lambda \approx 585 ## nm, or it can come as a mixture of red and green light, but white light is never of a monochromatic form. I thought this is indeed a useful concept for the beginner; I thought it should have been fairly clear what I was trying to get across to the reader.

In any case, it is worthwhile to get some feedback in an ordinary thread, rather than to try to write the topic up as an Insights article, and then find there are a couple who disagree, perhaps even strongly, with how it is presented.
 
Last edited:
  • #115
Charles Link said:
Edit: and to say the above in another way, there is no such thing as "monochromatic" white light,
Did we need 114 posts to establish that? If you are anxious to get a worthwhile message over to students then get them to distinguish between Wavelength and Colour. Colour is a totally subjective experience and wavelength refers to just one measurable spectral line.
Monochromatic is a term for single wavelength and, as I have pointed out many times, we just do not see examples of monochromatic / spectral light in everyday life. None of the colours which we can recognise and name in a normal day is monochromatic. Likewise, the term 'white light' is not precise enough; two people in two different labs are likely to be using different 'white points' unless they make a point of sharing the spec of the white source they are using.
Charles Link said:
except in rare cases,
which cases are you referring to?
Charles Link said:
Thereby, the "white" isn't really a color, i.e. it is not light of a single wavelength, like red, green, yellow, or blue can be.
If your aim is to avoid confusing students then you should try to avoid sentences like that one. What message were you trying to get across with it? Is my new blue shirt 'not' a colour because the blue is not spectral blue?
 
  • #116
sophiecentaur said:
which cases are you referring to?
Something called heterodyning. (For two different wavelengths interfering)
 
Last edited:
  • #117
sophiecentaur said:
Did we need 114 posts to establish that?
I thought some of the posts were fairly productive. This is just an extra detail that we hadn't discussed, (that white light can never be a single wavelength). For the more advanced ones it is obvious, but perhaps there are readers who would benefit by mentioning it.

On another note, things don't need to be a spectral (or monochromatic) blue to be blue, but the blue will generally be dominated by wavelengths between 425-475 nm or thereabouts. Green 500-560 nm or thereabouts, and reds 625-750 nm. White light will normally be a balanced mixture of all three. It is a step in the right direction IMO for students to start to think in terms of wavelengths. Thinking in these terms to me is even much more important than knowing some or any of the details of the CIE color map.
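A toy sketch of those bands (the cutoffs below are the approximate ranges from this post; real boundaries between named colors are fuzzy and perceptual, not hard numbers):

```python
# Rough dominant-wavelength bands, per the approximate ranges above.
# The cutoffs are illustrative only; color naming is perceptual.

def rough_color_name(wavelength_nm):
    if 425 <= wavelength_nm <= 475:
        return "blue"
    if 500 <= wavelength_nm <= 560:
        return "green"
    if 625 <= wavelength_nm <= 750:
        return "red"
    return "other"

print(rough_color_name(450))  # blue
print(rough_color_name(550))  # green
print(rough_color_name(650))  # red
```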
 
Last edited:
  • #118
Charles Link said:
(that white light can never be a single wavelength)
How could it ever be if it doesn't lie on the spectral curve?
Charles Link said:
Something called heterodyning. (For two different wavelengths interfering)

Have you the remotest idea what that's about? It is a 'thing' involving non-linear interaction of two or more signals of different frequencies but it doesn't involve "interference".
Charles Link said:
On another note, things don't need to be a spectral (or monochromatic) blue to be blue, but the blue will generally be dominated by wavelengths between 425-475 nm or thereabouts. Green 500-560 nm or thereabouts, and reds 625-750 nm.
Can you repeat that 'theory' for magenta? What wavelength is magenta, by the way?
Charles Link said:
It is a step in the right direction IMO for students to start to think in terms of wavelengths
Those poor students.
 
Last edited:
  • #119
sophiecentaur said:
Have you the remotest idea what that's about? It is a 'thing' involving non-linear interaction of two or more signals of different frequencies but it doesn't involve "interference".
With heterodyning the beat frequency (I also call that interference, besides the spatial interference patterns one gets from things like a diffraction grating) can be picked up between two optical sources closely matched in wavelength, typically somewhere in the acoustic range. It's been a while (20 years or thereabouts) since I worked with acousto-optic devices, but please give me a little credit for having seen some of the basics like Raman-Nath scattering. See https://www.rp-photonics.com/optical_heterodyne_detection.html

The write-up in this "link" tends to disagree with you: it says optical heterodyning simply uses a photodiode and not non-linear materials. I could present more of the work that I did with acousto-optics, but this doesn't need to turn into a contest of who is or isn't qualified to at least say something about the subject.
 
Last edited:
  • #120
Charles Link said:
I also call that interference,
"Intermodulation"? Why not use the right term?
Charles Link said:
The write-up in this "link" tends to disagree with you: it says
A photodiode is non-linear. That's how it works. That link is about detecting light and not changing colours. It's not relevant to this thread at all.

And what about my question about Magenta? Does that have a wavelength?
 
  • #121
sophiecentaur said:
And what about my question about Magenta? Does that have a wavelength?
I had to google that: magenta is not typically in my everyday vocabulary, but I see it is a mixture of blue and red. Magenta is something a prism spectrometer would also split up into basically a blue grouping and a red grouping. I do think it can be worthwhile for the next generation to address items like even magenta in the discussion, rather than to have the mindset that everyone must know it, since it seems to have been around for as long as I can remember. Cheers. :)
 
  • #122
sophiecentaur said:
A photodiode is non-linear. That's how it works
For EEs who work with voltage (or electric field strength), perhaps they might consider a photodiode to be non-linear, but for an optics person, the photodiode, basically counting photons, with a photocurrent proportional to the photon count (proportional to the second power of the electric field amplitude), is about as linear as you can get. I've measured their linearity over several orders of magnitude (of incident light level), and could not detect any noticeable non-linearity. That is getting off-topic, but since the topic came up, I think it is worth addressing it.
 
  • #123
Charles Link said:
For EEs who work with voltage (or electric field strength), perhaps they might consider a photodiode to be non-linear, but for an optics person, the photodiode, basically counting photons, with a photocurrent proportional to the photon count (proportional to the second power of the electric field amplitude), is about as linear as you can get. I've measured their linearity over several orders of magnitude (of incident light level), and could not detect any noticeable non-linearity. That is getting off-topic, but since the topic came up, I think it is worth addressing it.
I am confused by your exchange with @sophiecentaur. This link https://en.wikipedia.org/wiki/Optical_heterodyne_detection states:
"The comparison of the two light signals is typically accomplished by combining them in a photodiode detector, which has a response that is linear in energy, and hence quadratic in amplitude of electromagnetic field. Typically, the two light frequencies are similar enough that their difference or beat frequency produced by the detector is in the radio or microwave band that can be conveniently processed by electronic means."
So yes, a photodiode is indeed inherently linear in detecting energy (counting photons) as you say you measured, but it is inherently nonlinear (i.e., quadratic) in detecting amplitude. It is this nonlinearity that is responsible for mixing the amplitudes of optical signals with slightly differing frequencies that results in a measurable beat frequency. That's why @sophiecentaur correctly says that photodiode nonlinearity is responsible for producing the observable RF difference signal.
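A quick numerical check of that square-law statement, with made-up audio-range frequencies standing in for the optical ones (the algebra is identical at optical frequencies):

```python
import math

# A square-law (quadratic-in-amplitude) response is what generates the
# beat note: (cos w1 t + cos w2 t)^2 expands into DC (intensity) terms,
# a term at the difference frequency w1 - w2, and terms at optical rates
# that the electronics cannot follow. Verified numerically below.

f1, f2 = 1000.0, 1010.0  # Hz, illustrative only
w1, w2 = 2 * math.pi * f1, 2 * math.pi * f2

for t in (0.0001, 0.0005, 0.0013):
    detected = (math.cos(w1 * t) + math.cos(w2 * t)) ** 2
    expansion = (1.0                           # the two DC (intensity) terms
                 + math.cos((w1 - w2) * t)     # beat note at f2 - f1 = 10 Hz
                 + 0.5 * math.cos(2 * w1 * t)  # optical-rate terms,
                 + 0.5 * math.cos(2 * w2 * t)  # averaged out by the detector
                 + math.cos((w1 + w2) * t))    # sum term, likewise filtered
    assert abs(detected - expansion) < 1e-9

print("square-law expansion verified; beat term at", f2 - f1, "Hz")
```

Only the difference-frequency term survives the detector's bandwidth, which is why the beat lands in the RF range when the two optical frequencies are close.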
 
  • Like
Likes Gleb1964, davenn and sophiecentaur
  • #124
renormalize said:
I am confused by your exchange with @sophiecentaur. This link https://en.wikipedia.org/wiki/Optical_heterodyne_detection states:
"The comparison of the two light signals is typically accomplished by combining them in a photodiode detector, which has a response that is linear in energy, and hence quadratic in amplitude of electromagnetic field. Typically, the two light frequencies are similar enough that their difference or beat frequency produced by the detector is in the radio or microwave band that can be conveniently processed by electronic means."
So yes, a photodiode is indeed inherently linear in detecting energy (counting photons) as you say you measured, but it is inherently nonlinear (i.e., quadratic) in detecting amplitude. It is this nonlinearity that is responsible for mixing the amplitudes of optical signals with slightly differing frequencies that results in a measurable beat frequency. That's why @sophiecentaur correctly says that photodiode nonlinearity is responsible for producing the observable RF difference signal.
In optics, we would never consider a photodiode to be non-linear. That's a case where we are crossing areas of specialization. I was very much a spectroscopist, and almost always worked with wavelengths. He seems to have done a fair amount of work with mixing colors, (more than I have), and has probably been familiar with the CIE color map for many years, something whose details I only got very familiar with in the last 3 weeks. We seem to disagree greatly on how to address an audience that may be largely made up of beginners. I'm trying to take it all in stride.
 
  • #125
renormalize said:
Typically, the two light frequencies are similar enough that their difference or beat frequency produced by the detector is in the radio or microwave band that can be conveniently processed by electronic means."
Exactly. The intermodulation product is not at a frequency of visible light but a beat frequency in the microwave region. The mechanism by which two different coloured light sources can give the perception of another different coloured source is entirely because of the analysis curves of the eye and the way the three signals are processed in the brain. The photodiode bit is a complete red herring.

@Charles Link has still not acknowledged the absolute distinction between spectrum and colour.
 
  • #126
The difference in frequency is often created by an acousto-optic modulator. That allows for the original reference frequency along with one upshifted or down-shifted by the acoustic frequency. The two beams are combined onto a photodiode, and it is really a question of semantics; I know the photodiode as a linear device.
 
  • Skeptical
Likes sophiecentaur
  • #127
Charles Link said:
We seem to disagree greatly on how to address an audience that may be largely made up of beginners.
I can't think of an 'audience' for this topic except students of a specialist field in Colour reproduction systems. Anyone delivering useful lectures to such an audience would be a specialist in the field and I can't see why Charles Link keeps talking in terms of presenting his ideas to 'beginners'. It could only confuse or baffle an audience of straightforward Physics or Engineering students or possibly Broadcast Engineers.
 
  • #128
sophiecentaur said:
I can't think of an 'audience' for this topic
From what I can tell, we don't have much of an audience in any case. We are arguing about some details that could largely be due to me being very much a spectroscopist versus someone who may have had much experience in the color television industry. We can thank our lucky stars, though, that we can both see well enough to be able to argue about what we see in a color scene. I have known at least a couple of people who were not able to see, and life for them was very challenging. One was bright enough, though, that she was able to tutor calculus. (She went blind around the age of 24, and at 60 she was still tutoring calculus to high school students.) I thought I made some worthwhile posts throughout the thread, but perhaps I am mistaken. In any case, I am a retired spectroscopist, and that was my major area of specialization from early on. Cheers. :)
 
  • #129
sophiecentaur said:
I can't think of an 'audience' for this topic except students of a specialist field in Colour reproduction systems. Anyone delivering useful lectures to such an audience would be a specialist in the field and I can't see why Charles Link keeps talking in terms of presenting his ideas to 'beginners'. It could only confuse or baffle an audience of straightforward Physics or Engineering students or possibly Broadcast Engineers.
Which is why Wiki would be the perfect place to store this. Anyone at any level can avail themselves of this knowledge as they see fit. That's what it's for.

The OP has said he prefers to discuss with PF denizens, because they might have a commensurate level of interest and knowledge - yet in the same breath, he is saying he's targeting it to beginners.
 
  • Like
Likes renormalize
  • #130
DaveC426913 said:
The OP has said he prefers to discuss with PF denizens, because they might have a commensurate level of interest and knowledge - yet in the same breath, he is saying he's targeting it to beginners.
I'm glad I posted what I did, but I'm not getting very much constructive feedback. Perhaps the posts (mine) are lacking in good quality content, so I may be asking for too much. In any case, it's going to be 20 degrees here in Chicago for the next week or two or more, but at least the coffee is good at Starbucks. Cheers. :)
 
  • #131
Charles Link said:
I know the photodiode as a linear device.
Yes, to intensity, but not to amplitude, which is why photodiodes can be used for optical heterodyning. Do you understand and acknowledge the distinction?
 
  • Like
Likes sophiecentaur and davenn
  • #132
renormalize said:
Yes, to intensity, but not to amplitude, which is why photodiodes can be used for optical heterodyning. Do you understand and acknowledge the distinction?
Of course. Many RF engineers work with electric field amplitudes, but in Optics we are working with photons, i.e. energy. I'm well familiar with the classical E&M, and in Optics we simplify the formula/units and say intensity ## I=n E^2 ##, where ## n ## is the index of refraction and ## E ## is the electric field amplitude.

Meanwhile the photodiode response is typically specified in amperes (photocurrent) per watt, which is the same as Coulombs/joule, and the response is wavelength dependent, largely because the photon has energy inversely proportional to wavelength (##E_p=hc/ \lambda ##). Silicon photodiodes are used very much in the visible and cut off in the near infrared at a wavelength around 1.1 microns. Typically a photodiode (the older, larger ones) receives a couple of microwatts of incident light that results in a couple of microamps of current. Nowadays in cell-phone cameras I think they are working at far lower levels of power and photocurrent.
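A minimal sketch of that bookkeeping (assuming an idealized quantum efficiency of 1.0, i.e. one electron per photon; real devices are somewhat lower):

```python
# Responsivity in A/W follows directly from the photon energy
# E_p = h c / lambda: fewer joules per photon at long wavelengths
# means more photons (and hence more electrons) per watt.

h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s
q = 1.602e-19  # electron charge, C

def responsivity(wavelength_m, quantum_efficiency=1.0):
    """Photocurrent per optical watt, assuming one electron per absorbed photon."""
    photon_energy = h * c / wavelength_m           # joules per photon
    return quantum_efficiency * q / photon_energy  # (C/photon)/(J/photon) = A/W

print(responsivity(650e-9))  # ~0.52 A/W at 650 nm
print(responsivity(400e-9))  # shorter wavelength -> fewer photons per watt -> lower A/W
```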
 
Last edited:
  • Like
Likes renormalize
  • #133
renormalize said:
Yes, to intensity, but not to amplitude, which is why photodiodes can be used for optical heterodyning. Do you understand and acknowledge the distinction?
There are many examples of devices with a non-linear response, but the design criteria will not include very low loss at optical frequencies. A photodiode (with very low loss) would presumably be great at counting photons. That same low loss (quantum efficiency) will make it good in the heterodyning mode for detecting and analysing low levels of light. Two birds with one stone.
 
  • #134
@sophiecentaur I would like to respond further to your post 118. I do think the students do benefit greatly from thinking in terms of wavelength(s), and are hardly poor if they go in that direction. I also would encourage them to not only have an understanding of what a prism spectrometer is, but to also know some of the details of the workings of a diffraction grating type spectrometer. See https://www.physicsforums.com/insights/fundamentals-of-the-diffraction-grating-spectrometer/

Even if they do begin to think in terms of wavelength, I still think they could also benefit by learning about the CIE color coordinates and map, which are discussed in some detail on the first page of this thread, beginning around posts 16 and 20. I'm not sure where the job market is going these days, but IMO one really doesn't go wrong by making up one's mind early on to try to be as complete a physicist as possible.
 
Last edited:
  • #135
Charles Link said:
I do think the students do benefit greatly from thinking in terms of wavelength(s), and are hardly poor if they go in that direction
Students are already familiar with the quantities involved in optics and general wave theory. Of course, they think in terms of wavelength and frequency. How else?

Why would colourimetry be taught in a Physics course? Colour vision is more PsychoPhysics than Physics. We all have to teach to the test if we want good exam results.

I have suggested using a computer monitor and a simple image editing package to see (on a subjective level) how R, G and B values relate to recognisable colours. Have you tried that exercise? You wouldn't even need to get up from the chair you're sitting on to do it. GIMP is vast and it is free.
 
  • Like
Likes Charles Link
  • #136
  • #137
Charles Link said:
... and it really isn't IMO written up very well in very many places.
So write it up then. No one can do a better job than the guy who is motivated to do so.

Start with Wiki. They will automatically provide feedback.
 
  • Like
Likes sophiecentaur and Charles Link
  • #138
sophiecentaur said:
I have suggested using a computer monitor and a simple image editing package to see (on a subjective level) how R, G and B values relate to recognisable colours. Have you tried that exercise? You wouldn't even need to get up from the chair you're sitting on to do it. GIMP is vast and it is free.
See https://www.luxalight.eu/en/cie-convertor that I linked previously in post 78. These days I am working with a simple Chromebook so I am very limited with what I am able to download. I did find the program in this link useful though.

Edit: I anticipate that when you type 580 nm or 590 nm into this program and get yellow on your screen, it is actually coming from a combination of a red LED and a green LED, with (perhaps somewhat broadband) wavelengths around 650 nm and 550 nm respectively. It even gives you the color coordinates (x and y), so it is basically telling you the (approximate) proportions of red and green that it used to create the color displayed on the screen. Note that the blue coordinate is ## z = 1 - x - y ##.
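As a rough numerical illustration, here is a minimal Python sketch of two-source mixing on the CIE 1931 diagram. The chromaticities for 550 nm and 650 nm are approximate values read from a standard CIE 1931 table, and the equal weights are purely an assumption for illustration:

```python
# Hedged sketch: chromaticity of a two-source mixture on the CIE 1931 diagram.
# The (x, y) values below are approximate monochromatic chromaticities for
# 550 nm and 650 nm; treat them as illustrative, not authoritative.
green = (0.3016, 0.6923)   # ~550 nm
red   = (0.7260, 0.2740)   # ~650 nm

def mix(c1, c2, w1, w2):
    """Mixture chromaticity; w1, w2 are each source's X+Y+Z total.
    The result always lies on the straight line joining c1 and c2."""
    t = w1 / (w1 + w2)
    x = t*c1[0] + (1 - t)*c2[0]
    y = t*c1[1] + (1 - t)*c2[1]
    return x, y, 1 - x - y     # the blue coordinate is fixed by z = 1 - x - y

x, y, z = mix(green, red, 1.0, 1.0)
```

With equal weights the mixture lands near (0.51, 0.48), in the yellow-orange part of the chart, consistent with the red-plus-green yellow discussed in this thread.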
 
Last edited:
  • #139
Charles Link said:
These days I am working with a simple Chromebook so I am very limited with what I am able to download.
GIMP is available on a Chromebook. Worth the effort to find the colour picker or the colour 'dropper', whichever you can find. Load a CIE chart into GIMP and play away.
 
  • Like
Likes Charles Link
  • #140
sophiecentaur said:
Worth the effort to find the colour picker or the colour 'dropper',
I found the color picker. I'm still working on it though to get color coordinates etc. The software is not self-explanatory.
 
  • #141
Charles Link said:
I found the color picker. I'm still working on it though to get color coordinates etc. The software is not self-explanatory.
It was a long time ago that I used GIMP, but I seem to remember a little 'dropper' icon which breaks out a magnified sample of the image at the mouse position and shows the RGB values of the central pixel. A CIE chart is interesting to scan over with the dropper.
 
  • Like
Likes Charles Link
  • #142
sophiecentaur said:
It was a long time ago that I used GIMP, but I seem to remember a little 'dropper' icon which breaks out a magnified sample of the image at the mouse position and shows the RGB values of the central pixel. A CIE chart is interesting to scan over with the dropper.
I'm starting to figure it out, thanks. If you work in RGB, each channel is numbered 0 to 255, and as you change those values, the other values in the display, like CMY etc., change along with them.

See https://fixthephoto.com/online-gimp-editor.html
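For what it's worth, the CMY numbers a picker displays typically follow the simple complement relation sketched below, with R, G, B on the 0-255 scale mentioned above. This is only a hedged sketch of the usual textbook relation; GIMP's actual conversion may involve color management and differ in detail:

```python
# Simple textbook RGB -> CMY complement (an assumption about what the
# picker shows; real converters may apply color management on top).
def rgb_to_cmy(r, g, b):
    """CMY as fractions 0-1: each ink value is the complement of the light value."""
    return (1 - r/255, 1 - g/255, 1 - b/255)

print(rgb_to_cmy(255, 255, 0))  # pure yellow -> (0.0, 0.0, 1.0)
```

This makes it easy to see why, as you slide one RGB channel, exactly one CMY number moves in the opposite direction.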
 
Last edited:
  • #143
Charles Link said:
but as you change those, it changes the other values in the display
What does that mean? As you move between areas on the CIE chart, why shouldn't more than one value change?

Did you notice that there are very few objects in a scene with one of RGB being near zero? Saturated colours are rare in everyday scenes.
 
Last edited:
  • Like
Likes Charles Link
  • #145
I've gotten very little feedback on the section on page 3, posts 75-78 (just one person responded there), but I think a couple of readers might find that section of interest. Using the software in the link in post 75, it is easy to show that my original post, the OP, is indeed what you can get, with the one correction being that the yellow is from around 580-590 nm rather than at 600 nm.
 
  • #146
Charles Link said:
a couple readers might find that section of interest
Did you consider that what you are saying may not be as 'right' as you think it is? After a lot of posts between you and me, you still don't seem to have taken on board what I have been saying. You just seem to ignore very important points that I have made. Your terminology and basics are still the same as when you started this thread. You have to allow yourself to have your ideas changed, but I get the feeling that you just don't want to be wrong. Just 'bending' what you read here to fit your ideas won't get you anywhere.
 
  • #147
sophiecentaur said:
Did you consider that what you are saying may not be as 'right' as you think it is? After a lot of posts between you and me, you still don't seem to have taken on board what I have been saying. You just seem to ignore very important points that I have made. Your terminology and basics are still the same as when you started this thread. You have to allow yourself to have your ideas changed, but I get the feeling that you just don't want to be wrong. Just 'bending' what you read here to fit your ideas won't get you anywhere.
It is certainly possible it isn't completely correct. The CIE coordinates and their vector space assume linearity of the human response, and in that sense it isn't a perfect model. Using the CIE map, though, what I proposed in the OP and post 75 is in agreement.

Perhaps I would do well to move on to some other topic though. The E&M with its vector calculus and things like magnetostatics might be worth revisiting, but that will have to be in some other new thread, and there is always the possibility that I erred in the computation of the addition of a couple of simple vectors (in posts 75-78). But do you know that ## \nabla ( \vec{a} \cdot \vec{b})=(\vec{a} \cdot \nabla) \vec{b}+(\vec{b} \cdot \nabla) \vec{a} +\vec{a} \times (\nabla \times \vec{b})+\vec{b} \times (\nabla \times \vec{a}) ##, a vector identity that can be useful in some E&M work? I don't know everything either, but I do try to make the posts somewhat interesting. Hopefully I didn't bore you too much. Cheers. :)
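Incidentally, the identity can be spot-checked numerically. The sketch below uses central finite differences on two smooth vector fields; the fields ## \vec{a} ##, ## \vec{b} ## and the test point are my own arbitrary choices, purely for illustration:

```python
# Numerical spot-check of the identity
#   grad(a.b) = (a.grad)b + (b.grad)a + a x (curl b) + b x (curl a)
# via central finite differences. Fields and test point are arbitrary.
import math

def a(x, y, z): return (x*y, y*z, z*x)
def b(x, y, z): return (math.sin(x), y*y, x*z)

h = 1e-5                      # finite-difference step
p = (0.3, -0.7, 1.1)          # arbitrary test point

def shift(pt, i, d):
    q = list(pt); q[i] += d; return q

def partial(f, i, pt):
    """Tuple of d f_j / d x_i by central difference."""
    hi, lo = f(*shift(pt, i, h)), f(*shift(pt, i, -h))
    return tuple((u - v) / (2*h) for u, v in zip(hi, lo))

def dot(u, v): return sum(ui*vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2]-u[2]*v[1], u[2]*v[0]-u[0]*v[2], u[0]*v[1]-u[1]*v[0])

def add(*vs): return tuple(sum(c) for c in zip(*vs))

def grad_dot(pt):
    """Gradient of the scalar field a.b."""
    return tuple((dot(a(*shift(pt, i, h)), b(*shift(pt, i, h)))
                  - dot(a(*shift(pt, i, -h)), b(*shift(pt, i, -h)))) / (2*h)
                 for i in range(3))

def directional(u, f, pt):
    """(u.grad)f evaluated at pt."""
    d = [partial(f, i, pt) for i in range(3)]   # d[i][j] = d f_j / d x_i
    return tuple(sum(u[i]*d[i][j] for i in range(3)) for j in range(3))

def curl(f, pt):
    d = [partial(f, i, pt) for i in range(3)]
    return (d[1][2]-d[2][1], d[2][0]-d[0][2], d[0][1]-d[1][0])

av, bv = a(*p), b(*p)
lhs = grad_dot(p)
rhs = add(directional(av, b, p), directional(bv, a, p),
          cross(av, curl(b, p)), cross(bv, curl(a, p)))
assert all(abs(l - r) < 1e-6 for l, r in zip(lhs, rhs))
```

The two sides agree to well within the finite-difference error, as the closing assertion checks.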
 
Last edited:
  • #148
sophiecentaur said:
Something that I only recently cottoned on to is that, in the early days, camera film had really bad red sensitivity so spectral measurements had very little information about red (or IR). The universe looked very different in them thar days.
That's also why darkrooms are lit with red light, even in TV and films. The photo paper is insensitive to far red, though we can pick it up with our eyes.
 
  • #149
DaveC426913 said:
That's also why darkrooms are lit with red light, even in TV and films. The photo paper is insensitive to far red, though we can pick it up with our eyes.
Very historical. I can’t bring myself to ditch my old film cameras but it wasn’t a good medium.
 
  • #150
sophiecentaur said:
Very historical. I can’t bring myself to ditch my old film cameras but it wasn’t a good medium.
Still got my Pentax K-1000 from college.
 
  • Like
Likes davenn and sophiecentaur