Investigating a 3D Color Hue Phenomenon with Blender and Just Color Picker

AI Thread Summary
A phenomenon has been observed in 3D modeling using Blender and Just Color Picker, where pure primary and secondary colors maintain consistent hue measurements across surfaces regardless of lighting conditions. In contrast, intermediate colors exhibit hue shifts towards the nearest primary color in darker areas, demonstrating a measurable rotation in hue due to variable illumination. This effect raises questions about existing mathematical models that could predict hue shifts based on illumination intensity. The discussion also touches on the non-linear response of color filters in the human eye and gamma correction issues in color systems. Understanding these dynamics could enhance color accuracy in 3D graphics and modeling.
mikejm
I have been doing some 3D computer graphics modelling and I have observed a phenomenon I cannot find anything about. I am using Blender (free 3D program). I am also using "Just Color Picker", a free Windows tool that lets you check the hue/saturation/value (HSV) of any color on your screen.

The phenomenon I have observed is summarized as:
  • Objects colored with pure primary or secondary colors (red, blue, green, or cyan (blue+green), magenta (red+blue), yellow (red+green)), when sampled, demonstrate the same measured hue all over their surfaces, irrespective of lighting conditions.
  • Objects whose colors have hues in between the primary and secondary colors exhibit a shift of their hue towards the nearest primary color in darker-lit regions.
In HSV, the hues are described by degrees such that:
  • red = 0
  • yellow = 60
  • green = 120
  • cyan = 180
  • blue = 240
  • magenta = 300
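For reference (this is the standard HSV conversion, not anything specific to Just Color Picker), the hue is computed from the R, G, B values the picker reads off the screen as:

$$
H = 60^\circ \times
\begin{cases}
\left(\dfrac{G-B}{\max-\min}\right) \bmod 6, & \text{if } \max = R\\[1.5ex]
\dfrac{B-R}{\max-\min} + 2, & \text{if } \max = G\\[1.5ex]
\dfrac{R-G}{\max-\min} + 4, & \text{if } \max = B
\end{cases}
$$

where max and min are the largest and smallest of the three channel values. Note that the hue depends only on the relative proportions of R, G and B, so scaling all three channels by the same factor leaves it unchanged; that is what makes the shifts below surprising at first sight.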
Here are examples of a primary (red) and a secondary (cyan) color; sampled anywhere on their surfaces, they return hue 0 for red and 180 for cyan, as expected:

[Image: red sphere.jpg]
[Image: cyan pure.jpg]

Here, by contrast, is an intermediate color between blue and cyan, where we can see that the highlights measure around 202 degrees (towards cyan) and the dark regions around 211 degrees (towards blue):

[Image: blue sphere.jpg]

Here is a more deliberately dramatic case, where the sphere measures up to 315 degrees at the highlights (towards magenta) and 341 degrees in the dark zone (towards red), giving a 26-degree hue rotation purely from the variation in illumination:

[Image: purple red.jpg]
Or here we can see and measure the sphere clearly tilting towards cyan at the highlights and towards green in the lowlights, when the color lies between these primary/secondary points:

[Image: green cyan.jpg]

Or other examples here, tilting from cyan towards blue in the dark regions, or from magenta towards blue in the dark regions:
[Image: blue cyan.jpg] [Image: blue purple.jpg]

These objects are being lit with pure white light in the simulation. I am wondering if anyone is familiar with the nature of this phenomenon, or if there are any existing mathematical models that describe it, i.e. models that can predict how the measured hue of a given colored object will shift from its actual color towards the nearest primary color as the illumination intensity, or its value, is decreased.

It is a strange phenomenon and hard to wrap my head around. Thanks for any help.
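For what it's worth, the effect can be reproduced numerically with a minimal sketch, assuming the renderer works in linear light and that what the colour picker reads off the screen has been through a per-channel sRGB encode (this matches Blender's 'Standard' view transform; the default Filmic/AgX transforms are more complex but are also non-linear). Scaling a linear colour by an illumination factor and then encoding each channel changes the ratio of the non-zero channels, and with it the reported hue, except when the channels are zero or equal, which is exactly the primary/secondary case:

import colorsys

def srgb_encode(c: float) -> float:
    """Piecewise sRGB transfer curve applied to one linear channel in [0, 1]."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def picker_hue(linear_rgb, illumination):
    """Hue (degrees) a screen colour picker would report for a linear RGB
    colour lit by a given illumination factor, assuming sRGB display encoding."""
    encoded = [srgb_encode(min(1.0, ch * illumination)) for ch in linear_rgb]
    return 360.0 * colorsys.rgb_to_hsv(*encoded)[0]

cyan      = (0.0, 1.0, 1.0)   # pure secondary: two equal channels
blue_cyan = (0.0, 0.4, 1.0)   # intermediate hue between cyan (180) and blue (240)

for k in (1.0, 0.3, 0.05, 0.01):
    print(f"illumination {k:5.2f}:  cyan hue {picker_hue(cyan, k):5.1f}   "
          f"blue-cyan hue {picker_hue(blue_cyan, k):5.1f}")

With these inputs the cyan sphere reports 180 degrees at every level, while the intermediate colour drifts from roughly 200 degrees in the bright samples towards about 210 degrees in the darkest ones, i.e. towards blue, which is the same direction and roughly the same size as the measurements above.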
 
The eye may be considered as having three rather flat band-pass filters centred on R, G, and B. Non-primary colours excite two filters; secondary colours excite two filters equally. The electronic filters may have a non-linear amplitude response, known as gamma, so that, for instance, output equals input to the power of 0.7. In this situation the two components do not track over a range of intensities.
Gamma arose originally from the non-linear characteristic of the cathode ray tube: anode current is approximately proportional to grid voltage to the power of 1.5. It became customary for the camera to insert gamma correction by using an exponent of 1/gamma, such as 0.7. It was known that, for all colour TV systems, gamma correction produced a problem, in that R + G + B do not necessarily add up to 1.
Alternatively, for a hue slightly away from these wavelengths, maybe the weaker component falls below the noise in one of the filters in low intensity areas of the image.
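One small numerical check of this (again assuming the picker is reading sRGB-encoded values): a pure power law such as output = input^0.7 raises both components to the same power, so their ratio, and hence the HSV hue, is the same at every intensity; it only offsets the hue by a fixed amount from its linear-light value. The intensity-dependent drift appears once the curve is not a pure power law, e.g. the piecewise sRGB curve with its linear toe and 0.055 offset, a filmic tone curve, or clipping at the highlights:

def power_encode(c: float, gamma: float = 0.7) -> float:
    """Pure power-law 'gamma correction' with no offset and no linear toe."""
    return c ** gamma

def srgb_encode(c: float) -> float:
    """Piecewise sRGB transfer curve (linear toe below 0.0031308)."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Intermediate colour with G at 40 % of B in linear light, at three light levels.
for k in (1.0, 0.1, 0.01):
    pure = power_encode(0.4 * k) / power_encode(1.0 * k)   # constant at 0.4**0.7
    srgb = srgb_encode(0.4 * k) / srgb_encode(1.0 * k)     # drifts as k falls
    print(f"illumination {k:4}:  pure-power G/B = {pure:.3f}   sRGB G/B = {srgb:.3f}")

The pure-power ratio stays at 0.4^0.7, about 0.53, regardless of intensity, while the sRGB ratio falls from about 0.67 towards 0.5 as the level drops, pulling the measured hue towards the stronger (blue) component, i.e. towards the nearest primary.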
 
tech99 said:
Non-primary colours excite two filters.
In fact, all three sensors are involved in analysing 'most' colours.
This link shows a spectral sensitivity plot of the three analyses in the eye. Note how the long-wavelength (R) response has a secondary peak at the blue end of the spectrum as well as its main peak in the yellow-red region; similarly for the G analysis. The sensitivity in the red part of the spectrum gives great colour discrimination in 'skin tones', where for instance the G and B contributions indicate in detail the desaturated 'pinks and browns' of human skin pigment.
There is the classic metamerism between spectral Sodium yellow and the right mix of R and G phosphors in a TV display but it happens for all colours.
 