Quattron quad pixel technology - From Sharp

In summary: Sharp has introduced the first four-filter LCD pixel design, which adds yellow (Y) as a new primary color to the usual RGB (red, green, blue). Sharp claims this addition expands the pixel color gamut and yields superior yellow, gold, and brass color renditions. The thread asks whether the extra primary offers anything new: if yellow lies on a straight line between red and green, it can already be mixed from the existing primaries.
  • #1
Ivan Seeking
https://www.youtube.com/watch?v=F_PT5yu976Y

Sharp has introduced the first four-filter LCD pixel design, which adds the new primary color Y (yellow) to the usual RGB (red, green, blue). Sharp claims that this addition will yield superior yellow, gold, and brass color renditions by expanding the pixel color gamut to include the yellow pixel...
http://www.lcdtvbuyingguide.com/hdtv/sharp-quadcolor.html

A gimmick, overreaching, or the new standard?
 
  • #2


I've wondered that myself, Ivan.
To some extent I can see the rationale of their technique with respect to enhancing yellow, gold, and brass colors. But I don't know; I haven't actually seen the display, which, presumably, is necessary to judge that enhanced effect.

If true, however, I can envision industry embracing/expanding this concept of going beyond the standard RGB for color combinations.
 
  • #3


Would a man in a lab coat lie to us?

In all seriousness, I can't wait to get to Best Buy to check these TVs out against the other standard RGB ones.
 
  • #4


I think if this is the "Deep Color" that I've seen hints of here and there (ever since I got my new LCD TV :D), it might be interesting. They're attempting to go beyond the 32-bit color scheme and use 48 bits. I wonder if this is the first move into that area?
 
  • #5


This link has some initial impressions:

http://hometheater.about.com/b/2010/03/24/sharp-debuts-aquos-quattron-tv-line.htm

I had a chance to view pre-production Quattron sets at this past CES. I will say that the TVs look good, with richer golds on brass instruments and sunflowers, but since the production standards for video and TV broadcasts are based on a three-primary-color (RGB) palette, I am not sure that adding to the primary color palette on the display end results in an overall better image.

It does seem that to get a bigger improvement, Sharp would also have to come out with production video cameras (or get Sony to do it) that have the yellow pixels as part of the recorded image...
 
  • #6


A typical monitor does 32-bit color (8 bits per channel, plus 8 bits of alpha), which works out to about 4 billion values, though only about 16.7 million of those are distinct colors. If the human eye can't distinguish that many anyway, what difference does it make going up to 40-bit (1 trillion) colors?

Have a look at this: http://img452.imageshack.us/img452/8343/gradient32bit3ze.png [Broken]
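
As a sanity check on the arithmetic (a minimal sketch in plain Python; note that "32-bit color" is normally 24 bits of RGB plus 8 bits of alpha, so the distinct-color count is 2^24, not 2^32):

Code:
# Distinct colors for a given number of bits per RGB channel.
def distinct_colors(bits_per_channel, channels=3):
    return 2 ** (bits_per_channel * channels)

for bpc in (8, 10, 12, 16):
    print(f"{bpc} bits/channel ({bpc * 3}-bit RGB): {distinct_colors(bpc):,} colors")

# Output:
# 8 bits/channel (24-bit RGB): 16,777,216 colors
# 10 bits/channel (30-bit RGB): 1,073,741,824 colors
# 12 bits/channel (36-bit RGB): 68,719,476,736 colors
# 16 bits/channel (48-bit RGB): 281,474,976,710,656 colors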
 
  • #7


I could be wrong here, but I think the issue is whether having a dedicated yellow pixel enhances visual quality. Instead of the RGB triad, we can now have an RGBY quad.

Instead of new cameras, I could see where display electronics could port the yellow to the dedicated pixel as opposed to allocating the entire RGB pixel array to achieve it.
Much like how "standard" red, green, and blue are done... dedicated pixels.
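
One naive way such display electronics could derive the yellow channel (a rough sketch only; Sharp's actual Quattron processing is proprietary, and the mapping below is purely illustrative):

Code:
# Route the overlapping red/green energy -- which the eye perceives as
# yellow -- to a dedicated yellow subpixel. Inputs/outputs in [0, 1].
def rgb_to_rgby(r, g, b):
    y = min(r, g)              # yellow content = overlap of red and green
    return r - y, g - y, b, y

print(rgb_to_rgby(1.0, 1.0, 0.0))   # pure yellow -> (0.0, 0.0, 0.0, 1.0)
print(rgb_to_rgby(1.0, 0.5, 0.2))   # orange      -> (0.5, 0.0, 0.2, 0.5)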

Just some thoughts...
 
  • #8


pallidin said:
I could be wrong here, but I think the issue is whether having a dedicated yellow pixel enhances visual quality. Instead of the RGB triad, we can now have an RGBY quad.
Yes. So what does "enhances visual quality" mean? Does it mean we get colors we didn't used to get? And if so, can we tell the difference between ones we've seen before and these new colors?
 
  • #9


MotoH said:
Would a man in a lab coat lie to us?
If we can't believe a Federation officer, I am joining the Klingons where honor still means something. :wink:
 
  • #10


russ_watters said:
Yes. So what does "enhances visual quality" mean? Does it mean we get colors we didn't used to get? And if so, can we tell the difference between ones we've seen before and these new colors?
It's been a while since I've looked at it, but I've heard it described as color being a convex set in the plane. Given three base colors (e.g. RGB), you can mix them to form any color in the triangle they define, but none of the ones outside of the triangle. If you add in another base, you can get new colors, rather than just a finer discretization of existing colors.
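
A minimal sketch of that idea in Python: the Rec.709/sRGB primary chromaticities below are standard values, but the two test points are just illustrative. A chromaticity is displayable only if it falls inside the triangle the primaries span.

Code:
# Gamut-as-convex-set: a 3-primary display can only show chromaticities
# inside the triangle spanned by its primaries in the CIE 1931 xy plane.
def sign(p, a, b):
    # Which side of the edge a->b the point p falls on (signed area).
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_triangle(p, a, b, c):
    d = (sign(p, a, b), sign(p, b, c), sign(p, c, a))
    return not (min(d) < 0 and max(d) > 0)   # all same side -> inside

# Rec.709 / sRGB primaries in CIE 1931 (x, y) chromaticity coordinates
RED, GREEN, BLUE = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)

# Spectral yellow (~575 nm) sits near (0.48, 0.52), just outside the
# red-green edge of the triangle; a duller yellow like (0.42, 0.45) is in.
print(in_triangle((0.48, 0.52), RED, GREEN, BLUE))   # False: out of gamut
print(in_triangle((0.42, 0.45), RED, GREEN, BLUE))   # True: mixable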
 
  • #11


Hurkyl said:
It's been a while since I've looked at it, but I've heard it described as color being a convex set in the plane. Given three base colors (e.g. RGB), you can mix them to form any color in the triangle they define, but none of the ones outside of the triangle. If you add in another base, you can get new colors, rather than just a finer discretization of existing colors.
Ok...[researches]...

Here are two articles about that concept: http://en.wikipedia.org/wiki/CIE_1931
http://en.wikipedia.org/wiki/Color_triangle

The short of it is that you are right. This surprises me, since I would have thought the RGB colors we use are a good match for the sensitivities of our eyes. In other words, why aren't the filters on an RGB grid matched to the wavelength sensitivities of our eyes' color receptors? And how far off are they? And where does this new color lie? I would think if it is on a straight line between red and green (a "true" yellow?), that would mean it wouldn't offer anything new. And why have I never heard of this problem in photography or noticed it in real life?
 
  • #12
A better example is the CIE 1931 chromaticity diagram:

[Image: CIE 1931 xy chromaticity diagram with the sRGB gamut triangle]


More colors would let you fill in more of the color space than the triangle.
You could also move the three primary colors outward (make the triangle bigger), but this means darker blue and red filters, which means more power to give the same apparent brightness.
This is why you have aRGB, sRGB, etc. - you optimize the size of the triangle to trade brightness for color fidelity.
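
To put a rough number on that trade-off, here's an illustrative sketch: the Rec.709 primaries are standard values, but the yellow coordinate is a made-up stand-in (Sharp doesn't publish its primary chromaticities), and xy-plane area is only a crude proxy for gamut size.

Code:
# Shoelace formula: how much xy-plane area a fourth primary could add.
def polygon_area(pts):
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2

R, G, B = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)  # Rec.709
Y = (0.460, 0.530)   # hypothetical yellow primary near the spectral locus

rgb = polygon_area([R, G, B])
rgby = polygon_area([R, Y, G, B])   # hull order: R, Y, G, B
print(f"RGB triangle:       {rgb:.4f}")
print(f"RGBY quadrilateral: {rgby:.4f} (+{100 * (rgby / rgb - 1):.1f}%)")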
 
  • #13


russ_watters said:
This surprises me, since I would have thought the RGB colors we use are a good match for the sensitivities of our eyes.
I can offer some rampant speculation.
  • There may have been technical issues originally (e.g., availability of cheap pigments, which frequencies cheap chemicals emitted) that have stuck around for the sake of backwards compatibility.
  • The colors are optimized for what is shown -- e.g. to give finer control over skin tones.
  • These colors were once mistakenly believed to generate all colors.

mgb_phys said:
This is why you have aRGB, sRGB, etc. - you optimize the size of the triangle to trade brightness for color fidelity.
aRGB is something else; the a specifies transparency, which specifies how a given color in an image is to be mixed with the background color.
 
  • #14


Hurkyl said:
aRGB is something else; the a specifies transparency, which specifies how a given color in an image is to be mixed with the background color.
aRGB in the sense of the AdobeRGB gamut (as opposed to HP/MS sRGB) - this is different from RGBA (A = alpha transparency).
 
  • #15
I've seen RGBA written ARGB and aRGB as well. I couldn't find any other aRGB when I wrote my post.
 
  • #16


russ_watters said:
Ok...[researches]...

Here are two articles about that concept: http://en.wikipedia.org/wiki/CIE_1931
http://en.wikipedia.org/wiki/Color_triangle

The short of it is that you are right. This surprises me, since I would have thought the RGB colors we use are a good match for the sensitivities of our eyes. In other words, why aren't the filters on an RGB grid matched to the wavelength sensitivities of our eyes' color receptors? And how far off are they? And where does this new color lie? I would think if it is on a straight line between red and green (a "true" yellow?), that would mean it wouldn't offer anything new. And why have I never heard of this problem in photography or noticed it in real life?

I can only comment from the perspective of using fluorescence microscopy. When dealing with colors such as those of fluorescent dyes, nothing on the computer screen ever fully matches what I see with my own eyes through a microscope. The best way I can describe it is that the colors are "flatter."

So, strangely enough, I can conceptualize the ability to affect the range of colors viewed by adding another color channel. Though I haven't seen any of these displays, so I don't know if I could really see the difference. What I do know is that I'm not willing to pay gobs of extra money for whatever difference it can produce, but can only hope other people will and that it will drive down the prices of currently marketed RGB displays. :biggrin:
 
  • #17
You could play around with your software and monitor settings (see http://www.normankoren.com/makingfineprints1A.html), or it could be that your dyes are outside the gamut of the monitor - i.e., it can't display them without some new dyes on the screen.

Your eyes also respond differently looking down a microscope than when viewing a monitor in a brightly lit lab.
 
  • #18
...they also respond differently looking through a telescope than through a microscope or at a monitor. Except for Mars and Jupiter, there really isn't much color depth to be seen with your eyes out in space. Most objects are too dim to stimulate your color receptors much. So when I do photography, the colors (from a CCD and an RGB display) are vastly richer than you can see with your eyes.
 
  • #19
mgb_phys said:
i.e., it can't display them without some new dyes on the screen.
Isn't that the point of this product? It's adding a new "dye"?

mgb_phys said:
Your eyes also respond differently looking down a microscope than when viewing a monitor in a brightly lit lab.
I'm aware of that, but I don't view the monitor in a brightly lit lab. When I'm working with images, I do so in a dimly lit room to better see what's on the screen.

Of course, it may also have to do with the detector rather than the monitor.

I don't know if this technology really makes a difference or not; I'm just commenting that I can envision the possibility. It's useless to look at videos on my current monitor to judge the quality this ad claims. I'd have to go to someplace where the display was being sold and see it for myself to know if I could see an improvement or not.
 
  • #20
Moonbear said:
I'd have to go to someplace where the display was being sold and see it for myself to know if I could see an improvement or not.

Ditto. The general tech makes sense, but the proof is in viewing the actual displays.
 
  • #21
This is interesting, but I find advances in power consumption and B&W contrast far more interesting on the LCD side of things. After all, sharpening the yellow is really nothing more than a juke to the side from the issue of not producing a true greyscale to black. Now, something like 3Qi's screen tech is far more interesting, and practical.
 
  • #22
Why not six colors? They fit quite well in hexagonal patterns.. :)

My printer has six colors/cartridges...
 
  • #23
mugaliens said:
Why not six colors? They fit quite well in hexagonal patterns.. :)

My printer has six colors/cartridges...

True, but why not then go for it all: 16,777,216 for 24-bit true color? Just kidding!

As you increase the color array, you also increase its size. This could easily cause annoying pixelation.
Or, if you reduce the pixel size to compensate for the increase in array size, you could lose luminosity.
Advances in technology will of course address both issues a little at a time.
 
  • #24
pallidin said:
True, but why not then go for it all: 16,777,216 for 24-bit true color? Just kidding!

Yah, ha, lol...

pallidin said:
As you increase the color array, you also increase its size.

Yet with both printer heads and LEDs, as the pixel size has continued to decrease, the color array has also decreased, to the point where my photographic printer spits out 8x10s that rival those from 25 ASA film from the sixties.

I think 25 ASA. Might be 25 DIN. All I know is that it's far sharper, and richer, than any 8x10 print from that era.

I also do much larger reproductions from my 8 MP camera. These cost much more, around $45, but when I do them right (interpolating the image and performing color corrections before output), they fetch some decent prices ($300) for that $45.

Then again, I'm a very good photographer, so I capture what people want. That's the difference.

pallidin said:
This could easily cause annoying pixelation.
Or, if you reduce the pixel size to compensate for the increase in array size, you could lose luminosity.
Advances in technology will of course address both issues a little at a time.
 
  • #25
Bottom line, I think things will ultimately find their way to a six-pixel "illuminon," along with a pattern small enough that no one will be able to discern it at a distance of 24 inches or more.

Why do I think this? Because at this limit, we have arrived!

Oh, yes, there are both black and white limits (the technical terms) as well as the "contrast ratios" (the marketing terms).

Just beware the difference between the technical terms and the marketing terms...
 
  • #26
mugaliens said:
Why not six colors? They fit quite well in hexagonal patterns.. :)

My printer has six colors/cartridges...

Just give Gillette a few months and they'll come out with a TV that has five. Schick will follow with one that has six. Then Gillette will add a pixel color to the back of the screen for watching very small shows.
 
  • #27
For what it's worth, MaximumPC has a "debunking" of Sharp's Quattron tech:

http://www.maximumpc.com/article/fe...itor_hdtv_companies_cook_their_specs?page=0,4

MaximumPC.com said:
HDTV television and movie content is produced and color-balanced on three-color displays that are accurately calibrated to Rec.709. Sharp’s fourth primary color is yellow, and there isn’t anything for it to do because yellow is already being accurately reproduced with mixtures of the existing red and green primaries. More importantly, a Quattron display can’t show colors that aren’t in the original three-color source image. So what good is it? None, unless you like to see over-exaggerated yellows.

http://www.maximumpc.com/files/u90693/9_myths_405.jpg [Broken]

MaximumPC.com said:
Note that in our figure, the outer white curve represents the limits of human vision
...
Sharp’s yellow primary would need to lie somewhere outside of the red and green leg of the color triangle. But there isn’t much room between the Rec.709 triangle and the human vision curve, is there? For this reason, it’s difficult to see why a yellow primary sub-pixel is needed unless Sharp isn’t able to put its red and green primaries where they belong.
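
That last point can be made concrete: if the yellow channel is computed from the three-color source (as in the naive RGB-to-RGBY sketch earlier in the thread), the mapping is invertible, so the display gains no chromaticity information the source didn't already carry. Illustrative Python, not Sharp's algorithm:

Code:
# A derived yellow channel adds no new information: the map inverts.
def rgb_to_rgby(r, g, b):
    y = min(r, g)
    return r - y, g - y, b, y

def rgby_to_rgb(r, g, b, y):
    return r + y, g + y, b     # recovers the original signal exactly

src = (0.875, 0.75, 0.125)     # dyadic values keep the round trip exact
assert rgby_to_rgb(*rgb_to_rgby(*src)) == src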
 
  • #28
FlexGunship said:
Just give Gillette a few months and they'll come out with a TV that has five. Schick will follow with one that has six. Then Gillette will add a pixel color to the back of the screen for watching very small shows.

Oh man. I forgot I wrote this. I read it, and was like "LMAO, who the hell wrote that?" And it was Flex! I'm losing my freakin' mind. :bugeye:

EDIT: By the way, I do some work in the printing industry, and additional colors outside of CMYK are useful in subtractive coloring (i.e., mixing ink or paint), but not in additive coloring (i.e., light).

http://www.printingforless.com/color.html
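
The additive/subtractive distinction in one line of code: for idealized inks, CMY is just the complement of RGB, and real inks fall short of that ideal, which is why print adds black (K) and sometimes extra spot colors. A minimal illustration:

Code:
# Ideal subtractive inks remove light, so CMY = 1 - RGB (values in [0, 1]).
def rgb_to_cmy(r, g, b):
    return 1 - r, 1 - g, 1 - b

print(rgb_to_cmy(1.0, 1.0, 0.0))   # yellow light -> (0.0, 0.0, 1.0): yellow ink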
 
  • #29
Ivan Seeking said:
https://www.youtube.com/watch?v=F_PT5yu976Y


http://www.lcdtvbuyingguide.com/hdtv/sharp-quadcolor.html

A gimmick, overreaching, or the new standard?

Huge gimmick!

The problem would be in the recording medium...
A professional video camera has three chips for recording color (RGB), with the highest sensitivity to green to produce a response more like the human eye. (That's why professionals prefer green screens for video: there is more information in the green channel and it produces a clearer matte.)
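
That green bias is baked into the standard luma (brightness) weights; a quick illustration in Python (the Rec.709 coefficients are standard values, everything else is just for demonstration):

Code:
# Rec.709 luma weights (Rec.601 SD uses 0.299/0.587/0.114; green
# dominates in both), reflecting the eye's peak sensitivity to green.
def luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A pure green pixel carries ~72% of full brightness; pure blue ~7% --
# one reason green screens key more cleanly than blue in video.
print(luma(0, 1, 0))   # 0.7152
print(luma(0, 0, 1))   # 0.0722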

So if the camera is not recording the yellow channel, then the TV must calculate the yellow channel from the available information (RGB), and I wouldn't expect there to be a huge difference until Panavision or another major camera manufacturer begins producing a professional 4CCD camera.

Film, on the other hand, has much greater color depth than digital, especially in the 4K range (for theatrical release), and you could see improvements there; however, most films are processed and edited digitally, and the digital capture process relies on a 3CCD system.

I think it will go the way of Betamax.
 
  • #30
Few people know it, but the technology has already gone the way of the dodo once... Panasonic tried it in tube TVs :devil:

[Image: Panasonic quad-pixel tube TV]
 
  • #31
Here's an interesting review of a European model of the Quattron; notice the calibration test:

[Image: Sharp LE925 calibration test chart]

http://www.flatpanelshd.com/review.php?subaction=showfull&id=1287569264

...and a quote from AVSForum regarding the linked review:

BobearQSI at AVSForum said:
The review measures delta E, which is deviation from the correct color. A value greater than 2 is a visible deviation from the correct color. 4-5 means the color is wrong. After the best calibration of the Sharp, the values are still mostly greater than 6. The yellow value is almost 8! This means if you put a calibrated Quattron next to a TV showing the correct yellow, you will easily see a difference in the color. And that's calibrated. The uncalibrated out-of-the-box chart was greater than 10, off the chart.
http://www.avsforum.com/avs-vb/showthread.php?t=1237271&page=6
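
For reference, the simplest version of that delta E metric (CIE76) is just Euclidean distance in CIE L*a*b* space; the review may use a newer variant such as CIEDE2000, and the sample values below are made up for illustration:

Code:
import math

# CIE76 delta E: Euclidean distance between (L*, a*, b*) triples.
# Rule of thumb from the quote: ~2 is just visible, 4-5 means wrong.
def delta_e_cie76(lab1, lab2):
    return math.dist(lab1, lab2)

reference = (97.1, -21.6, 94.5)   # illustrative target yellow in L*a*b*
measured  = (95.0, -14.0, 98.0)   # illustrative measured value
print(f"delta E = {delta_e_cie76(reference, measured):.1f}")   # ~8.6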
 
  • #32
Wow. Nice find on that TV pic. I had no idea.
 
  • #34
Hi everyone,
I saw this video. It is really amazing and gives a lot of information,
so I recommend watching it.
 
  • #35
Albern said:
Hi everyone,
I saw this video. It is really amazing and gives a lot of information,
so I recommend watching it.

What video?
 

1. What is Quattron quad pixel technology?

Quattron quad pixel technology is a display technology developed by Sharp that uses four subpixels (red, green, blue, and yellow) instead of the traditional three (red, green, and blue) to create a wider range of colors and enhance image quality.

2. How does Quattron quad pixel technology work?

Quattron quad pixel technology works by adding a yellow subpixel to the traditional RGB (red, green, blue) subpixel layout. This allows for more precise color reproduction and a wider color gamut, resulting in more realistic and vibrant images.

3. What are the benefits of Quattron quad pixel technology?

The main benefits of Quattron quad pixel technology are improved color accuracy, a wider color gamut, and enhanced image quality. Sharp also claims a more energy-efficient display, as the yellow subpixel requires less power than the traditional RGB subpixels.

4. Is Quattron quad pixel technology compatible with all devices?

No, Quattron quad pixel technology is currently only available on select Sharp TVs and monitors. It is not compatible with all devices, as it requires specific hardware and software to function properly.

5. How does Quattron quad pixel technology compare to other display technologies?

Quattron quad pixel technology is considered an improvement over traditional RGB display technologies, as it offers a wider color gamut and more realistic color reproduction. However, it may not be as advanced as newer technologies such as OLED or QLED displays.
