What are the limitations of CRT display technology? resolution, nits, ....

Summary
CRT display technology faces significant limitations, particularly in resolution and brightness. Achieving high resolutions like 3840x2160 on a 34-inch screen would require increased beam current, which could lead to issues with electron scattering and mask heating. The inherent inefficiency of CRT phosphors, converting only about 30% of energy into visible light, poses another challenge compared to OLED technology. Additionally, while using thick strontium glass could theoretically allow for higher brightness levels, it raises concerns about x-ray emission and potential image quality degradation due to glass imperfections. Overall, the complexity and engineering demands of building a CRT make it a daunting project for enthusiasts.
  • #31
sophiecentaur said:
It's a Sampling Phenomenon. In order to get the brightness as high as possible, the digital screen pixels are on for as long as possible. This introduces a sinc distortion in the spatial frequency response when an object moves. I remember it was given a name Boxcar Distortion(?). This is avoided with zero width samples. I found this link which explains what I remember of sampling theory.
Wait, are you saying the phosphor decay time doesn't matter for visible blur, and that the blur on modern displays comes purely from these wide samples? So a CRT with much longer decay times wouldn't look any blurrier? (I know they would have other problems, like longer phosphor trails.) I thought the blur had something to do with the eye interacting with light on a flat stationary surface and trying to predict the motion, but you are saying it's purely because of finite-width samples being taken?

Additionally, this article, written by the author of this website, goes into much better detail than I can on persistence blur vs. motion blur on sample-and-hold displays.
https://forums.blurbusters.com/viewtopic.php?t=177

This quote I pulled from it is why I thought the blur was tied to pixel visibility:

"As you track eyes on moving objects, your eyes are in different positions at the beginning of a refresh than at the end of a refresh. Flickerfree displays means that the static frames are blurred across your retinas. So high persistence (long frame visibility time) means there is more opportunity for the frame to be blurred across your vision. That's why you still see motion blur on 1ms and 2ms LCDs, and this is why strobing hugely helps them (make them behave more like CRT persistence)."
 
  • #32
jman5000 said:
Wait are you saying the phosphor decay time doesn't matter for visible blur and the blur on modern displays is purely from these samples they take?
jman5000 said:
So a crt with much higher decay times wouldn't look any blurrier?
I'm glad you started this conversation because it's revived my interest.

The phosphor decay time has no effect on stationary pictures. A long decay can 'leave behind' elements of the previous frame and leave streaks or blurred trailing edges on moving objects. (I said it the wrong way round in my previous post.)
sophiecentaur said:
CRT phosphors are given a short decay time to avoid blurring- overlap into the next pixel (even with stationary scenes, iirc)
The blurring is into the trailing part of a moving object. That bit about stationary scenes was nonsense: a long phosphor decay just gives a brighter stationary image.

The rectangular pixels are samples of the image. In many sample-reconstruction operations, the result of 'boxcar' distortion can be compensated by post-filtering (e.g. in audio DACs), but the eye can't do that, so it causes blurred edges when there's movement. Temporal effects are difficult to deal with before the camera sensor: we can't Nyquist pre-filter the original scene, so it would need to be heavily oversampled to avoid those sampling products.
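That boxcar rolloff has the standard zero-order-hold form; a minimal numeric sketch (the 60 Hz hold time and 30 Hz test frequency are chosen purely for illustration):

```python
import math

# Amplitude response of a zero-order hold ("boxcar") of duration T at
# frequency f: |sinc(fT)| = |sin(pi*f*T) / (pi*f*T)|. The longer the
# hold, the harder the rolloff that the eye cannot post-filter away.

def boxcar_response(f_hz: float, hold_s: float) -> float:
    x = math.pi * f_hz * hold_s
    return 1.0 if x == 0 else abs(math.sin(x) / x)

frame = 1 / 60                          # full-persistence 60 Hz hold
print(boxcar_response(30, frame))       # ~0.64: strong rolloff at 30 Hz
print(boxcar_response(30, frame / 4))   # ~0.97: a 25% duty flash rolls off far less
```

Shortening the hold (strobing, CRT-like impulses) pushes the response back toward the ideal zero-width-sample case.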

There's no essential difference between seeing a moving object on a screen and seeing the background while we track a moving object, but I am sure our brain's video processing differs between the two cases. (Tracking a moving object can make us feel unwell, for instance.)

But I have reservations about the methodology in the link, because the display can't (?) change the time the pixels are switched on for. Also, the analogue decay of a phosphor can affect more than one past pixel (in the extreme), whereas each LED pixel has no memory of its previous value. (If it does, the effect can be filtered out.)
 
  • #33
sophiecentaur said:
There's no essential difference between seeing a moving object on a screen or seeing the background when we track a moving object but I am sure that our brain video processing is different for the two cases. (tracking a moving object can make us feel unwell, for instance)
https://www.testufo.com/persistence

This actually shows how eye tracking can change how an image looks.
sophiecentaur said:
I'm glad you started this conversation because it's revived my interest.
In CRT, or just display tech in general? I feel disappointed that CRT is dead. The best monitors had a 140 kHz horizontal scan frequency, and other models could run 2300x1440 at around 80 Hz, which is beyond Full HD on a small monitor. Some models even had a 0.22 mm dot pitch.

Yeah, the size is big, but as someone who doesn't care about that, I'd rather have the motion handling than the brightness of modern displays. In a near-black room the colors still blow my really expensive LCD out of the water. I play more games than I watch video, so that motion quality matters massively when games are changing the entire screen quickly most of the time.

I wish I'd gotten into them a decade ago when people were offloading them for cheap. Nowadays all that's left are mostly lower-quality small ones. It feels like display tech downgraded, and that's not how technology is supposed to work; it isn't supposed to get worse. (Yeah, I know modern displays do some things better, like ANSI contrast.)
 
  • #34
jman5000 said:
This actually shows how eye tracking can change how an image looks.
But he doesn't do a comparison with a CRT display, and I think you may be extrapolating in that direction.
The thing is that a CRT has to use a sequential (raster) scan, and the only cleverness available is interlace, which gives better motion portrayal and reduces flicker. There's nothing to say a TV picture has to paint the pixels that way; they could be randomly accessed, with moving areas refreshed at a higher rate than still backgrounds. It's only a matter of time before this is available and standardised. But broadcast media have an enormous development lag because of the need for 'compatibility' with existing equipment in the home.
Data compression is essential when people want access to multiple channels, and the CRT, with its rigid scanning mode, would require extra image processing to be compatible with digital standards.
CRTs with big images have a certain charm, but they are expensive, take up room (as big as a grand piano!) and use loads of electrical power. They are also too dim to watch under near-daylight conditions (say, the afternoon football game from a chair out in the garden), so they don't really have a place in today's world, where digital displays have amazing resolution and the possibility of upscaling scan rates. Digital displays are almost throw-away items these days, and they tick virtually all the boxes in an engineering and commercial context.
jman5000 said:
I feel disappointed that crt is dead.
My comment about vinyl seems to have been accurate, but I'm more in love with the signal processing that squeezed R, G and B signals into a PAL-coded signal which would fit in an RF TV channel bandwidth, with a very fair monochrome version for those viewers who 'preferred black and white'.
 
  • #35
sophiecentaur said:
There's nothing to say that a TV picture has to paint the pixels in that way; they can be randomly accessed and areas with motion can have higher refresh rate than still backgrounds. It's only a matter of time before this is available and standardised.
Stuff that is filmed at 24 fps, like nearly all Hollywood film and YouTube video, doesn't gain anything by running at a higher frame rate than it was captured at. It won't matter if it's streamed at higher frame rates; it just ends up creating duplicate images on your display (which your eye literally sees as distinct duplicates; I'm not sure I made that clear before). Believe me, I've tried converting film to higher frame rates without interpolation to get rid of the duplicate images the eye perceives, and it doesn't work. I don't doubt LCDs and OLEDs could match or surpass the motion handling of CRTs one day, but who knows when that will be?
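The duplicate images follow from simple arithmetic: a fixed-rate panel has to repeat each film frame a whole number of times. A quick sketch (`repeat_pattern` is a hypothetical helper for illustration, not code from any real player):

```python
# A fixed-rate panel must repeat each 24 fps film frame a whole number
# of times; when the rates don't divide evenly the repeats alternate
# (the classic 3:2 pulldown on 60 Hz).

def repeat_pattern(refresh_hz: int, fps: int) -> list[int]:
    """How many refreshes each of one second's source frames is held for."""
    base, extra = divmod(refresh_hz, fps)
    # Spread the leftover refreshes over the first `extra` frames.
    return [base + (1 if i < extra else 0) for i in range(fps)]

print(set(repeat_pattern(120, 24)))   # {5}: every frame shown 5 times
print(set(repeat_pattern(60, 24)))    # {2, 3}: uneven repeats -> judder
```

Without interpolation, each repeat is a distinct flash of the same image, which is exactly what a tracking eye resolves as duplicates.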

Technically we already have similar motion handling on certain LCDs that use a strobing backlight, but the implementation is usually bad, creating artifacts that look even worse, and the colors end up looking so washed out and dim that it's not worth using. I have one of those LCDs, and the implementation is poor even though it was a very expensive monitor. If you are wondering what I mean by strobing, I am talking about this:
https://blurbusters.com/faq/motion-blur-reduction/

I think a large portion of these artifacts are tied to the human brain doing some work on the image it perceives, and therefore aren't a directly measurable attribute without using a human eye. Of course, you can still establish correlations to other display attributes using a human eye. Although, if you think all the blur and duplicate images are purely tied to measured quantities, I'd love to hear a more technical explanation of it beyond just human perception.

Your explanation of boxcar blur makes me wonder: is what you describe the sole factor in motion blur on sample-and-hold displays, or is there a human-perception element outside of it?
 
  • #36
jman5000 said:
Believe me I've tried converting film to higher framerates without interpolation to try and get rid of the duplicate images the eye perceives, it doesn't work.
Of course, without some form of interpolation you can't up-convert in any useful way. Way back, my department was developing the use of motion vectors for improving motion portrayal in 525/60-to-625/50 TV standards conversion. This link is an abstract of a paper I found; it's not free, but it mentions that motion portrayal can be enhanced beyond simple interpolation. If you are interested in frame-rate upconversion you might find it interesting. A search on the term 'motion vector' should take you to some useful stuff.
You will have seen slo-mo sequences in sports TV coverage, and they do much better than simple interpolation. In most respects, TV has far more potential for quality improvement than film can ever have.
 
  • #37
sophiecentaur said:
Of course, without some form of interpolation you can't up convert in any useful way.
Can you think of a hypothetical way to have a display that doesn't hold samples and works like the CRT method of immediately fading the image? Games cannot really be run at super-high frame rates, so overcoming blur with more frames isn't feasible. I wonder if FED and SED displays would have been sample-and-hold?

Obviously, if you had a high enough refresh rate you might be able to just simulate the CRT method of drawing images, but that level of display seems like sci-fi.
 
  • #38
sophiecentaur said:
Actually, the whole thing is full of flaws. @Baluncore has pointed out some of them.
Also, It's a bit late to be trying to improve the Shadow Mask design. That ship has sailed.
Would you mind pointing out some of those flaws? I'm not asking you to spend a whole afternoon thinking through all the problems, but if you immediately see something wrong with it I'd appreciate it, as now I'm curious what's wrong with it.
I already noticed it's not actually an open circuit, since it's in contact with the phosphors, but disregarding that, what else was wrong with it?
 
  • #39
I found an article linking to all the other articles on the Blur Busters website about the sources of motion blur, if anyone wants to see it. It explains things much better than I do, especially if you read many of the links. The blur I was describing how to overcome with 1000 Hz displays is the eye-tracking blur.
https://blurbusters.com/faq/lcd-motion-artifacts/

More specifically this article covers eye tracking blur: https://blurbusters.com/faq/oled-motion-blur/
 
  • #40
This thread has split into two now - motion is a second issue.
jman5000 said:
Can you think of a hypothetical way to have a display that doesn't hold samples
You can't avoid a sampled image these days, so you are stuck with samples. Before long, and if necessary, they will probably invent a solid-state display with high enough power to flash the pixels for significantly less than the frame repeat period. That would mean near-impulse samples rather than long ones.

jman5000 said:
Would you mind pointing out some of those flaws?
There is a list of them in the earlier post I mentioned from @Baluncore.

I've been thinking more about CRTs, and there are issues with the choice of a horizontal raster display: it favours horizontal motion portrayal. Your interest seems to be mostly in horizontal motion, and perhaps most games are designed with more of that (?).
jman5000 said:
I think a large portion of these artifacts are tied to human brain doing some work on the image it perceives and therefore isn't a directly measurable attribute without using a human eye.
The human eye is there all the time, of course, but it is not hard to make a mechanical display (a large rotating cylinder) as a reference moving image. Much easier than trying to produce fast-moving unimpaired images electronically. Do you fancy making one of those and having it in the corner of your games room? :wink:

I must say, I found the Blur Busters display examples very hard work to look at, and there is an enormous snag: they have no reference CRT or 'real' images.
 
  • #41
jman5000 said:
Technically we already have similar motion handling on certain lcds that use strobing backlight, but the implementation is usually bad, creating artifacts that look even worse, and the colors end up looking so washed out and dim it's not worth using.
Still, modifying them, doing it right along that direction, looks more promising to me than the resurrection of a tech now long dead.

jman5000 said:
Stuff that is filmed at 24fps, like nearly all Hollywood film and youtube videos don't gain anything by running at higher framerate than it was captured at.
Now AI is working on that, both on resolution and on fps. The results are quite surprising for a tech barely out of its cradle. It also looks like an area worthy of interest (more useful and interesting than those ch(e)atbots).
 
  • #42
sophiecentaur said:
I must say, I found the blur buster display examples were very hard work to look at and there is an enormous snag that they have no reference CRT or 'real' images.
What exactly do they need a CRT for? Are you saying you don't believe the assessment that shortening a pixel's visibility reduces the perceived motion blur? I have a 144 Hz LCD, and setting it to 60/100/144 Hz definitely lets me run that UFO test faster without blur, and it is definitely less blurry when panning around in a game. Whatever it is seems to coincide with refresh rate, so even if it isn't directly caused by refresh rate, raising it keeps lowering perceived blur.
 
  • #43
sophiecentaur said:
This thread has split into two now - motion is a second issue.
Are you saying I need to make a second thread specifically for motion? I'm probably close to done bugging you now; there's not much more I can ask. Well, that's not true, I could probably talk your ear off asking questions about CRTs, but I won't.

Also, in relation to horizontal vs. vertical motion on a CRT: does that distinction matter? I've never noticed a discrepancy between the two directions on my CRT, but then again, maybe I never see stuff move vertically.
 
  • #44
Rive said:
Still, modifying them: doing it right based on that direction looks more promising than the resurrection of a tech now long dead. [...] Now AI is working on that: both on resolution and fps. The results are quite surprising for a tech barely out of its cradle. Also looks like an area worthy of interest (more useful and interesting than those ch(e)atbots).
I hope so, but it's been two decades of having a literally worse image in motion than what we had with the high-end CRT monitors, and while we have 4K now, we still haven't caught up in motion resolution, which is kind of the whole point of a display.

I know there are theoretical ways of doing it, such as super-bright LCDs that can use strobing, but those still lack deep blacks, so now you are looking at doing it with microLED instead, and microLED isn't even available to consumers at this point.

The only other option is an OLED that gets so bright that you could insert a thick rolling black bar covering the frame for something like 75% of its duration, dimming the image but gaining motion clarity by reducing pixel visibility time. With a bright enough OLED the black bar wouldn't matter, as long as I was okay with running at nits comparable to a CRT. The current black-bar insertion in some OLEDs only manages around 600p resolution in motion and ends up even dimmer than a CRT.
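The brightness cost of that rolling bar is just the duty cycle: average luminance scales with the fraction of each frame the pixel is lit. A back-of-envelope sketch (the 150 nit target is an illustrative assumption, not a spec):

```python
# Brightness cost of black-frame / rolling-bar insertion: average
# luminance scales with the fraction of the frame the pixel is lit,
# so cutting persistence 4x demands 4x the peak brightness.

def peak_nits_required(target_avg_nits: float, duty_cycle: float) -> float:
    """Peak panel luminance needed to hold a target average at a duty cycle."""
    return target_avg_nits / duty_cycle

# Blanking 75% of each frame (25% duty), aiming for a 150 nit average:
print(peak_nits_required(150, 0.25))   # 600.0 nits of peak brightness needed
```

This is why the approach waits on brighter panels: halving persistence again doubles the peak-brightness requirement.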

I hope technology can keep improving. I don't know, though; CRT didn't keep improving ad infinitum. Who is to say we won't hit similar limitations for OLED/LCD? Do we know the limitations, and is it just a matter of developing the tools to manufacture around them?
 
  • #45
jman5000 said:
Who is to say we won't hit similar limitations for oled/ lcd
Just noticed that a solution got skipped here. I wonder if you have checked plasma displays (both their actual parameters and their potential for development) against your requirements/expectations?
 
  • #46
Rive said:
Just noticed that a solution got skipped here. I wonder if you have checked plasma displays (both actual parameters and potential for development) for your requirements/expectations?
Plasmas supposedly scaled poorly in terms of power and weren't power-efficient enough at higher resolutions to pass regulations (something I read on the internet). You are right that plasma might have been the solution, if only faster-decaying phosphors had been used; the plasmas that were sold typically used phosphors with around 4 ms persistence.
 
  • #47
jman5000 said:
weren't power efficient enough at higher resolutions to pass regulations is something I read on the internet.
...to be honest, I have some doubts that a CRT would pass those parts of the regulations at the resolutions/frequencies expected these days...

For me, I see a far higher chance of (partial) resurrection for plasma than for CRT.
 
  • #48
jman5000 said:
I've never noticed a discrepancy between the two directions on my crt
If you wave your hand across the screen side to side and then up and down, there's a noticeable difference. But I can't repeat that or other experiments, because my last CRT went away 15 years ago. There's a wealth of old papers on motion portrayal on raster-scan CRTs; just dig around. They won't have been written on a computer, though!

I have to admit that I find video games rather tiresome (but I am in a minority), so I don't really care too much about the amazing specs they need for processor and connection speeds. But I am very impressed by the eye-watering performance that's available.
jman5000 said:
an oled that gets so bright
I suspect that OLEDs are just a passing phase. A bright enough device with a short duty cycle will, I suspect, involve high voltages and high mean power consumption, which could limit applicability to non-portable products. But I watch this space.

The future will be in the minds of the developers, and they will ration out the advances to maximise profits by regulating the rate at which new stuff becomes available. I remember the idea that Kodak always had a set of 'new' films and processes to bring onto the market every time (and only after) another company brought out a 'copy' of the latest Kodak product.
 
  • #49
sophiecentaur said:
I have to admit that I find video games rather tiresome (but I am in a minority) so I don't really care too much about the amazing spec they need for processor and connection speeds. But I am very impressed at the eye watering performance that's available.
I'll admit I'm also getting tired of video games; only some that stand out catch my attention nowadays. But that's just it: we need super-high refresh rates to lower the blur from sample-and-hold, which is a fundamental flaw of it. Yes, more refreshes give a more continuous image, but the clarity of those images is bad because of the long persistence. My CRT running at 94 Hz flat out looks better in motion clarity and smoothness than my high-end 144 Hz LCD once objects exceed a certain speed threshold, which isn't hard to exceed. The clarity is better even at 60 Hz; though I'll admit the smoothness feels less than the 144 Hz LCD at that point, it still ends up looking better.

This isn't just for games; it applies to movies as well. Super-high frame rates just produce the image duplications, and movies still have the sample-and-hold blur on top. Running 24 fps movies on a higher-Hz display doesn't actually decrease the persistence, since the display duplicates the frames to fill however many Hz it runs at; to do otherwise would result in a super-dim image.

Like I said, these should be solvable problems if we can get even brighter pixels than we have now. I know there are some displays around two thousand nits nowadays, so maybe that would be bright enough, but nobody has tried using those displays in ways that improve motion clarity.
 
  • #50
jman5000 said:
Having super high framerates just produces the image duplications
Why would it involve something as basic as image duplication? I think your experience with increasing frame rates reflects processing that is not top-range. If you want to see how it can be done really well, look at slow-mo produced at the TV picture source, before transmission. It's very processor-intensive: it builds a picture for each frame using motion vectors for each pixel, and the motion portrayal is very smooth with very little blur. I'd imagine the cost of circuitry to do this in real time in a TV monitor may just be too much.
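A toy sketch of that per-pixel motion-vector idea, heavily simplified to one dimension and with the vectors given rather than estimated (real converters do the hard part, vector estimation, and handle occlusion properly; none of that is attempted here):

```python
# Toy 1-D sketch of motion-compensated interpolation: each source pixel
# is displaced by a fraction t of its motion vector to build the frame
# at an intermediate time, instead of simply repeating or blending frames.

def interpolate(frame: list[float], vectors: list[int], t: float) -> list[float]:
    """Build the frame at time fraction t (0..1) toward the next frame."""
    out = [0.0] * len(frame)
    for x, v in enumerate(vectors):
        dst = x + round(v * t)                  # where pixel x has moved to by time t
        if 0 <= dst < len(out):
            out[dst] = max(out[dst], frame[x])  # brightest source wins on collisions
    return out

row = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]     # a bright dot at x = 2
vec = [0, 0, 4, 0, 0, 0]                 # the dot moves 4 px per frame
print(interpolate(row, vec, 0.5))        # [0.0, 0.0, 0.0, 0.0, 1.0, 0.0]: dot halfway, at x = 4
```

The point is that the intermediate frame places the object where it *should* be at that instant, which is why motion-compensated conversion avoids the duplicate images that plain frame repetition produces.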

Rive said:
For me, I see far higher chance of (partial) resurrection for plasma than for CRT.
Haha. And heat your house at the same time: 'electronic tubes' use as much power for their heaters as your whole solid-state display, I'd bet. But we still use one electron tube in our homes, and there is, as yet, no alternative. The heroic magnetron is a legend; I read a story that the very first one manufactured in a lab worked really well, the result of some brilliant lateral thinking about how electrons behave in magnetic fields.
 
  • #51
Okay, so modern displays don't seem able to reach the super-high refresh rates needed for that clarity. Do you think this is due to a limitation in how fast the logic can refresh the screen, or could it be done if the manufacturers felt so inclined?

I feel like this is probably a dumb question that stems from me not knowing how electronics work, but if it is a logic issue, as absurd as this sounds, would it be possible to use a micro CRT that does the logic of painting the pixels sequentially, and some sort of analog system that converts that into driving an OLED to light up the screen in exactly the same way? So instead of painting a phosphor, the beam paints a grid of sensors, each representing a pixel on the OLED.

I watched a video on analog computers and how they can do specific calculations at insane speeds for specific tasks. That's kind of what I was thinking here, since I'd want the larger display to mimic the actions of the smaller one.
 
  • #52
jman5000 said:
as absurd as this sounds, would it be possible to use a micro crt that does the logic of painting the pixels sequentially and make some sort of analog system to convert the logic to the oled to light up the screen in the exact same way?
You are right; it does sound absurd, and I don't think you will get much further by inventing these things in your head. The level of sophistication in TV monitors depends largely on how much money 'they' want to spend on development and how much profit they can make. I advise you to read around a bit more, outside your Blur Busters site, if you want a better understanding.
BTW, analogue computers were once the only way to solve certain problems, but digital computers have progressed in leaps and bounds. I remember, in the late 60s, being shown an analogue computer used to investigate underwater guided weapons. It was wardrobe-sized, and the guy was very proud of what it could do; but at the time almost no one had even a personal electronic calculator, and all computers occupied the best part of a large air-conditioned room. Things digital took off very fast after that, and I'm not aware of modern analogue techniques for problem solving. (No doubt someone will put me straight about that; such is PF.)
 
  • #53
In the video I watched about modern analog computing, the presenter claimed that modern digital computing chips were only 1-4x faster than the chip his company had, while their chip consumed just 3 watts.
That company went bankrupt, last I knew.
 
  • #54
Thread is veering off the OP topic, so it may be closed soon...
 
  • #55
berkeman said:
Thread is veering off the OP topic, so it may be closed soon...
That's fair. I don't really have any more questions.
 
  • #56
jman5000 said:
but If it is a logic issue
It is not. The last few generations of CRT TVs were already digital, and high-performance CRT monitors could not have been built without fine digital control. The issue was rather the difficulty of manufacture and the problems of wide-frequency optimization for the magnetic parts/drivers.
And, of course, the competition they lost.

sophiecentaur said:
I'm not aware of modern analogue techniques for problem solving.
It's a tech that's good to flirt with but hard to actually do, so it resurfaces from time to time, especially when some new area requires new solutions, and then vanishes just as fast.
This time it was neural networks, machine learning and AI.
No real success so far.
 
  • #57
Rive said:
especially when some new areas requiring new solutions
Imo, the way forward for that stuff is specialist 'co-processors' for commonly used functions in regular computers. But, of course, you need an extra bolt-on for every one of those functions. That's the beauty of the general-purpose digital computer.
Rive said:
This time it was with neural networks, machine learning and AI.
Both of those terms tend to be used out of their true context. Many examples of artificial intelligence are better described as 'sheer grunt', but which term would go best in an advert?
 
  • #58
I'm actually wondering: if I want to get more brightness out of a given amount of phosphor, is it impossible to get y amount of brightness without z amount of acceleration voltage? Does that mean making any type of CRT with higher brightness would be dangerous, with the acceleration voltage generating x-rays?
 
  • #59
Also, what physically happens to cause CRT burn-in? Could a phosphor generate 10x the nits, but be exposed to the beam for a much shorter time, and be just fine in terms of damaging and burning the phosphor?
 
  • #60
jman5000 said:
is it impossible to get y amount of brightness without z amount of acceleration voltage?
The beam current is also relevant (P = VI), so acceleration voltage isn't the only knob.
CRT design (particularly with three-colour systems) is compromise all the way. You have to get good beam focus and colour convergence over a huge range of angles, and there must also be a limit to the power density that the screen mechanism can handle.
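The trade-off can be put in one line: screen power is P = VI, so brightness headroom has to come from more voltage (the x-ray concern) or more current (the focus and mask-heating concern). A trivial sketch with illustrative figures, not datasheet values:

```python
# Power the electron beam delivers to the screen: P = V * I. The same
# brightness budget can come from more acceleration voltage (x-ray
# concern) or more beam current (spot bloom and mask heating).

def beam_power_w(anode_kv: float, beam_ma: float) -> float:
    """Beam power in watts from anode voltage (kV) and beam current (mA)."""
    return (anode_kv * 1e3) * (beam_ma * 1e-3)

# Illustrative figures for a colour CRT:
print(beam_power_w(25, 1.0))   # ~25 W into the screen at 25 kV, 1 mA
```

Phosphor efficiency then decides how much of that power becomes visible light rather than heat, which is where the screen-loading limit bites.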
 
