Display technology: Q & A

I have a number of lingering questions from over the years that no one could or would ever answer, so hopefully someone here can! And feel free to post your own.
 
(1) Why do CRTs seem clearer, with higher contrast, when running at a lower refresh rate? Is it because the gun has more time on each phosphor on each pass?

(2) Why 720 and 1080 resolutions? If MPEG1/VCD was 240 lines, and MPEG2/DVD was 480 lines, why not make 960 lines the new HD standard? It would have made scaling much easier. Or was it to consolidate the differing NTSC/PAL resolutions?

There were more... racking my brain now...

*Update*

(3) Back before DisplayPort was released, I heard something about it ditching the whole refresh rate thing and addressing the pixels directly. I can't remember where I read that preview, but does anyone know what came of that idea? Or was that previewer full of crap?

(4) Why do 6500K ("daylight") light bulbs look too blue, but the 6500K setting on monitors seems a tad on the warm side (like a 4100K bulb)?

*Update 2*

(5) Can the contrast ratio, at a given bit depth, be too high when it comes to slow fades? (I don't mean the meaningless dynamic contrast ratios, but the actual static ratio.) At 24-bit color, for example, you have a surprisingly limited number of luminance levels (if maintaining the same color). Would a higher contrast ratio make fades more jarring? For example, if a scene in a movie fades slowly from white to black, would a contrast ratio of 1000:1 make each fading frame drop luminance twice as much as a screen with a 500:1 ratio?
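
Here's the back-of-envelope I've been playing with, as a Python sketch. The 120 cd/m2 white level and the plain 2.2 gamma are just my assumed numbers, and I'm treating the black level as white divided by the static contrast ratio:

    # How big is one step of an 8-bit fade on a 500:1 vs a 1000:1 panel,
    # assuming the same white level and a simple 2.2 gamma curve?
    WHITE = 120.0   # cd/m2, assumed
    GAMMA = 2.2     # assumed

    def luminance(code, contrast):
        black = WHITE / contrast          # static contrast sets the black floor
        return black + (WHITE - black) * (code / 255.0) ** GAMMA

    for contrast in (500, 1000):
        step = luminance(11, contrast) - luminance(10, contrast)
        print(f"{contrast}:1  step from level 10 to 11 = {step:.4f} cd/m2")

Under those assumptions the absolute size of each step barely changes between the two panels; what the higher ratio really changes is how much deeper toward black the last few steps reach.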
 
(1) Why do CRTs seem clearer, with higher contrast, when running at a lower refresh rate? Is it because the gun has more time on each phosphor on each pass?

(2) Why 720 and 1080 resolutions? If MPEG1/VCD was 240 lines, and MPEG2/DVD was 480 lines, why not make 960 lines the new HD standard? It would have made scaling much easier. Or was it to consolidate the differing NTSC/PAL resolutions?

There were more... racking my brain now...

I can't answer the first question as I don't have access to a CRT. Ask in the FW900 thread to see if someone with knowledge of halation behaviour can comment.

Secondly, the 1920x1080 HDTV standard came from a desire to display roughly two-megapixel images in the agreed-upon 16:9 aspect ratio. It is meant, at typical HDTV viewing distances, to produce a picture that fills a larger portion of the viewer's field of view.
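
For anyone who wants to check that arithmetic:

    # 1920x1080 really is ~2 megapixels, and exactly 16:9
    print(1920 * 1080)          # 2,073,600 pixels
    print(1920 / 1080, 16 / 9)  # both 1.777...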
 
(1) Why do CRTs seem clearer, with higher contrast, when running at a lower refresh rate? Is it because the gun has more time on each phosphor on each pass?
FWIW, CRTs at a low refresh rate are not clearer... Running a tube @ 60Hz, the scan lines are visible and tiring, whereas with LCDs this does not happen.
CRTs are best at 85Hz and over, ideally 100Hz, and there is a reason the manufacturers stated optimal settings. Yes, I could get a 19-incher running at 1920x1440 @ 60Hz, but not for long viewing sessions. On the flip side, lowering the resolution lets you raise the refresh rate, making it much easier on the eyes.
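
A quick back-of-envelope on why that trade-off exists; the ~5% vertical blanking overhead is my rough assumption, and you'd compare the results to the maximum horizontal frequency on your monitor's spec sheet (often somewhere around 96-110 kHz on a good late-model tube):

    # Horizontal scan rate a CRT needs for a given mode:
    # roughly (visible lines + ~5% vertical blanking) x refresh rate.
    VBLANK_OVERHEAD = 1.05  # assumed

    def hscan_khz(visible_lines, refresh_hz):
        return visible_lines * VBLANK_OVERHEAD * refresh_hz / 1000.0

    for lines, hz in [(1440, 60), (1440, 85), (1200, 85), (960, 100)]:
        print(f"{lines} lines @ {hz} Hz -> ~{hscan_khz(lines, hz):.0f} kHz horizontal")

The lower the line count, the more refresh you can buy before you hit the tube's horizontal scan limit.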
 
(2) Why 720 and 1080 resolutions? If MPEG1/VCD was 240 lines, and MPEG2/DVD was 480 lines, why not make 960 lines the new HD standard? It would have made scaling much easier. Or was it to consolidate the differing NTSC/PAL resolutions?

Perhaps they wanted a standard with less-than-ideal scaling so that HD content would look even better comparatively...
 
You have to understand that each step of the way involved new technology, and it took time for the studios and manufacturers to come up with a standard they would both agree upon. The 480-line standard was almost a given, because it allowed existing TV sets (tube sets) to use it without any conversion or replacement. The result was the DVD player and a world thrilled with vastly improved picture and sound quality over the videotape system.

That satisfied the industry for years, but as technology advanced, the major manufacturers became aware that there was hardly any profit left in selling DVD machines, since the market had become saturated, and they started looking into a new, even better DVD-type system. Again they worked with the studios, and this time with the TV broadcast companies as well, on a totally new standard. They finally came up with the 720p and 1080p standard that the studios and broadcasters would fully agree upon; it was mainly the studios that drove the new standard. Since this was going to require all the hardware to be changed (lasers, broadcast frequencies and equipment, and TVs), they felt free to set a standard other than simply doubling the line count to 960, while staying within the technology available at the time.
 
(1) Why do CRTs seem clearer, with higher contrast, when running at a lower refresh rate? Is it because the gun has more time on each phosphor on each pass?

You should see improvements as you increase the refresh rate, say from 60Hz to 85Hz. But if you push too far, to something like 120Hz, you can start to run into the limits of the analog electronics, cables, etc., which can lead to ringing in the signal and a fuzzier picture. How soon this happens depends on the quality of your analog electronics, cables, and so on.
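
To give a feel for how quickly the demands grow, here's a rough sketch; the ~35% blanking overhead is my ballpark figure for classic CRT timings, not an exact spec:

    # Approximate pixel clock = visible pixels x blanking overhead x refresh rate
    BLANKING = 1.35  # assumed

    def pixel_clock_mhz(w, h, hz):
        return w * h * BLANKING * hz / 1e6

    for hz in (60, 85, 120):
        print(f"1600x1200 @ {hz} Hz -> ~{pixel_clock_mhz(1600, 1200, hz):.0f} MHz pixel clock")

Every extra hertz pushes the pixel clock higher, and with it the bandwidth the RAMDAC, cable and video amplifiers have to pass cleanly.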

Or was it to consolidate the differing NTSC/PAL resolutions?

This. HDTV was an attempt to consolidate international video standards. First there was a lot of compromise/argument over what the aspect ratio would be, and the 16:9 compromise that was reached actually didn't match anything already in use.

Next, NTSC is essentially 480 lines and PAL is 576. Likely both camps would have wanted a perfect doubling of their vertical resolution for HD; that would be 960 or 1152. 1080 is nearly halfway between.

720 is the high-speed alternative. 1080 at 30Hz is very close to 720 at 60Hz in bandwidth terms (for broadcast; there is no 1080/60Hz broadcast).
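
A rough pixel-rate check on that, counting active picture only and ignoring blanking and compression:

    # Uncompressed active-picture pixel rates of the two HD broadcast formats
    print("1080 @ 30:", 1920 * 1080 * 30)  # 62,208,000 pixels/s
    print(" 720 @ 60:", 1280 * 720 * 60)   # 55,296,000 pixels/s

They land within about 13% of each other, which is why the two can coexist as roughly equal-bandwidth options.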
 
Thanks for the replies thus far. My heart can be a little more at ease knowing there was some logic involved in the HD resolution choice. I contacted Panasonic HQ a number of years ago about this, and they basically just said, "Ask your broadcaster"... umm... thanks?

As for the CRT thing, a few more details: it was worse on some monitors than others, but it seemed to affect every CRT I ever saw. Each step up in refresh rate stabilized the image more and more (I could still see the difference between 75Hz and 85Hz, but beyond that it looked completely stable), but along with that benefit came a two-fold negative. One, the image seemed to lose contrast: images didn't "pop" like they used to. Two, the image looked blurrier, especially as I pushed the monitor up to and beyond its max specs (like running a 21" that was rated for 85Hz at 1600x1200 at 100Hz).

Since the problem is two-fold, I think the cause may be two-fold as well. The image losing that "pop" was probably due to the aforementioned halation effect. If the gun is scanning the screen more slowly, the overall brightness may be the same, but the scan line would be significantly brighter than the rest of the screen, since at lower refresh rates the non-active phosphors have more time to dim. Our eyes would retain that sense of intensity for longer than it was actually on screen. As for the loss in clarity, I suppose the gun loses precision as it's driven faster and faster.

Or am I just crazy? Has no one else noticed these drawbacks of high refresh rates on CRTs?
 
I definitely noticed a degradation when a CRT was pushed too far. You are dealing with many analog subsystems and forcing them to run faster and faster, and thus closer to their limits. That goes for the outputs on the graphics card, the cabling, the connectors, and the beam control circuitry.

It would be unusual if it didn't start to break down.
 
There is something called phosphor persistence, which is how long the phosphor stays lit after the electron beam hits the screen. A monitor designed to run at 85Hz is going to have a different design than a standard monitor.

Back in the old days I had a flat-screen Magnavox computer CRT with very, very long-persistence phosphors. The colors were amazing... but motion got blurred, as the phosphors were still glowing when the next scan came along.

For a high refresh rate at a high resolution, the difference between a high-speed VGA cable and a normal one was huge; a standard cable would not deal with 1600x1200 at an 85Hz refresh... as Snowdog alludes to above.
 
Updated with other questions at the top. I'm especially perplexed by the color temperature setting of monitors. Whether I use the OSD alone or use my Spyder 2 to calibrate to 6500K, all monitors at 6500K look warmer than my high-CRI 5900K bulbs.
 
A CRT display has a glass picture tube covered in phosphors, and an electron gun sitting at the back. To create an image, the electron gun moves and shoots a beam at the phosphors on the front of the tube, briefly lighting them up. It starts at the top left corner of the screen and rapidly 'paints' the screen line by line, across and down the face of the tube. Once it hits the bottom, the electron gun turns off and goes all the way back to the top left corner to start again; this pause between refreshes is called the Vertical Blanking Interval. A CRT display refreshes itself many times a second, so even though at any one point in time much of the screen may actually be blank, waiting to be redrawn by the electron gun, your eyes - due to persistence of vision - still see the image that was displayed there a fraction of a second ago.

LCDs don't have a refresh rate. They emulate one to be compatible with a GPU.
 
I think the film industry also impacted the switch from 5:4 or 4:3 to a widescreen format on TVs and monitors. Movies have been filmed in widescreen formats for the longest time, and had to be cropped quite a bit on the sides to fit standard definition for home viewing.
 
A CRT's sharpness depends on its dot pitch as well. The lower the dot pitch, the finer the detail and the higher the resolution it can show. Any CRT under 85Hz will hurt your eyes, as you can notice the flickering.

16:10 has become sort of a PC-exclusive ratio for monitors. But it's being phased out now in favor of 16:9.
 
LCDs don't have a refresh rate. They emulate one to be compatible with a GPU.

I thought they did have a refresh rate, only because of how the LCD panel matrix works. It isn't "flashed" all at the same time, just fast enough not to matter? Otherwise the LCD controller is the limiting factor, along with panel type; if not, we'd have true 240Hz++ monitors running on DP already :p
 
The new question (5) occurred to me again last week as I watched a fading scene in a movie - I could plainly see the jerky drops in brightness as the scene faded to black. I've never compared a high-contrast and a low-contrast monitor side by side, though, so I'm not sure.
 
As to blur, I think it depends in part on the quality of the underlying electronics. My Sony GDM CRTs look quite clear at various frequencies, whereas an NEC CRT I picked up new old stock more recently is less flexible in that regard...

The quality of the cable also makes a difference....
 
Updated with other questions at the top. I'm especially perplexed by the color temperature setting of monitors. Whether I use the OSD alone or use my Spyder 2 to calibrate to 6500K, all monitors at 6500K look warmer than my high-CRI 5900K bulbs.

Just a question in between: will changing to another color temperature shorten your monitor's lifespan?
 
Secondly, the 1920x1080 HDTV standard came from a desire to display roughly two-megapixel images in the agreed-upon 16:9 aspect ratio. It is meant, at typical HDTV viewing distances, to produce a picture that fills a larger portion of the viewer's field of view.

Yep... and this makes 16:9 a great ratio for HDTV, since your field of view from a distance is widescreen.

16:10 has become sort of a PC-exclusive ratio for monitors. But it's being phased out now in favor of 16:9.

Unfortunately, this is where marketing beat science. 16:9 is a terrible ratio for a close-range monitor, especially since we humans are used to reading things up and down. Portrait is still the standard for paper documents, not landscape. There are solid reasons for that, unfazed by marketing claims.
 
Yep... and this makes 16:9 a great ratio for HDTV, since your field of view from a distance is widescreen.

Unfortunately, this is where marketing beat science. 16:9 is a terrible ratio for a close-range monitor, especially since we humans are used to reading things up and down. Portrait is still the standard for paper documents, not landscape. There are solid reasons for that, unfazed by marketing claims.

The 1920x1080 HDTV standard has an optimal viewing distance of about 3.1x the picture height.
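
To make that concrete, here's the 3.1x figure plugged into a few common screen sizes (the sizes themselves are just examples):

    # Optimal viewing distance for a 16:9 set, using the 3.1x picture-height rule
    import math

    def optimal_distance_inches(diagonal_in):
        height = diagonal_in * 9 / math.hypot(16, 9)  # height of a 16:9 screen
        return 3.1 * height

    for d in (42, 50, 60):
        print(f'{d}" set -> ~{optimal_distance_inches(d) / 12:.1f} ft viewing distance')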

As for your second claim, which I have highlighted: many existing and historical writing systems are read horizontally. As for 16:10, I agree. It provides a much better scheme for productivity than 16:9 while still maintaining good area utilisation for widescreen media.
 
(3) Back before DisplayPort was released, I heard something about it ditching the whole refresh rate thing and addressing the pixels directly. I can't remember where I read that preview, but does anyone know what came of that idea? Or was that previewer full of crap?

I found some information about this on Wikipedia. Here's a snip:

"Direct Drive Monitor 1.0 standard was approved in December 2008. It allows for controller-less monitors where the display panel is directly driven by the DisplayPort signal, although the supported resolutions and color depth are limited to 2-lane operation."
 
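To put rough numbers on the 2-lane limit mentioned in that snippet, here's a sketch using the standard DisplayPort figures of 2.7 Gbit/s per lane (HBR) with 8b/10b coding leaving ~80% for data; the 138.5 MHz reduced-blanking pixel clock for 1080p60 is my assumption about the kind of mode it was aimed at:

    # What can two DisplayPort lanes carry?
    lanes = 2
    lane_rate_gbps = 2.7                      # HBR lane rate
    effective = lanes * lane_rate_gbps * 0.8  # 8b/10b coding overhead
    needed_1080p60 = 138.5e6 * 24 / 1e9       # 1080p60, 24-bit color, reduced blanking

    print(f"2-lane budget:          {effective:.2f} Gbit/s")
    print(f"1080p60 @ 24 bpp needs: {needed_1080p60:.2f} Gbit/s")

So a 2-lane link comfortably drives a 1080p60 panel at 24-bit color, but not much beyond that, which is presumably why the spec notes the resolution and color-depth limits.
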
The idea of 60Hz refresh rates on LCD is not the same as on CRT/plasma. On plasma the pixels turn on and off; on LCD it's emulated, since you can't turn the pixels on and off, so they add black frames in between, which is why b2b/g2g response times are advertised: instead of the pixel turning off, it goes to black and back.

That's how I understood it, anyway :p Why this is done is beyond me; I read it was due to needing that slight flicker to keep the movie/TV look going, without having things look too real :p like what the 120/240Hz TVs do with interpolation, which makes it look like a soap opera.
 
I have a RIM BlackBerry PlayBook. The 1024x600 on a 7" glossy screen is very, very sharp. I realize a 22" or 24" LCD will not have the same density as this device, but what is the closest I can get to a monitor with similar sharpness and color saturation (the latter helped by being glossy)?
 
I have a RIM BlackBerry PlayBook. The 1024x600 on a 7" glossy screen is very, very sharp. I realize a 22" or 24" LCD will not have the same density as this device, but what is the closest I can get to a monitor with similar sharpness and color saturation (the latter helped by being glossy)?

Probably a glossy monitor :p? lol. HP has a cheaper one that is glossy; otherwise Apple Cinema Displays are glossy as well, and the higher-end ones use IPS panels.
 
Probably a glossy monitor :p? lol. HP has a cheaper one that is glossy; otherwise Apple Cinema Displays are glossy as well, and the higher-end ones use IPS panels.

A 21.5" 1920x1080 panel will have a pretty fine dot pitch. There are a few models out there not much bigger (22-24") with even higher resolution, and thus an even finer dot pitch (I know there was a Dell, I think glossy... not sure if it was IPS though).
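
For reference, here are rough pixel densities; the 24" 1920x1200 line is just an example of the higher-resolution near-24" panels mentioned above:

    # Pixel density = diagonal resolution / diagonal size
    import math

    def ppi(w, h, diag_inches):
        return math.hypot(w, h) / diag_inches

    print(f'PlayBook 7" 1024x600: {ppi(1024, 600, 7.0):.0f} ppi')
    print(f'21.5" 1920x1080:      {ppi(1920, 1080, 21.5):.0f} ppi')
    print(f'24" 1920x1200:        {ppi(1920, 1200, 24.0):.0f} ppi')

Even the densest common desktop panels sit well below the PlayBook, so "similar sharpness" really means "as fine a pitch as desktop monitors currently get".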
 