Martha Stewart
Gawd · Joined Apr 14, 2011 · Messages: 668
https://rog.asus.com/articles/gamin...is-the-worlds-fastest-esports-gaming-monitor/
Overkill or Necessary?
I don't normally like to cite Linus, but he did a pretty good test of high refresh rates, and it turns out 240 Hz provides a significant performance increase over 144 Hz, especially for casual gamers, so more refresh rate can't hurt.
Would this still be the case on non-sample-and-hold displays? Who knows, but strobing has proven pretty difficult to implement in conjunction with variable refresh rates. Every attempt seems to have major flaws. So maybe more refresh rate is the best way to go.
That said, higher-quality strobing is arriving: brighter color strobing in 240 Hz 1 ms IPS, especially when you have sufficient refresh rate headroom. (120 Hz strobing at 240 Hz is higher quality than 120 Hz strobing at 144 Hz.)

Strobing is basically PWM, which is eye-fatiguing, and it requires very high minimum frame rates to do properly, which means lower graphics settings or simpler games. It also mutes and dulls screen brightness and vibrance, so it will likely always be incompatible with HDR and HDR color volume ranges.
Long term, we need much more advanced interpolation (without significant lag and without artifacts) to fill frames, eventually for something like 100 fps x10 interpolated at 1000 Hz, I hope. That would have the same pixel persistence / sample-and-hold blur as a professional CRT, ~1 ms, which is essentially "zero" blur.
https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/
Trying to fill very high Hz ceilings with raw frame rates, while at the same time increasing screen resolutions and playing very demanding games with higher and higher graphics fidelity, is not going to work. So it looks like better interpolation multiplying a good base frame rate is the way to go in the long run. The same applies to the even more extreme combined per-eye resolutions of VR and AR going forward.
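To put rough numbers on the blur claims above, here's a minimal sketch of the persistence math: on a sample-and-hold display, each frame is held for 1/fps seconds, and the blur width is persistence times eye-tracking speed. The tracking speed used here is an illustrative assumption, not a measured figure.

```python
# Sample-and-hold motion blur: each frame is displayed (held) for 1/fps
# seconds, so an eye-tracked object smears by roughly
# (persistence * tracking speed) pixels.
def persistence_ms(fps):
    return 1000.0 / fps

def blur_px(fps, pixels_per_second):
    return (persistence_ms(fps) / 1000.0) * pixels_per_second

SPEED = 2000  # assumed eye-tracking speed in pixels/sec (a fast pan)
for fps in (100, 360, 1000):
    print(f"{fps} fps: {persistence_ms(fps):.1f} ms persistence, "
          f"~{blur_px(fps, SPEED):.0f} px of motion blur")
```

At 1000 fps the persistence drops to 1 ms, matching the "essentially zero blur" CRT comparison in the post above.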
It's headroom. It doesn't matter if you reach those framerates; you are still getting more than 144 Hz or 240 Hz monitors if you can play games at, say, 280-300 fps. Assuming the panel response time can keep up, there is no reason why you would not want as high a refresh rate as possible.
In reality though these will most likely be more expensive than they are worth.
Exactly. Headroom.
You don't need 360fps to benefit from 360Hz.
1. Quick Frame Transport. Even at 100fps, your frames are transmitted to the monitor in 1/360sec. 100fps at 360Hz GSYNC is way, way, way lower lag than 100fps at 144Hz GSYNC.
2. Much less visible tearing if you use VSYNC OFF. Tearlines are visible for only 1/360 sec.
3. More headroom for good quality strobing. Best strobing occurs when you strobe at roughly half the max Hz refresh rate (Technical: Cramming LCD GtG in a large VBI)
4. Humongous VRR range you can drive a truck through. You want your VRR range wider than your framerate range so you don't have to worry about side effects (lag, stutter) of going outside your VRR range.
And also, 360fps provides what is essentially strobeless ULMB. 360fps = 1/360sec = 2.8ms persistence. That's getting pretty close to LightBoost (2ms persistence), but without strobing. Blurless sample-and-hold is slowly arriving FTW!
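A quick back-of-envelope check of the two numbers above (the ~2.8 ms persistence figure, and the Quick Frame Transport delivery-time advantage from point 1), sketched in a few lines:

```python
# Sample-and-hold persistence at a given frame rate, and the time to
# transmit one frame to the monitor at a given max refresh rate.
def persistence_ms(fps):
    return 1000.0 / fps

def delivery_ms(max_hz):
    return 1000.0 / max_hz

print(f"360 fps persistence: {persistence_ms(360):.1f} ms (vs ~2 ms LightBoost)")
print(f"Frame delivery: {delivery_ms(360):.1f} ms at 360 Hz vs "
      f"{delivery_ms(144):.1f} ms at 144 Hz, at the same 100 fps")
```

The delivery-time gap is why 100 fps at 360 Hz GSYNC has lower lag than 100 fps at 144 Hz GSYNC: each frame finishes scanning out much sooner.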
VSYNC OFF will become far less necessary once ultra-Hz equalizes the latency of GSYNC, FreeSync, VSYNC ON, and VSYNC OFF. The higher the Hz, the less differential between sync technologies. VSYNC OFF is simply a popular band-aid that continues to be used because of slow-scanning refresh cycles. Ultra-Hz gives the user the power to choose without worrying about lag.
And readers dismissing 360 Hz cost, remember, 4K used to cost five figures in IBM T221 days. Today, 4K is a $299 Walmart Boxing Day sale. Please at least thank 360 Hz for eventually commoditizing 120Hz, 144Hz and 240Hz. Even 120Hz is coming to your favourite smartphone this decade for example!
This is simply a refresh rate race to retina refresh rates. Intermediate Hz gradually becomes more affordable as higher new Hz slowly commoditizes previous Hz.
LCD too. Surprisingly. Remember, 144Hz LCD was a pipe dream back in the old 33ms 60Hz LCDs of the early-to-mid 1990s.

Great post and good points, some of which I hadn't considered. I look forward to the inevitable 1000 Hz monitors, which OLED is capable of today if the driving electronics supported it.
An Amazon reviewer already wrote that the XG270's strobing is superior to Sony FW900 CRT, thanks to the new Blur Busters Approved program that I worked with ViewSonic on.
What's your thoughts on XG270 vs zowie XL2746S with dyac? Does the viewsonic darken a lot more?
I recently got a 144 Hz monitor. Then I saw a 240 Hz one, and the legibility of moving text was way better, but not perfect. I imagine 360 Hz is even better.
Yep! I don't know what is up with people's faculties... As if they don't notice or care and then rant about it online. So annoying! It is like socialism for the senses! "Everyone use shit hardware because my senses are damaged."
Yes, not to beat the point to death, but as you indicated, you have to be filling the new refreshes per second (Hz) with enough frames per second (fps) to match, or at least be living somewhere in that new ceiling's heights, which is impossible in a lot of things outside of desktop and 2D-perspective games. That's why I keep mentioning the need for more advanced, high-quality interpolation, at least for first- and third-person games with any kind of GPU demand, especially at 4K resolution or higher, and as Hz maximums increase on monitors (and on VR headset resolutions per eye, no less). Other tech like foveated rendering, checkerboarding, and dynamic resolution can help, but they can't hit the straight-up multiples that interpolation can. For example: 90 fps interpolated x3 for ~270 fps, capped lower for a 240 Hz display, or 90 fps interpolated x4 for 360 fps, capped as needed on a 360 Hz display. Or, for lower motion definition, a solid 60 fps x4 for 240 fps at 240 Hz, or a solid 60 fps x6 for 360 fps at 360 Hz.
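The multiplier arithmetic in those examples can be sketched as follows. This is just one way to formalize the post's "interpolate then cap" examples; the function name and the cap-to-display behavior are assumptions drawn from the examples, not an established algorithm.

```python
import math

# Pick the smallest integer interpolation multiplier that fills the
# display's refresh ceiling, then cap the delivered rate at the panel Hz
# (matching the "90 fps x3 -> ~270, capped for 240 Hz" style examples).
def interp_plan(base_fps, display_hz):
    multiplier = math.ceil(display_hz / base_fps)
    delivered = min(base_fps * multiplier, display_hz)
    return multiplier, delivered

for base, hz in [(90, 240), (90, 360), (60, 240), (60, 360)]:
    m, out = interp_plan(base, hz)
    print(f"{base} fps x{m} = {base * m} fps, capped to {out} on a {hz} Hz panel")
```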
Of course there are easier-to-render games like Half-Life 2, Counter-Strike, Left 4 Dead 2, Portal, and Dishonored, plus similar graphics in a lot of other stylized games and most VR games, as well as isometric games (MOBAs, ARPGs, etc.).
So here's a dumb question. When I had my CRT monitors, there were situations in which my older graphics card (like the GTX-560 and GTX-770) were nowhere near hitting the refresh of 85 fps at higher resolutions, and yet still the motion looked pretty clear. Is it because of how the monitors scanned out the image? IE - would rolling scan monitors make this threshold lower?
Due to the way CRTs render, they break sample-and-hold motion blur. This is what LCDs with strobing backlights attempt to replicate, and the new LG OLEDs have a rolling-scan black frame insertion that does a similar thing, though its goal is more to avoid the drop in brightness that occurs with black frame insertion and strobing. CRTs didn't go very bright in the first place, so it wasn't an issue on them.
Without BFI, even an OLED with immediate response time has motion blur, because of the way our brain interprets the images.
OLED rolling scan can work with HDR but the chief problem is OLED light output. Rolling scan will reduce peak lumen output -- many HDR specifications have minimum light output specs -- IIRC, one of the specs (Dolby Vision, I think), specifies something like 500 nits for OLEDs and 1000 nits for LCDs.
The more refresh rate, the easier it will be to keep brightness during rolling scan, due to more rolling scan passes (of the same persistence -- "ON" duty cycle).
This article answers a lot of those questions
https://www.blurbusters.com/faq/oled-motion-blur/
Yes, without BFI, which is essentially PWM and is eye-fatiguing, especially at lower refresh rates, i.e. the VRR/G-SYNC/FreeSync roller coaster of Hz and fps on monitors attempting to combine BFI with VRR, so that people can still use VRR to push their graphics settings and resolutions higher. BFI and strobing dim the screen and mute color vibrancy, which makes them incompatible with HDR color heights.
Using a more advanced form of interpolation on future very-high-Hz monitors will reduce blur (at something like 360 fps at 360 Hz, or 480 fps at 480 Hz) and eventually essentially eliminate it at 1000 fps at 1000 Hz, without having to resort to BFI/strobing/rolling scan.
BFI and strobing are often quoted as lowering brightness by around the same percentage as they reduce blur. For example, a 20% blur reduction means a 20% reduction in brightness, and a 75% blur reduction means a 75% reduction in brightness. That is a huge hit on screen brightness capability and will not work with HDR color brightness ranges. OLEDs are already limited in peak and sustained brightness per % window.
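As a sketch of why that tradeoff collides with HDR: both blur reduction and brightness scale with the strobe duty cycle, so you can check whether a strobed panel still clears a nominal HDR brightness floor. The 500-nit floor echoes the spec figure mentioned earlier in the thread, and the panel peak values are illustrative assumptions.

```python
# Strobing trades brightness for blur: an X% blur reduction costs
# roughly X% of brightness, since both scale with the strobe duty cycle.
def strobed_nits(peak_nits, blur_reduction):
    duty = 1.0 - blur_reduction  # e.g. 75% blur reduction -> 25% duty cycle
    return peak_nits * duty

HDR_FLOOR_NITS = 500  # assumed minimum, per the spec discussion above
for peak in (700, 4000):
    out = strobed_nits(peak, 0.75)
    verdict = "meets" if out >= HDR_FLOOR_NITS else "misses"
    print(f"{peak}-nit panel at 75% blur reduction -> {out:.0f} nits "
          f"({verdict} a {HDR_FLOOR_NITS}-nit floor)")
```

Under these assumptions, only an extremely bright panel survives aggressive strobing while still meeting an HDR floor, which is the incompatibility the post describes.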
However, BFI/strobing could also benefit from very high frame rates via interpolation combined with very high Hz if the black frames/strobes could keep up, with OLED response times for example. The lower the strobe rate, the more aggravating the PWM effect.
https://forums.blurbusters.com/viewtopic.php?t=3213
Either way, a much better form of interpolation combined with very high refresh rates seems to be the only way forward due to gpu limitations. https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/
I’m always disappointed in display news coming out of CES, as everyone reports the same “top 3” stories. If you can, please tell us more about some of these future directions you saw; I for one am most intrigued.

LCD too. Surprisingly. Remember, 144Hz LCD was a pipe dream back in the old 33ms 60Hz LCDs of the early-to-mid 1990s.
I have information that LCD will be able to go into the kilohertz territory within this human generation (aka 2030s).
However, OLED and microLED can join the party too. Just saying LCD's going to be a horse in this race for decades. Surprisingly so.
Fortunately, LCD blacks are solvable
I've even seen a cheap million-zone local-dimming "backlight" (Hisense Dual-Cell LCD at CES 2020), so don't assume LCD can't do halo-free blacks either. It looked better than some of the OLEDs I saw. I personally saw lots of impressive new display technologies at CES 2020, in person.
Yes, GtG needs to keep getting faster
The problem is that GtG (for the whole GtG heatmap of all color combos) needs to be reliably well under half a refresh cycle in order to not dilute or interfere much with the high Hz. But we've fallen from 33ms-50ms all the way down to 0.2ms-0.5ms GtG (for the 10%->90% segment). While progress is slowing, it is not stopping there either; there are already engineering paths to speed this up (and more consistently across all colors, too).
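The "well under half a refresh cycle" constraint above implies a rough ceiling on usable refresh rate for a given GtG speed. A minimal sketch of that rule of thumb (it treats "half a refresh cycle" as a hard bound, which is a simplifying assumption):

```python
# If GtG must complete within half a refresh cycle, then the maximum
# usable refresh rate is bounded by roughly 1 / (2 * GtG).
def max_hz_for_gtg(gtg_ms):
    return 1000.0 / (2.0 * gtg_ms)

for gtg in (5.0, 1.0, 0.5):
    print(f"{gtg} ms GtG -> up to ~{max_hz_for_gtg(gtg):.0f} Hz "
          "before GtG starts dominating the refresh cycle")
```

By this yardstick, today's 0.2ms-0.5ms panels already have headroom for four-digit refresh rates, consistent with the kilohertz-LCD claim earlier in the thread.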