6 best 4K TVs for HDR/SDR Console and PC gaming Oct 2016

The OS may be holding it back, but at least according to Wikipedia these game engines support HDR rendering, and it's possible that others could be patched to:

=====================================================
Game engines that support HDR rendering (games may be able to be modded/patched to support HDR monitors as well):
High-dynamic-range rendering - Wikipedia, the free encyclopedia

Unreal Engine 4
Unreal Engine 3[19]
Chrome Engine 3
Source[20]
CryEngine,[21] CryEngine 2,[22] CryEngine 3
Dunia Engine
Gamebryo
Unity
id Tech 5
Lithtech
Unigine[23]
Frostbite 2
Refractor 2[24]
Real Virtuality 2, Real Virtuality 3, Real Virtuality 4
HPL 3
Babylon.js[25]
 
I'm waiting on two things as far as 4K goes: 1) better than 60Hz (120Hz+), and 2) better, more powerful GPUs that can run games at their highest settings. I don't want to need more than two GPUs to do this, preferably only one GPU. I hear #1 could be next year; #2 is at least a couple of complete GPU generations away still.
 
The OS may be holding it back, but at least according to Wikipedia these game engines support HDR rendering, and it's possible that others could be patched to: (engine list quoted above)

Well, I thought they only support HDR rendering internally on the card, but the engine still only outputs 32-bit? You know, so they could fake a larger dynamic range (bloom)?

It will still require a patch to enable 40-bit output, and you probably won't notice anything different until they make a game with HDR source textures.
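To make that distinction concrete: internally these engines light the scene in floating point, then tone-map and quantize the result down to the 8-bit-per-channel signal the OS and display path currently expect. A toy Python sketch of the idea; the Reinhard operator and the luminance values are illustrative only, not any engine's actual pipeline:

```python
# Toy illustration of "HDR rendering, SDR output": the engine computes scene
# luminance in floating point (values can exceed 1.0), then tone-maps and
# quantizes to 8 bits per channel for a conventional display.

def reinhard(x: float) -> float:
    """Simple Reinhard tone-mapping operator: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def to_8bit(x: float) -> int:
    """Clamp to [0, 1], apply a rough gamma-2.2 encode, quantize to an 8-bit code."""
    return round(255 * (max(0.0, min(1.0, x)) ** (1 / 2.2)))

# Made-up internal HDR values: deep shadow, midtone, bright sky, specular highlight.
for lum in (0.02, 0.5, 10.0, 100.0):
    print(f"internal {lum:6.2f} -> tone-mapped {reinhard(lum):.3f} -> 8-bit code {to_8bit(reinhard(lum))}")
```

The last two values differ by 10x in scene luminance but end up only about ten 8-bit codes apart, which is why engines resort to bloom to suggest the difference; a true HDR output path (10-bit+, PQ-encoded) could keep them distinct.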
 
I'm waiting on two things as far as 4K goes: 1) better than 60Hz (120Hz+), and 2) better, more powerful GPUs that can run games at their highest settings. I don't want to need more than two GPUs to do this, preferably only one GPU. I hear #1 could be next year; #2 is at least a couple of complete GPU generations away still.
The problem with your scenario is that the devs' challenge is to whittle games down to fit real-time speeds, not the other way around. Therefore, any time GPU power increases substantially, they can easily raise that ceiling until frame rates are back down in the mud on ultra/max at high resolutions. The same kind of jump can be seen in the past with heavy-handed graphics mods and massive downsampling.

The other problem is that stills/screenshots sell, and unless you have a high-Hz monitor with a powerful GPU, you can't see what you are missing versus that still-screenshot fidelity (or versus low-frame-rate YouTube gameplay video on your low-Hz monitor). So the still shots sell people even at 30 fps average in some cases, or at a 60 fps average rather than minimum (which is really a band of something like 30 - 90 fps) - though I find that difficult to understand personally.

I think eventually people will have to come to the realization that graphics ceilings are set arbitrarily by devs in the first place in order to "fit" real time. If GPUs were 2x or more faster, people would start to find the "sweet spot" of graphics fidelity vs. motion excellence and not bother shooting for "max", since no one could really use that setting for a few years (Crysis comes to mind, actually). The push for higher-resolution monitors and settings at the same time that SLI seems to be being abandoned by nvidia, game devs, and DirectX is having an effect too.

Some recent examples of games that aren't really capable of high Hz at high resolution, 2560x1440 in this case on a GTX 1080:
Witcher 3 with HairWorks OFF ~ 82 fps average (which is probably a band more like 50 - 120)
Far Cry Primal on VERY HIGH settings ~ 80 fps average

I agree about waiting for 120Hz+, variable Hz, and modern gaming overdrive on 4K monitors, but I'd probably run 1440p on them and use the 4K rez for desktop.
 
It looks good on paper and with test patterns, but in practical use, does 0.015 vs 0 black really make any difference? After playing with both sets under similar lighting conditions, I would say that the 0 black is only barely noticeable in a totally dark room, and I don't watch TV in a totally dark room. I have put ambient LED lighting on the back of all my sets to reduce eye strain, and the true black no longer makes a visible difference, and definitely not during daytime hours. So at the end of the day, I'll take 1000+ nits and 0.015 black over 660 nits and 0 black any day.
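One way to put rough numbers on that: light in the room reflects off the screen and adds to both the white and the black the panel produces, so the effective on-screen contrast of both sets converges as the room gets brighter. A back-of-the-envelope Python sketch; the reflected-light values are assumptions picked only to show the trend, not measurements of either TV:

```python
# Rough effective-contrast estimate with ambient light reflecting off the panel.
# All reflected-luminance values are assumptions for illustration, not measurements.

def effective_contrast(peak_nits, black_nits, reflected_nits):
    return (peak_nits + reflected_nits) / (black_nits + reflected_nits)

sets = {
    "1000-nit LCD, 0.015-nit black": (1000.0, 0.015),
    "660-nit OLED, ~0 black":        (660.0, 0.0001),  # tiny nonzero value to avoid dividing by zero
}

for room, reflected in [("totally dark room", 0.001), ("bias lighting", 0.05), ("daytime", 1.0)]:
    print(f"--- {room} (~{reflected} nits reflected off the screen) ---")
    for name, (peak, black) in sets.items():
        print(f"  {name}: ~{effective_contrast(peak, black, reflected):,.0f}:1")
```

In the dark room the OLED's advantage is enormous; once there is about a nit of reflected room light on the glass, both sets are down in the hundreds-to-one range and the extra peak brightness is what you actually notice.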

Also, don't try to compare any streamed 4K from various sites. Get a UHD player and pop in Pacific Rim and you'll see true HDR in effect. Yes, the colors are oversaturated by design, but it really shows off the difference between a BD66/100 disc and the various streaming services' so-called 4K HDR. When the spotlights on the Jaegers are pointed toward the screen, they are so bright at times that you almost have to avert your eyes, and since most of the action happens at night, you'll see those extreme brights while still having excellent dark detail in the same frame.

It is not just about black levels, in which OLED wrecks everything else. It's about bloom/halo. An OLED can have one pixel outputting max brightness and the pixel immediately next to it outputting absolutely zero light. This means bright objects on screen won't have the cursed bloom/halo that LCDs suffer from. Even the best FALD LCDs still have this problem; there just aren't enough zones. With OLED, you effectively have 8.4 million zones. Emissive displays will always be better than transmissive displays, even if transmissive displays can go brighter.
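For a sense of scale on the zone counts (the FALD numbers below are just typical examples, not any specific model):

```python
# Per-pixel emission vs. a handful of FALD backlight zones at UHD resolution.
uhd_pixels = 3840 * 2160                    # 8,294,400 self-lit "zones" on an OLED
for zones in (32, 128, 384):                # generic example FALD zone counts
    print(f"{zones:>4} zones -> ~{uhd_pixels // zones:,} pixels share each backlight zone")
print(f"OLED: {uhd_pixels:,} individually driven pixels")
```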
 
KS8000

The first cable was a Rocketfish and the second cable is now a Monoprice, both 3-4 feet. And the display is still occasionally blinking on and off on me with handshake problems. I tried another port on the GTX 1080 video card and it hasn't helped.

Bloody annoying. The question is whether it is worth taking it back and trying to exchange the set. This one has good IQ and no dead pixels that I've found yet. So I can either live with this bullcrap or risk taking it back and seeing what I get playing the lottery.

:/
 
I'm using this cable from Monoprice for 4K HDR, 4:4:4 etc. and it works without issues.
 
I'd probably return it. It's doubtful two cables had the same issue; especially at 3 feet on a modern cable there should be no problem.

What are you connecting to the TV? Have you tried a second device?
 
It is not just about black levels, in which OLED wrecks everything else. It's about bloom/halo. An OLED can have one pixel outputting max brightness and the pixel immediately next to it outputting absolutely zero light. This means bright objects on screen won't have the cursed bloom/halo that LCDs suffer from. Even the best FALD LCDs still have this problem; there just aren't enough zones. With OLED, you effectively have 8.4 million zones. Emissive displays will always be better than transmissive displays, even if transmissive displays can go brighter.

Yes, I'm sure you're going to get a one-pixel bright spot on 4K movies all the time. As I said, it looks good on paper and test patterns, but offers no practical advantage in anything but a totally dark room.
 
KS8000

The first cable was a Rocketfish and the second cable is now a Monoprice, both 3-4 feet. And the display is still occasionally blinking on and off on me with handshake problems. I tried another port on the GTX 1080 video card and it hasn't helped.

Bloody annoying. The question is whether it is worth taking it back and trying to exchange the set. This one has good IQ and no dead pixels that I've found yet. So I can either live with this bullcrap or risk taking it back and seeing what I get playing the lottery.

:/

Are you connecting through anything else (like a receiver)? I have my PC with a 1080 going to a Pioneer 4K receiver and then to a JU7500 with no problem. How often are you getting the dropouts? Did you do all the settings in the nVidia control panel?
 
I'm using this cable from Monoprice for 4K HDR, 4:4:4 etc. and it works without issues.
Same one I'm using now only mine is just 3' long.

I'd probably return it. It's doubtful two cables had the same issue; especially at 3 feet on a modern cable there should be no problem.

What are you connecting to the TV? Have you tried a second device?
I'm considering an exchange. It is not a slam dunk though. Besides being a PITA, there is significant risk that I'll end up with either the same issue again or a display with IQ problems like dead pixels or light bleed.

I have nothing else to try but my GTX 1080. No other PC to drive 4:4:4 60Hz via HDMI, and no set-top player.

Are you connecting through anything else (like a receiver)? I have my PC with a 1080 going to a Pioneer 4K receiver and then to a JU7500 with no problem. How often are you getting the dropouts? Did you do all the settings in the nVidia control panel?
The situation seems a little better with the Monoprice cable vs. the Rocketfish. I've had three incidents so far, all on the Windows desktop. The cable has been in for a couple of days now. The connection is direct from the GTX 1080 into the Samsung box on HDMI 2, as it suggests for PC input.

The NVidia control panel is set to YCbCr 4:4:4, 60Hz, 8-bit. Not sure what else there would be to change in there.

When the display blinks, it comes back and sometimes that is it, or it may blink again within a few minutes. HDMIyo fixes the issue until the next time I have to power cycle.
 
Try RGB? That shouldn't matter though...

You could try Blue Jeans Cable as a last resort. They're well regarded for HDMI cables with the AV crowd.
 
Same one I'm using now only mine is just 3' long.


I'm considering an exchange. It is not a slam dunk though. Besides being a PITA, there is significant risk that I'll end up with either the same issue again or a display with IQ problems like dead pixels or light bleed.

I have nothing else to try but my GTX 1080. No other PC to drive 4:4:4 60Hz via HDMI, and no set-top player.


The situation seems a little better with the Monoprice cable vs. the Rocketfish. I've had three incidents so far, all on the Windows desktop. The cable has been in for a couple of days now. The connection is direct from the GTX 1080 into the Samsung box on HDMI 2, as it suggests for PC input.

The NVidia control panel is set to YCbCr 4:4:4, 60Hz, 8-bit. Not sure what else there would be to change in there.

When the display blinks, it comes back and sometimes that is it, or it may blink again within a few minutes. HDMIyo fixes the issue until the next time I have to power cycle.

Ignore the HDMI 2 suggestion and try another HDMI port. I have it connected to HDMI port 1 and renamed it to PC; it works the same. It will also disable PC mode whenever it's fed a 24Hz signal and switch to movie mode. Quite handy! Dunno if HDMI 2 does the same or sticks to PC mode.
 
The reason why PC still does not have HDR: http://www.techhive.com/article/311...r-pc-still-cant-stream-4k-ultra-hd-video.html


Hollywood being paranoid about piracy. Surprise, surprise. Now, this is about movies and streaming, and gaming does not technically require the go-ahead from Hollywood, but it still requires support from the operating system, and since Microsoft is working very closely with the film industry to make DRM for 4K HDR as piracy-proof as possible (yeah, good luck with that...), I guess that is why things are moving so slowly in the gaming department too.
 
MaZa, that is an interesting article, thanks for the post. Read the whole thing just now. While we have a ways to go yet, I am glad I waited and didn't get a first-gen display. And speaking of waiting, it sounds like it may be quite some time (as in easily a year or more) before we see UHD playback from disc or streaming directly in Windows 10. My Ivy Bridge has zero support for any of this, but the display and the nvidia card are good to go. I'm not clear on whether a CPU upgrade will be required, since the nvidia card can handle all the work anyway. May not matter though, since my 3570K is finally beginning to look a little long in the tooth. Sure has lasted a good long while though.

May try the HDMI 2 suggestion too. It is an annoyance to have to use HDMIyo when the handshake problem pops up, but realistically the push of a hotkey typically fixes it until the next restart, which isn't that big of a deal.
 
Any opinions on which 4k would be good as a desktop monitor replacement?

I sit about 2-2.5 ft away from my current 27" 1440p.
 
Any opinions on which 4k would be good as a desktop monitor replacement?

I sit about 2-2.5 ft away from my current 27" 1440p.

If you live in Europe, I recommend the 43" KS7500. The American equivalent (KS8500) is limited to 49" minimum, the size which I have and use at 1 meter away (a bit over 3 feet), and even then it is a bit too large but still manageable. It would be useless at 2 feet though. I guess the equivalent-size Sony X800D (mind the panel lottery, some sizes may come with an IPS panel instead) is the next best thing, though it does have slightly higher lag and not as high a contrast ratio. Some people did complain about it having some blur despite having 4:4:4 support, but since Rtings makes no mention of such a thing, I think they may have just missed something - something was not turned on, or the sharpness slider was not set correctly (zero is not always correct). These are the only 4K TVs I looked at that seemed usable as monitors; others had either too high lag or not good enough support for HDR.


However, if HDR is not a concern, then the Samsung KU series makes an excellent basic 4K monitor. Low lag, good colors and high contrast. 24Hz playback suffers from judder unless you use motion interpolation, but at 60Hz there are no problems.
 
Thanks for the information man, much appreciated. I am indeed in Europe (well, for now at least - Brexit UK!!).

Do you know if the KS7500 supports 1:1 / dot-by-dot pixel mapping when fed non-native resolutions? I.e. if I feed it a 1080p or 1440p signal, will it display the image in the centre of the screen with no scaling (albeit with a black border around it)?

43" may be a little big but I'm willing to give it a go.
 
You can do 1:1 pixel mapping in software if you are using a PC. The options are in the Nvidia/AMD control panel.
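For reference, with GPU scaling set to "no scaling" the lower resolution just sits centered in the 4K raster; the border sizes are plain arithmetic (sketched below, nothing vendor-specific):

```python
# Black-border sizes when a lower 16:9 resolution is shown 1:1 on a 3840x2160 panel.
panel_w, panel_h = 3840, 2160
for w, h in [(2560, 1440), (1920, 1080)]:
    side, top = (panel_w - w) // 2, (panel_h - h) // 2
    print(f"{w}x{h} centered 1:1 -> {side} px bars left/right, {top} px top/bottom")
```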
 
Thanks for the information man, much appreciated. I am indeed in Europe (well, for now at least - Brexit UK!!).

Do you know if the KS7500 supports 1:1 / dot-by-dot pixel mapping when fed non-native resolutions? I.e. if I feed it a 1080p or 1440p signal, will it display the image in the centre of the screen with no scaling (albeit with a black border around it)?

43" may be a little big but I'm willing to give it a go.

In PC mode, no. The scaler is quite decent though, and it accepts nearly every 16:9 resolution you can create as a custom resolution, and the closer you are to 4K the less apparent the pixel-interpolation smoothing effect is. I play a lot of games at 2880x1620 and the amount of detail is immense. The smoothing is easily ignored at that high a resolution.
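The appeal of an in-between step like 2880x1620 is that it keeps 16:9 while being a fixed fraction of the native raster, so the scaler has less work to hide. Quick arithmetic for a few common custom resolutions (any 16:9 size the set accepts works the same way):

```python
# Render load and linear scale factor vs. native 3840x2160 for common 16:9 custom resolutions.
native_w = 3840
for w, h in [(2560, 1440), (2880, 1620), (3200, 1800), (3840, 2160)]:
    print(f"{w}x{h}: {w * h / 1e6:.1f} MPix rendered, {w / native_w:.0%} of native size per axis")
```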
 
Good to know. Thanks for the info.

Do you run yours in PC mode or Game Mode? Also, is the HDR functionality (such as it is) available in these modes?

You can do 1:1 pixel mapping in software if you are using a PC. The options are in the Nvidia/AMD control panel.

Ah yes of course, I remember that option there and I've played with it before.
 
Yes, I'm sure you're going to get a one-pixel bright spot on 4K movies all the time. As I said, it looks good on paper and test patterns, but offers no practical advantage in anything but a totally dark room.

The explanation is WHY OLED HDR is better, not that 4K movies will have single-pixel bright spots. And blooming/haloing is apparent on all images, not just in a "totally dark room". FALD is just a band-aid applied to an inferior technology.

People spending $4K-$8K on crap LCD tech is quite laughable - a technology that in ten years will be on the garbage heap of history.
 
Good to know. Thanks for the info.

Do you run yours in PC mode or Game Mode? Also, is the HDR functionality (such as it is) available in these modes?

Ah yes of course, I remember that option there and I've played with it before.

PC mode. Since I play mainly RPGs, the half-a-frame of input lag difference between PC mode and Game mode is pretty meaningless, really. Game mode does support HDR (or is it the other way around?), but whether PC mode currently supports it is yet to be seen. Hell, PCs themselves don't yet support HDR; the link I posted earlier on this page clarifies the situation. Mind you, HDR is not some effect you just turn on (though there is an HDR+ thingy that's meant to simulate HDR from SDR sources, but it's shit); it is a completely new way the picture is rendered. If you ever buy a properly HDR-capable screen, try some of the HDR sample clips available around the internet. The lifelike image is nothing short of jaw-dropping IMHO.
 
Which is why I spent ~$1800 on a 70" Vizio 4K VA screen with FALD for now. A 70" OLED is astronomical in price and is in an early generation right now. My 4K TV looks beautiful to me coming from a B7000-series 1080p edge-lit Samsung glossy VA. The FALD, even with 32 zones in my model, makes a huge difference, but again, I paid under $2k for a 70", not 4 or 5k. According to the AVS forum review, where they compared it directly to an F8500 plasma, the Vizio series has a great contrast ratio and black depth.

I performed ANSI-checkerboard contrast measurements on the calibrated panels. The M65, in Calibrated Dark mode, yielded 31 fL peak white and 0.0038 fL for black, an 8157:1 contrast ratio. The plasma struggled a bit with the ANSI pattern; peak whites were 30 fL and black measured 0.0058 fL, resulting in a contrast ratio of 5172:1.

http://www.avsforum.com/forum/166-l...-led-lcd-uhdtv-official-avs-forum-review.html
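Those AVS numbers are in foot-lamberts; converting to nits (1 fL ≈ 3.426 cd/m²) and taking white over black reproduces the quoted ratios to within rounding:

```python
# Convert the quoted ANSI-checkerboard readings from foot-lamberts to nits and
# recompute the contrast ratios (1 fL = 3.426 cd/m^2).
FL_TO_NITS = 3.426
readings = {
    "Vizio M65 (Calibrated Dark)": (31.0, 0.0038),
    "Samsung F8500 plasma":        (30.0, 0.0058),
}
for name, (white_fl, black_fl) in readings.items():
    print(f"{name}: {white_fl * FL_TO_NITS:.0f} nits white, {black_fl * FL_TO_NITS:.3f} nits black, "
          f"{white_fl / black_fl:,.0f}:1 contrast")
```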

I realize it isn't HDR and it's not even full 4:4:4 (passes 4:4:4 but shows 4:2:2, I believe, at least at 4K), but anyway it looks great for movies and TV and even desktop/browsing use IMO, and it was the smart buy for my purposes. Once OLED matures and becomes ubiquitous (along with more HDR and 4K content), and 70" models are merely very high priced rather than extreme - most likely in 3 - 5 years - I'll go all in. Hopefully by that time there will be some true OLED gaming monitors with full high-end gaming features on the market as well for desktop use. I'm looking to upgrade to a 1440-high VA of 120Hz+ with variable Hz in 2017, similarly to what I chose for my TV, to bridge the gap until then. To me, for my scenarios, OLED is not worth it yet.
 
The explanation is WHY OLED HDR is better, not that 4K movies will have single-pixel bright spots. And blooming/haloing is apparent on all images, not just in a "totally dark room". FALD is just a band-aid applied to an inferior technology.

People spending $4K-$8K on crap LCD tech is quite laughable - a technology that in ten years will be on the garbage heap of history.


Once again: as if you're going to notice the bloom on anything but a fairly static scene, or unless you freeze the screen and take a closer look. As I said, I have over 10 hours of viewing on the E6 under various lighting conditions, and beautiful as it is, I'll still take my KS9500 over it. In fact, the people spending $4-8K on current OLED are the ones being stupid, as the technology still has quite a ways to go.

P.S. I don't expect my 4K to go for 5 years, much less 10. I'll probably pick up another large-format 4K in a year or two and will keep doing that till I replace every TV in the house. By then, I'll probably be ready to replace this one with an 8K or whatever the flavor of the year is then.
 
A solid FALD implementation is great on a VA. My TV gets a 7625:1 contrast ratio uncalibrated, 8157:1 calibrated/tweaked. By comparison, with FALD off and uncalibrated it is 5882:1, so it loses roughly a third of its contrast ratio. The direct backlight has 32 zones, so dynamic contrast actually works cleanly without the whole screen changing or popping and losing contrast overall. 32 zones means 4 quadrants of 8 direct LED backlights, I would think. I think the P series has double that, but this looks amazing as it is. My edge-lit Samsung VA B7100 TV was a "good" screen for the time and I still use it as a rec-room TV, but it definitely had some flashlighting, clouding and bloom.
 
Once again: as if you're going to notice the bloom on anything but a fairly static scene, or unless you freeze the screen and take a closer look. As I said, I have over 10 hours of viewing on the E6 under various lighting conditions, and beautiful as it is, I'll still take my KS9500 over it. In fact, the people spending $4-8K on current OLED are the ones being stupid, as the technology still has quite a ways to go.

P.S. I don't expect my 4K to go for 5 years, much less 10. I'll probably pick up another large-format 4K in a year or two and will keep doing that till I replace every TV in the house. By then, I'll probably be ready to replace this one with an 8K or whatever the flavor of the year is then.

Well, I don't have much to discuss with someone who thinks an LCD is better than an OLED. Virtually the entire video industry would disagree with you.

http://www.rtings.com/tv/reviews/by-usage/movies/best

OLED for movies on RTINGs: 9.3. KS9500: 8.1. Just one example.
 
If you drop the stuff like 3D and viewing angle (no difference for a couple or a small family), and the scores that are the same or not related to picture quality, here's what you get. They have not reviewed the KS9800 yet, so I just took the local dimming score from last year's JS9500. I would assume the rest of the KS9800 would be at least as good as the KS9500, if not better.

Category            LG E6    KS9500    KS9800
Contrast            10       9.2       9.2
Black Uniformity    10       9.8       9.8
Local Dimming       10       3.5       8
Peak Brightness     6.7      8.2       8.2
Gray Uniformity     8.1      7.7       7.7
Pre-Calibration     8.4      8.6       8.6
Post-Calibration    9.6      9.8       9.8
Color Gamut         8.6      8.2       8.2
Reflections         9        9.5       9.5
Motion Blur         9.9      8         8
24p Playback        8.6      10        10
Input Lag           7.3      8.2       8.2

Total               106.2    100.7     105.2
Average             7.59     7.19      7.51


Then you have to ask yourself which of those matter more in practical use and which only make a difference on a test pattern/static image. The only thing the E6 has an advantage in for practical use is probably motion blur, while the KS9800 offers better daytime performance and brightness.
 
A quick update on PC games and HDR.

Shadow Warrior 2 on PC already supports HDR and has done so for the last two weeks or so, at least for Nvidia cards. Why I only just heard about it, I don't know. I have not yet bought the game, but now I really have to. Anyway, it seems you are not limited by the operating system after all if you want to make your game HDR capable; you just need to use exclusive fullscreen mode and not borderless.

http://www.pcworld.com/article/3131...ce-boosts-from-nvidias-multi-res-shading.html
 
If you drop the stuff like 3D and viewing angle (no difference for a couple or a small family), and the scores that are the same or not related to picture quality, here's what you get. (score table quoted above)

Then you have to ask yourself which of those matter more in practical use and which only make a difference on a test pattern/static image. The only thing the E6 has an advantage in for practical use is probably motion blur, while the KS9800 offers better daytime performance and brightness.

Previously you said you had a KS9500, which was reviewed by RTings, hence part of my reply. It doesn't matter though, as the KS9800 still cannot compete with OLED.

I also find it funny that when talking about these TVs for console and PC gaming, as per the thread title, people conveniently forget that FALD input lag numbers are atrocious and FALD MUST be turned off for gaming modes. Your contrast numbers just went in the shit hole, as the band-aid fix doesn't work anymore.

OLED has perfect blacks, infinite contrast ratio, zero bloom, perfect HDR*, and zero pixel-transition blur/ghosting, all at 60Hz 4K 4:4:4 chroma with ~34ms input lag. No LCD can even get on the same planet as those specs. Really, the only thing LCD can claim to be better at is higher peak brightness.

*HDR gaming mode firmware is still being tweaked; it works perfectly for movies.


A quick update on PC games and HDR.

Shadow Warrior 2 on PC already supports HDR and has done so for the last two weeks or so, at least for Nvidia cards. Why I only just heard about it, I don't know. I have not yet bought the game, but now I really have to. Anyway, it seems you are not limited by the operating system after all if you want to make your game HDR capable; you just need to use exclusive fullscreen mode and not borderless.

http://www.pcworld.com/article/3131...ce-boosts-from-nvidias-multi-res-shading.html

Sweet! I'll have to give it a try.
 
Then you have to ask yourself which of those matter more in practical use and which only make a difference on a test pattern/static image. The only thing the E6 has an advantage in for practical use is probably motion blur, while the KS9800 offers better daytime performance and brightness.

Contrast? Black uniformity? Local dimming? Color gamut? I.e. the things that go into good PQ, both generally and in games. Those matter on more than a test pattern...
 
With FALD off, my 70" 4K VA TV still gets up to a 5882:1 contrast ratio, compared to an F8500 plasma which gets 5172:1. That's a drop from 8157:1 on my TV with FALD on, but it's still good. Compare that to gaming VAs, which are almost always 3000:1 max (the Eizo was 4850 out of 5000, though). The input lag in game mode is 17ms to 20ms, and the response time is low for a TV, especially a VA TV, at around 10ms. I can also run native 120Hz input at 1080p for gaming if I want off an HDMI 2.0 GPU at 17ms response time.
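For context on what those milliseconds mean in practice, a frame at 60Hz lasts about 16.7 ms, so the conversion to "frames of delay" is simple arithmetic (the 34 ms figure below is the OLED number mentioned earlier in the thread):

```python
# Express display input lag as frames of delay at a given refresh rate.
def lag_in_frames(lag_ms, refresh_hz):
    return lag_ms / (1000.0 / refresh_hz)

for lag_ms, hz in [(17, 60), (20, 60), (17, 120), (34, 60)]:
    print(f"{lag_ms} ms at {hz} Hz  ~= {lag_in_frames(lag_ms, hz):.1f} frame(s) behind the input")
```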

OLED and HDR are nice for TV/movies now, but they are in their early generation(s). Once they are more common and a 70" comes down a bit in price, I'll get one in 3 - 5 years (hopefully the short end of that span), and by that time we should have a lot more 4K and HDR content available. For gaming, 55" is too big for a desk monitor, and even if it were set back far enough or used in a living room for some gaming, I demand 120Hz+ and variable Hz in any primary PC gaming display I drop considerable money on now. Perhaps in the future OLED will be higher Hz (like the $4k Dell OLED monitor, but it lacks variable Hz) and/or could do screen-blanking tech vs. sample-and-hold blur too, but the brightness is low enough already, so I don't know how that would work.

blur reduction / motion clarity increase at 100 fps-Hz average:
0% <- 20% (~80 fps-Hz) <-- <<40% (100 fps-Hz)>> --> 50% (120 fps-Hz) -> 60% (144 fps-Hz)
and motion definition / path articulation / smoothness wise:
1:1 -- 1.x:1 (~80+ fps-Hz) <-- <<5:3 (100 fps-Hz)>> --> 2:1 (120 fps-Hz) -> 2.4:1 (144 fps-Hz)
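Those percentages are just the reduction in sample-and-hold persistence relative to a 60 fps-Hz baseline, and the ratios are frames delivered per 60Hz frame. A quick sketch of the arithmetic, assuming that baseline (small differences from the scale above are rounding):

```python
# Sample-and-hold persistence scales with 1/rate, so blur reduction vs. a 60 fps-Hz
# baseline is 1 - 60/rate; "motion definition" is simply rate/60.
baseline = 60
for rate in (80, 100, 120, 144):
    print(f"{rate:>3} fps-Hz: ~{1 - baseline / rate:.0%} less persistence blur, "
          f"{rate / baseline:.2f}:1 frames shown per 60 Hz frame")
```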


Hopefully in the next several years we'll get some true gaming OLED monitor options that check all of the boxes. High Hz + variable Hz, high rez, and low input lag on an OLED gaming screen would be great. Some larger sizes would be nice too, like a ~40" 4K for a desk (~108.8 PPI, the same as a 27" 1440p).

Once a 70" OLED TV with HDR is more reasonably priced and past the first generations, I'll eventually bite on that too, with a lot more 4K and HDR content to justify it. They really do look great for movies; I debated a 55" OLED, but the 70" FALD VA at that size and price was the best choice for me to carry through until things mature :) . Same with a good full-featured VA gaming monitor in 2017 for me, I hope. Eventually OLED will be ubiquitous and the clear choice for both my living-room 70" TV and a desktop gaming OLED full of all the modern gaming-monitor advancements plus OLED's performance capabilities, not one or the other.
 
The point is that a TV which can at least get up to the 1000-nit mark is nominally capable of handling the extremes of HDR10 without the aid of dynamic metadata, i.e. there's no need for dynamic mapping because the TV can accommodate the full signal that's being thrown at it, whereas something less well endowed nit-wise benefits far more from dynamic DV, because the signal is being properly mapped rather than being badly clipped like HDR10 is on some manufacturers' sets.

This is something that HDTVTest demonstrated in a recent article: HDR10 on an OLED had noticeably clipped highlights compared to the equivalent DV source material, but an LCD TV showing the same HDR10 source had far more nuance and was essentially equivalent to the DV version on the OLED in what could be seen in the specular highlights.
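A toy illustration of that clipping-vs-mapping difference: when content is mastered above the panel's peak, a static hard clip flattens everything past the limit, while a roll-off (which dynamic metadata lets the set place per scene) trades a slightly dimmer mid-highlight range for preserved gradation. The soft-knee curve below is a generic assumption, not Dolby Vision's or any vendor's actual algorithm:

```python
# Toy comparison: hard-clipping vs. rolling off highlights when content mastered
# to 4000 nits is shown on a 1000-nit panel. Generic linear soft-knee, purely illustrative.
display_peak = 1000.0
content_peak = 4000.0
knee = 600.0            # assumed point where the roll-off starts

def hard_clip(nits):
    return min(nits, display_peak)

def roll_off(nits):
    if nits <= knee:
        return nits
    # Compress knee..content_peak into the panel's remaining headroom above the knee.
    return knee + (display_peak - knee) * (nits - knee) / (content_peak - knee)

for highlight in (500, 800, 1500, 2500, 4000):
    print(f"{highlight:>4}-nit highlight -> clipped: {hard_clip(highlight):6.0f}   rolled off: {roll_off(highlight):6.0f}")
```

With the clip, the 1500, 2500 and 4000-nit highlights all come out identical; with the roll-off they stay distinct, at the cost of everything above the knee being shown a bit dimmer, which is also the trade-off behind the dimming mentioned below.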

Many have reported that while watching a UHD disc on an OLED set, the overall picture is dimmed to allow headroom for the highlights.

WHY do we need displays this bright? HDR seems like a complete gimmick to me. I do not want to see 1000-nit whites while I am watching anything. When do we actually benefit from HDR? When something is filmed under suboptimal lighting conditions, goes through post-processing, and ends up looking passable? Why don't we, you know, just film things with proper lighting? Kubrick had no issue filming half of Barry Lyndon with nothing but candlelight, but in 2016 apparently we cannot accomplish the same thing with better technology.

Dynamic range is horseshit. What we need is more displays that can display true black, and then "dynamic range" won't matter at all because contrast will be near-infinite. I guess when the industry is clinging to shitty LCD tech they need blindingly bright whites to make sure no one notices how shitty their black levels are, right?

I'm all for better, more accurate, deeper color reproduction. 10-bit is great. 1000 nits, though? The display industry can fuck off with that shit; we don't need it in our homes. Levels like that are only suitable for outdoor displays, so you will never see a display that bright in my house, because I don't see any difference between that and shining an LED flashlight in my eyes.
 
WHY do we need displays this bright? HDR seems like a complete gimmick to me. I do not want to see 1000-nit whites while I am watching anything. When do we actually benefit from HDR? When something is filmed under suboptimal lighting conditions, goes through post-processing, and ends up looking passable? Why don't we, you know, just film things with proper lighting? Kubrick had no issue filming half of Barry Lyndon with nothing but candlelight, but in 2016 apparently we cannot accomplish the same thing with better technology.

Dynamic range is horseshit. What we need is more displays that can display true black, and then "dynamic range" won't matter at all because contrast will be near-infinite. I guess when the industry is clinging to shitty LCD tech they need blindingly bright whites to make sure no one notices how shitty their black levels are, right?

I'm all for better, more accurate, deeper color reproduction. 10-bit is great. 1000 nits, though? The display industry can fuck off with that shit; we don't need it in our homes. Levels like that are only suitable for outdoor displays, so you will never see a display that bright in my house, because I don't see any difference between that and shining an LED flashlight in my eyes.

You'd be shocked to realise that HDR is meant for DARK ROOM VIEWING. I shit you not, HDR does not even work that well in a bright room. 1000 nits as a specular highlight with HDR is a very different effect from having the backlight blasting at your eyes at that brightness at constant strength.

An example from the HDR test clips: a scene with swans swimming in a pond while the sun is shining and reflected on the water. In SDR the picture is pretty but still flat, just like what we are used to now, and nowhere near how we would perceive it in real life if we were there. In HDR the picture is very much lifelike; the sun is actually glittering on the water and all.

There is nothing gimmicky about HDR; it is a true evolution of image fidelity, and the video world from professionals to home theater enthusiasts is raving about it for a reason. Don't let your ignorance about the matter blind you.

Oh, and 1000 nits is not that bright. A sunny day outside is much, much brighter.
 
With FALD off, my 70" 4K VA TV still gets up to a 5882:1 contrast ratio, compared to an F8500 plasma which gets 5172:1. That's a drop from 8157:1 on my TV with FALD on, but it's still good. Compare that to gaming VAs, which are almost always 3000:1 max (the Eizo was 4850 out of 5000, though). The input lag in game mode is 17ms to 20ms, and the response time is low for a TV, especially a VA TV, at around 10ms. I can also run native 120Hz input at 1080p for gaming if I want off an HDMI 2.0 GPU at 17ms response time.

Elvn, did you ever check 4K@60Hz 4:4:4 input lag numbers? Usually those are much different than 1080p/4K 4:2:0. I only game in 4:4:4 for maximum fidelity. I have not found any TV that can do a 4K 4:4:4 signal under 30 ms. As a matter of fact, I think the LG OLEDs are the fastest 4:4:4 TVs out there for PC gaming.
 
WHY do we need displays this bright? HDR seems like a complete gimmick to me. I do not want to see 1000-nit whites while I am watching anything. When do we actually benefit from HDR? When something is filmed under suboptimal lighting conditions, goes through post-processing, and ends up looking passable? Why don't we, you know, just film things with proper lighting? Kubrick had no issue filming half of Barry Lyndon with nothing but candlelight, but in 2016 apparently we cannot accomplish the same thing with better technology.

Dynamic range is horseshit. What we need is more displays that can display true black, and then "dynamic range" won't matter at all because contrast will be near-infinite. I guess when the industry is clinging to shitty LCD tech they need blindingly bright whites to make sure no one notices how shitty their black levels are, right?

I'm all for better, more accurate, deeper color reproduction. 10-bit is great. 1000 nits, though? The display industry can fuck off with that shit; we don't need it in our homes. Levels like that are only suitable for outdoor displays, so you will never see a display that bright in my house, because I don't see any difference between that and shining an LED flashlight in my eyes.

1000 nits is not that bright. It just brings what you see on TV a bit closer to what your eyes see in real life (still got a long way to go). Natural sunlight with a blue sky is 7000-10000 nits, and even a high overcast sky is 3000-7000 nits. Bright shop lights or a low overcast sky are still 1000-3000 nits. Having peak brightness of over 1000 nits is not that much. Those outdoor displays you mention go upward of 5000 nits.

HDR makes more of a difference than 4K; look around for the SDR/HDR comparisons. Actually, HDR is more of a game changer than 4K (unfortunately you can't get HDR without buying a 4K set).

As far as true black is concerned, you will not be able to see the difference between 0.014 and 0.00 black in anything other than a totally dark room.
 
Contrast? Black uniformity? Local dimming? Color gamut? I.e. the things that go into good PQ, both generally and in games. Those matter on more than a test pattern...

Yes, but can you tell the difference between 0.014 black and 0.00 black, uniformity of 0.541% vs 0.327%, or a Rec. 2020 color gamut of 66.82% vs 68.89% in practical use, other than on a test pattern? The only one that does matter is local dimming, but I'll trade that for the high brightness that makes my display usable in a well-lit room.

None of the current 4K solutions is perfect, so early adopters will have to pick what's most important for their own preferences.
 
Previously you said you had a KS9500, which was reviewed by RTings, hence part of my reply. It doesn't matter though, as the KS9800 still cannot compete with OLED.

I also find it funny that when talking about these TVs for console and PC gaming, as per the thread title, people conveniently forget that FALD input lag numbers are atrocious and FALD MUST be turned off for gaming modes. Your contrast numbers just went in the shit hole, as the band-aid fix doesn't work anymore.

OLED has perfect blacks, infinite contrast ratio, zero bloom, perfect HDR*, and zero pixel-transition blur/ghosting, all at 60Hz 4K 4:4:4 chroma with ~34ms input lag. No LCD can even get on the same planet as those specs. Really, the only thing LCD can claim to be better at is higher peak brightness.

*HDR gaming mode firmware is still being tweaked; it works perfectly for movies.


Sweet! I'll have to give it a try.


Yea, I only have the KS9500. I mistook the 9800's shipping weight for its true weight (which would be more than my current TV can handle). Doesn't matter, as this is just my first large-screen 4K (I did buy a 2015 40" JU7500 last year as a PC monitor), so there are 3 more TVs to go in the house, not counting my office. I'll wait and see what the 2017 model lineup brings, get that for the living room, and move the KS9500 to the family room (which currently has the 60" plasma from the living room). The bedroom will be last, as I hardly watch TV there.

You also did not mention the biggest problem with using an OLED for gaming: on any game where a portion of the screen is static (health bar, minimap, UI), image retention will start after even an hour of play and can last for a while unless you actually turn off the TV for a bit.
 
Yeah, my TV is not HDR either, but there is very little HDR content right now. I said a while back in the thread that my panel is not 4:4:4, though some of the Vizios pass 4:4:4 but show 4:2:2 (that is not the same as 4:2:0, obviously).
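For what those chroma labels mean in terms of raw data, here's the per-frame arithmetic at UHD (8 bits per sample, ignoring blanking and link overhead; just to show the relative chroma resolution):

```python
# Raw per-frame size at 3840x2160, 8 bits per sample, for the common chroma formats.
w, h = 3840, 2160
luma_samples = w * h
chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}   # per chroma plane, vs. luma
for fmt, frac in chroma_fraction.items():
    total = luma_samples * (1 + 2 * frac)          # Y plus Cb and Cr planes
    print(f"{fmt}: {1 + 2 * frac:.1f}x the luma-only data, ~{total / 1e6:.1f} MB per 8-bit frame")
```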

Game mode input lag:
1080p @ 60Hz: 19.8 ms
4K @ 60Hz: 16.9 ms
Not sure what it is at native 120Hz 1080p input, but probably similar.

I use it primarily for movies and occasionally a few slow adventure games or isometric games, and any difference is not noticeable at all. When I do use it for desktop browsing, 4:2:2 (or 4:4:4 passed to 4:2:2, or whatever Vizio does) looks fine to me from 8' away. I use a grey Windows 10 theme from DeviantArt and a grey color scheme on my third-party Directory Opus file manager. I also nearly always make bright white websites medium grey with browser add-ons, just like I do on any other monitor, if that matters. Hardforum is dark enough as it is, so I leave it at its default color scheme. I haven't run test photos or patterns (other than some pretty wallpaper slide shows) in any attempt to force some 4:4:4 vs 4:2:2 (or "4:4:4 passed") off-look to show up. At my usage and viewing distance I've never seen any issue so far.

"It doesn't support chroma 4:4:4 unfortunately at any resolution/refresh rate. This doesn't matter for movies or even video games, you will only see a difference for a PC monitor. 4:2:2 works though.
1080p @ 120fps works (although only for the 60" model and up), which is great for gaming on a PC."

OLED is indeed gorgeous for movies, no argument there, and I considered a 55" one vs. my 70" FALD VA at one point, but I chose the latter for my living room and I'm extremely happy with the 8157:1 FALD contrast level and black depth, and I like the native 120Hz-at-1080p capability. It's the best TV I've ever had by far and was a huge upgrade from a 46" edge-lit Samsung VA for "only" around $1800. In a few years, when 4K and HDR content is abundant and OLED has matured a bit, I'll check prices on a 70" OLED HDR TV and will be willing to drop more money on one than I spent on this. Waiting on OLED desktop monitors to be in full swing too, particularly a gaming one full of modern gaming technologies like 120Hz and higher, variable Hz, and low input lag (maybe even some kind of screen blanking with OLED's speed, though that might be a problem at the current brightness levels). Guessing OLED wouldn't need modern gaming overdrive like an LCD :b

For my main gaming at my desktop, I'm still using my 27" PG278Q 8-bit TN ROG Swift at 144Hz with G-Sync. I'm not willing to go back to 60Hz or even 100Hz, nor to drop variable Hz there, but I would love to add the .04 - .05 black depth of a Samsung VA instead of the .13 - .14 of IPS and TN screens. It seems like Samsung is only releasing DP 1.2 100Hz 21:9 3440x1440 VA monitors soon, but they will probably release DP 1.3 - 1.4 144Hz+ ones sometime in 2017. They just released a 1080p curved FreeSync VA and will release a 2560x1440 144Hz FreeSync VA eventually, and maybe a 4K 120Hz+ later on. I also read something on TFT Central about a 32:9 3840x1080 gaming panel, but the pixel height and overall size don't seem that great. I'll probably get a 35" 21:9 144Hz+ 3440x1440 variable-Hz Samsung VA whenever they come out, or a 144Hz+ 4K variable-Hz VA if/when they come out (playing 1440p scaled full screen or some rez in a window), to hold me over until full-featured OLED gaming monitors are on the market in numbers years down the line. I opted out of the first-gen VR kits too, but I'm keeping an eye on that tech as well. It's just a matter of time before I ditch LCD entirely, but VA is the best choice for me until the time is right.
 
I have a B6 OLED TV; once you see how amazing the blacks are, you cannot go back to any LCD. So what if an LCD gets brighter? Watch a Batman movie and see which one looks better when what is supposed to be black is 100% black through the whole screen. Also, off-angle viewing is a big issue in most people's places, and OLED wins by a mile vs. any LCD.
 