Keeping them off the network entirely is the best way with these ad-ridden spyware TVs. Can't even turn the ads off on mine.
Even if I want to upgrade the firmware, it gets the USB stick treatment.
This sort of concern, even over something some would consider not worth worrying about, is precisely why I won't use current self-emissive tech on the desktop.
Gorgeous for entertainment purposes, undeniable. But for all other purposes I really don't want that constant thought of burn-in holding me...
Sounds like an awesome setup already.
But yeah, completely get the desire to reduce the number of bezels, preferably none. I've been on 43-50" 4k for a couple of years now and don't currently feel the need (or have the space o_O) for anything more.
I'd maybe go down in size and up in DPI to...
That RTINGS 4:4:4 test image isn't being shown at 1:1. It's definitely got some scaling on it, and even if it's integer scaling, some areas suggest it's not playing ball with 4:4:4.
Diagonal pixels in particular look mangled.
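For anyone wondering why scaling wrecks that kind of test image, here's a rough pure-Python sketch. The pattern and numbers are made up for illustration (not RTINGS's actual image); the point is just that 4:4:4 tests rely on single-pixel detail, which only survives integer scaling:

```python
# Hypothetical 4:4:4-style pattern: alternating black/white columns,
# each one pixel wide. Real chroma tests work the same way.
pattern = [0, 255] * 8  # 16 one-pixel-wide columns

def nearest_integer_scale(row, factor):
    """Integer scaling: every source pixel repeated 'factor' times."""
    return [v for v in row for _ in range(factor)]

def linear_resample(row, out_len):
    """Non-integer scaling: output samples land between source pixels,
    so adjacent values get blended and 1px detail turns grey."""
    out = []
    for i in range(out_len):
        x = i * (len(row) - 1) / (out_len - 1)
        lo = int(x)
        hi = min(lo + 1, len(row) - 1)
        t = x - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

doubled = nearest_integer_scale(pattern, 2)  # 2x: values survive intact
stretched = linear_resample(pattern, 24)     # 1.5x: blended greys appear

print(set(doubled))                          # {0, 255}
print(any(0 < v < 255 for v in stretched))   # True
```

Same idea for diagonals, just in two dimensions, which is why they end up looking the most mangled.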
Edit: There's a real lack of understanding that this test...
Is that involving "AI" generated frames? Call me a purist but I don't want to see that anywhere outside of videos and gaming, and even then it can go too far and introduce lag and artifacting.
Kinda hoping you mean something else. :D
I'm plenty happy to see advancements in video and games upscaling, but the desktop? Not really. I thrive on pixel level accuracy there, I don't want "AI" making weird decisions about where my pixels end up.
Heh, true. It has to end somewhere though. Each quadrupling of pixel count will get harder to justify. 16k? 32k?
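Quick back-of-the-envelope on that quadrupling point, using the standard resolution figures (nothing from this thread, just arithmetic):

```python
# Each step up the naming ladder doubles both dimensions, so the
# pixel count (and the GPU load to drive it) roughly quadruples.
resolutions = {
    "1080p": (1920, 1080),
    "4k":    (3840, 2160),
    "8k":    (7680, 4320),
    "16k":   (15360, 8640),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")
# 1080p: 2.1, 4k: 8.3, 8k: 33.2, 16k: 132.7
```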
4k at this point hits that "high end" sweet spot for me. 8k is overkill and would potentially harm rather than help. Maybe I'll reconsider once display tech has moved on a little...
Back on the topic of 8k: if you look up RTINGS reviews of 8k TVs and their same-gen 4k counterparts, the 4k panels have better specs (contrast, response).
For both PC use (which would either require scaling or leave the desktop overwhelmingly large, plus 60Hz only) and TV/console use (too distant, 4k is...
And lack of burn in. People can say what they want about mitigation, I'm not interested in babying a display and burn in is the sole reason I refuse to use OLED as a PC monitor.
It's a logarithmic curve though, is it not? The visible difference between 2000 and 10000 wouldn't be as big as the absolute value would suggest.
Not to say a display that can pull it off wouldn't be great to have. I'd still like to see progress on the lifespan of individual LEDs/OLEDs before...
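To put the logarithmic point in numbers, assuming perception tracks roughly the log of luminance, photographic-stops style (a simplification, not a full perceptual model):

```python
import math

# If perceived brightness scales roughly with the logarithm of
# luminance, the meaningful comparison is the ratio, not the difference.
def stops_between(nits_a, nits_b):
    """Photographic 'stops' between two luminance levels (log2 of ratio)."""
    return math.log2(nits_b / nits_a)

# 2000 -> 10000 nits is a 5x ratio, but only ~2.3 stops...
print(round(stops_between(2000, 10000), 2))  # 2.32
# ...about the same perceptual jump as 100 -> 500 nits:
print(round(stops_between(100, 500), 2))     # 2.32
```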
Things I've learned...
OLED is a product of the devil.
LCD is pure divinity.
OLED has undefeatable black crush.
LCD has better black levels than OLED.
OLED is wholly incapable of HDR.
LCD HDR is so good it can make you better at games.
LG's OLED TVs are nothing more than office monitors...
Except he's trying to say it doesn't matter if you're into competitive gaming or not. We all NEED gaming monitors even if we don't require one. Why?
And the claim that HDR helps you be better at games is ridiculous.
As usual, nothing matters but your opinion huh.
"I'm not into competitive gaming."
"But you don't have a gaming monitor, you're doing it wrong!"
My experience with "gaming" monitors has been dire. They all had annoying compromises. Every single one.
Not to say TVs don't have compromises, of...
Well I'm not interested in playing competitively, but if I ever was, is there any real reason why a display with such a low input lag would be bad for it? Neither the fact that it's a "TV" nor its HDR capabilities makes any difference there.
I have plenty of reason to believe you don't have the means to...
I don't care about "ranking high". I don't play competitive multiplayer games. I prefer single player games.
Besides, RTINGS tells me my display has a 5.3ms input lag at 4k 120Hz. That's pretty good and not noticeable at all to me when gaming.
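For context on why 5.3ms isn't noticeable: it's well under a single frame at 120Hz. Quick arithmetic (the 5.3ms figure is just the one quoted above):

```python
# At 120 Hz a whole frame takes ~8.33 ms, so 5.3 ms of input lag
# means the display adds well under one frame of delay.
refresh_hz = 120
input_lag_ms = 5.3           # figure quoted from the RTINGS review

frame_time_ms = 1000 / refresh_hz
lag_in_frames = input_lag_ms / frame_time_ms

print(round(frame_time_ms, 2))  # 8.33
print(round(lag_in_frames, 2))  # 0.64
```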
Nice of you to be so obsessed with me that you...
Still don't know why this is necessary.
It's not incompetent, it's just not perfect. No HDR implementation is. I see no reason to upgrade a flagship TV that is only a couple of years old just because of a few dark scene local dimming issues.
Hahaha what a load of bull.
Today I learned HDR...
Good surround sound requires a lot more from your environment than HDR displays do, to be fair. I've dabbled with surround setups, but as a music producer I'm rather attached to simple, good quality stereo rigs for mixing, and I have no issue whatsoever with playing games and watching TV in this...
If a TV in the UK is spec'd for 50Hz it just means it'll accept the UK 50Hz broadcasts. Every TV I've seen here with 50 on the spec sheet can do at least 60 in reality. The only real 50Hz TVs I've ever had were ancient PAL CRTs.
It's incredibly misleading and confusing but it's safe to ignore...
Yep, the depth of these screens would be an issue for me definitely.
Some work also requires straight lines. A big reason I have avoided curved panels so far. I prefer the edges of the screen to not be pointing directly at me if it means my work isn't being distorted by a curve.
Some of these...
Why put "flaw" in quotes? It's a known, documented hardware flaw pointed out by several users and professional reviewers whenever any 43" LCD comes up. It isn't going anywhere.
Feels like another attempt to downplay the issue.
Every other LCD panel I've ever used, including the one I'm using to type this message, can display the signal sent to it without each row of pixels bleeding into the one below it.
Current AUO 43" panels cannot.
I've already said it's obvious that some can live with it but at this point no one...
There are reviews of the QN90B out there that prove the pixel row bleeding is still a thing. Your video isn't close or sharp enough to see it unfortunately. I could probably get a similar result filming the same spreadsheet with my phone on the FV43U.
It's just one of those things many people...
Interesting, as these are technically the same (always flawed) panel. It's a real shame but I've not yet seen reports of a properly fixed 43".
I don't blame the OEMs. AUO needs to sort their tech out.
And yet you want panel-based AI upscaling to compensate for lack of GPU power so 8k becomes viable?
For videos and games I can accept it. For desktop use, sorry, nope. Big nope.
I see the argument of "I remember a decade ago x resolution was considered ridiculous" but this becomes less relevant with each resolution increase.
1080p to 4k makes sense in a lot of scenarios. 4k to 8k really doesn't. And we can't keep making the same argument - 8k to 16k, 16k to 32k? Sounds...
I'm juuuust a bit close to the screen (no choice) for 50" to be sensible but I can deal. It's immersive for sure!
43" was more practical but I'm not going back to the headache that is the pixel bleeding, making text look dreadful at the top of the screen and playing havoc with my workflow.
I wish I could go down to 43". Well... I wish the flaws with the 43" panels didn't bug me as much.
I can deal with 50, have been since giving up on the FV43U, but I'm holding out for a major panel revision before going back down. Size aside this QN94A is better in so many ways for my use than...
Still don't think 120Hz should be a luxury consideration. I was sat at 85Hz in the 90s with my old CRT and that wasn't especially high end.
It's eternally disappointing that LCDs regressed to 60Hz and are only recently making a proper push higher. At this point it should be no more expensive...