42" OLED MASTER THREAD

Beyond 2m (6.6ft) without fiber, cable quality starts to matter a lot more. You are way more likely to get blinks at 10+ ft if it's not a great cable and not an active one, so do a lot of checking before you buy a long passive cable.

Yep, but like you said there are also expensive fiber HDMI and even USB-C cables that come in 15', 25', 50', 75', 100', 150', though it's probably pushing it past 50-100'. Copper is severely limited by comparison.

. . . . . .

This HDMI one below is 50' and around $75 USD, but you can get ones that are a lot shorter, like a 15' one for $50, or longer (though if you go really long, over 100', you might have to use a power injector, and it probably wouldn't be optimal for gaming at that point).

Ablink Certified 8K HDMI Cable 2.1 48Gbps 50FT, Ultra High Speed Fiber Optic HDMI Cable for HDR HDCP2.3 eARC 8K60Hz, 4K 120Hz Compatible with PC HDTV Projector, Xbox Series, in Wall CL3 Rated


https://www.amazon.com/Ablink-48Gbps-Support-Compatible-Switch/dp/B08VNRJNRZ
MFR
5.0 out of 5 stars · Verified Purchase
Works great at 4K 120Hz
Reviewed in the United States on April 5, 2022
Style: Fiber optic HDMI 2.1 cable | Size: Certified 50ft 8K
This cable lets me remotely use my gaming PC on my Sony TV from 50' away. Syncs fine at 4K/120Hz with HDR. I tried a couple of different fiber optic cables before this (one cheaper and one much more expensive), but despite claiming 4K120 support, they would not sync with my TV. Ordered this cable and it worked right away.
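For anyone wondering why these long runs need a 48Gbps-class HDMI 2.1 cable in the first place, here's a rough back-of-the-envelope in Python. The 4400x2250 total timing is an assumption (a common blanking figure for 4K HDMI modes), so treat the numbers as ballpark, not a spec quote:

```python
# Rough sanity check on why 4K 120Hz HDR wants an HDMI 2.1 class (48 Gbps FRL) link.
# The 4400x2250 total timing below is an assumption; exact figures vary by timing
# standard, so this is ballpark only.

h_total, v_total = 4400, 2250      # active 3840x2160 plus assumed blanking
refresh_hz       = 120
bits_per_pixel   = 3 * 10          # RGB 4:4:4 at 10-bit (HDR)

pixel_rate = h_total * v_total * refresh_hz            # pixels per second
raw_gbps   = pixel_rate * bits_per_pixel / 1e9         # uncompressed video data
frl_gbps   = raw_gbps * 18 / 16                        # FRL 16b/18b line-coding overhead

print(f"raw video  : {raw_gbps:5.1f} Gbps")            # ~35.6 Gbps
print(f"on the wire: {frl_gbps:5.1f} Gbps")            # ~40.1 Gbps, needs the 48 Gbps tier
print("HDMI 2.0 tops out around 18 Gbps, so 4K120 10-bit RGB is HDMI 2.1 territory.")
```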
 
In my experience, game mode on vs. off is a huge difference, and on top of that, turning "Boost" mode on or off also makes a big difference. I tested this in multiple games just by panning around. It's really noticeable, unfortunately.
Compared to what? It used to be, at least, that running in PC mode disabled almost all if not all image processing, which is usually what causes input lag. I have never noticed much of a difference between game mode and PC mode when I've tried this before.

Boost mode AFAIK only works at 60Hz, which is probably more relevant to console gamers than PC gamers.
 
Boost mode AFAIK only works at 60Hz, which is probably more relevant to console gamers than PC gamers.

Here's the way I think it works; feel free to give me better details if anyone has them. :D

I believe boost mode is doubling 60Hz material, just like 120Hz TVs run lower frame rate media content at 120Hz, by using the higher refresh rate capability to redraw and show the same frame more than once. I always thought boost was frame doubling to make it a little less laggy compared to 60Hz input lag and its 16.7ms draw per frame. Running a PC game at high fpsHz you wouldn't need that. Frame doubling just repeats the same frames, so you aren't getting any newer, more unique frames of action added. Frame doubling/cloning is in contrast to frame generation, which uses AI to make a best guess at a "tween" frame halfway between two different frames of action, between two different world states, FoV movements, etc. In my opinion you are better off using frame generation and adjusting your settings to get high unique frames per second than running low fpsHz with boost dupes.

The frames you see with frame gen are best guesses of new information/states that you can actually act on, rather than acting on the same repeated frame of action, so even if the input lag were the same, you wouldn't be seeing newer frames with boost. The same goes for running higher fpsHz without frame gen, perhaps by dialing settings down some. Frame gen will also look smoother than boost aesthetically (barring any possible frame gen edge artifacts), since the difference in states/action is changing ~ morphing rather than being static and repeated. It's like flipping an animated picture book's pages and adding twice as many pages with every other page being an exact duplicate, compared to someone drawing new in-between states on every other page instead. Some console games are 60Hz limited though, so it can be useful for those to cut down lag vs a 60fpsHz cap. Both methods lower input lag, probably lower sample-and-hold blur a bit, and perhaps reduce VRR black flicker, etc., benefits that higher Hz might give regardless. More unique frames is better though. You can't act on what you haven't seen yet (the next actual changed frame/world action state rather than a repeated one), and changes, animation cycles, more dots per dotted line curve/path, etc. all look better when delivered at higher motion definition. Using boost you see the same frame up to twice as long as someone with higher fpsHz via frame gen.

I've seen various reports of what the input lag is from leaving boost on when playing higher fpsHz games on a 120Hz gaming TV, so idk if leaving it on would hurt performance (some say the input lag is the same at 120Hz gaming regardless), whether it would conflict with frame gen doing its thing or cause more artifacting, etc., or whether boost has any strobing/flickering effect.

Although boost lowers input lag values (especially for 60Hz-capped material), that lag is lowered after you've decided to react and input via your peripherals. I have suspicions about the actual overall reaction time benefit, since you are seeing the same exact frame twice for an 8.33ms + 8.33ms = 16.7ms period, compared to someone running the same 120fpsHz gained via frame gen or by adjusting their game settings, who sees a newer frame/action state every 8.33ms. An animation flip book with twice as many pages can let you react twice as fast page-count-wise, but to the old frame's information shown on twice as many pages. Not everyone is hitting 120fpsHz minimum on the high-fps end of the equation either, so it can be a VRR roller coaster in a lot of games. All of those factors complicate the comparison a bit.

The nature of online gaming server dynamics probably muddies the waters for online competitive play in regard to seeing newer frames of world states delivered to your screen up to ~8ms sooner/later and then reacting 150ms to 180ms later (human reaction time), passed through the machinations of ping times and server-side interpolation and its biases before a result is returned . . . but the motion would still look better aesthetically on your local end at higher fpsHz (raw or via frame gen). Locally (LAN gaming, single player gaming) you'd probably get more of a performance advantage at the higher unique frame rates compared to dupes.
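To make the timing argument above concrete, here's a tiny toy calculation (idealized, ignoring render and display pipeline latency) of how often a visually new frame can reach your eyes in each case:

```python
# Toy comparison of how often something visually *new* can land on screen at a
# 120Hz refresh, depending on how the frames are produced. Idealized numbers only.

REFRESH_MS = 1000 / 120            # 8.33 ms per refresh slot at 120Hz

modes = {
    "60fps + boost (frame doubling)": 2 * REFRESH_MS,  # same frame twice -> new info every ~16.7 ms
    "60fps + frame generation":       REFRESH_MS,      # 'tween' frame is new (guessed) info every ~8.3 ms
    "native/settings-tuned 120fps":   REFRESH_MS,      # every refresh is a genuinely new game state
}

for name, interval in modes.items():
    print(f"{name:32s}: something new on screen every {interval:4.1f} ms")
```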
 
Simple frame doubling actually increases input lag. Some monitors do this automatically. You can see it in RTings reviews, such as this Corsair OLED:
https://www.rtings.com/monitor/reviews/corsair/xeneon-27qhd240

On an LCD, frame doubling does give you a panel response time benefit, as if you were actually running the game at that framerate. So motion looks clearer at lower framerates, because the panel can use the overdrive tuning appropriate for twice the actual framerate.

However, you can run 60Hz in a 120Hz 'container', so to speak, and that offers latency improvements. Street Fighter 6 does this on console: it runs the animations and visuals at 60Hz, but player inputs are sampled at 120Hz with Vsync turned off. That helps improve input latency even more, but you need VRR to keep it from tearing.

I think Tekken 8 does the same thing; it seems to, based on the demo. If I run borderless window with the desktop refresh set to 60Hz and turn off Vsync in game, I get tearing, even with VRR turned on. But if I set the desktop refresh higher than 60 with VRR on, no tearing.
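As a rough illustration of that "60Hz visuals in a higher-Hz container" idea, here's a minimal sketch of a loop that advances visuals at 60Hz while sampling input on a faster tick. This is just the general pattern, not how SF6 or Tekken 8 actually implement it:

```python
# Minimal sketch: simulation/visuals step at 60Hz, input is sampled on a faster
# 120Hz tick, so a button press is picked up closer to when it actually happened.
# Illustrative only, under the assumptions stated above.

RENDER_DT = 1 / 60        # visuals / animation step
INPUT_DT  = 1 / 120       # faster input sampling tick

input_samples = rendered_frames = 0
t, next_render = 0.0, 0.0

while t < 1.0:                     # simulate one second
    input_samples += 1             # input is read on every fast tick
    if t >= next_render:           # ...but visuals only advance at 60Hz
        rendered_frames += 1
        next_render += RENDER_DT
    t += INPUT_DT

print(input_samples, rendered_frames)   # ~120 input samples vs ~60 rendered frames
```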
 
I can understand that, but what I was getting at is the comparison to 120Hz gaming (optimally 120fpsHz). Your twitch human reaction lands 150-180ms after a stimulus on PC, so if that, say, 180ms reaction ends up landing on the boosted (duplicated) frame when playing a 60fpsHz-limited game boosted to 120Hz, I can see how your input could register sooner than if you instead had to wait out the full or bulk of the remaining 16.7ms, even though you are being shown the same 8.3ms + 8.3ms frame with no changes.

However, on the front end of the equation, you won't see the thing you're reacting to 8.3ms sooner in the first place (e.g. 60fpsHz shown at 120Hz via boost/dupe/a 120Hz refresh 'container'), since you are seeing the same frame twice (8.3ms + the same 8.3ms frame), compared to someone who gets a new, unique frame in every one of those "second" 8.3ms slots at a solid 120fpsHz (for example's sake). So, theoretically, you could both react to the same event, but the 120fpsHz person could have seen it 8.3ms sooner than you did. Your inputs could take exactly the same time to resolve from the moment they were initiated, but even though you can use foresight, intuition and knowledge of how things pan out in games, you typically can't react to what you haven't seen yet, so visually you wouldn't be able to start reacting until 8.3ms later because you haven't seen the new unique frame state yet.

Like I said though, ping times and online servers' interpolation dynamics will muddy how often you see a new frame in online games to begin with, and the server's results have their own coding biases. Local gaming would probably show a more measurable gain/loss.

Doubling the frame can look a little smoother in effect, but without a new unique frame (even an in-between one manufactured with AI frame generation) it isn't adding any motion definition. It's the same cel of a running animation shown twice, where 120fpsHz shows the runner in a state between that frame and the third. So the 120fpsHz view is much smoother: twice as articulated, twice as many dots per dotted line path/curve so to speak, where 60fpsHz has half the plot points.
60fpsHz at 120 would still cut the sample-and-hold blur by half though, and might help cut down on the variance from 120Hz gamma-state VRR black flicker in some games.

So all things considered it seems like a good improvement for 60fpsHz-capped games, but when comparing it to 120fpsHz performance-wise, even if the time after you initiate your input is the same length, I think getting an 8.3ms-sooner look at how the world changed could come into play in how fast you are able to react in the first place overall (at least for local games).
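Putting rough numbers on that (idealized, ignoring render/display lag, and assuming a ~180ms twitch reaction, which is just a round figure):

```python
# Worst case, how long after an in-game event can you even *start* reacting?
# Idealized arithmetic only; the 180 ms reaction time is an assumed round number.

REACTION_MS = 180

cases = {
    "60fpsHz + boost (dupes)": 1000 / 60,    # new state hits the screen up to ~16.7 ms later
    "solid 120fpsHz":          1000 / 120,   # new state hits the screen up to ~8.3 ms later
}

for name, wait_ms in cases.items():
    print(f"{name:24s}: visible <= {wait_ms:4.1f} ms after the event, "
          f"reaction starts <= {wait_ms + REACTION_MS:5.1f} ms after it")
```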
 
Compared to what? It used to be, at least, that running in PC mode disabled almost all if not all image processing, which is usually what causes input lag. I have never noticed much of a difference between game mode and PC mode when I've tried this before.

Boost mode AFAIK only works at 60Hz, which is probably more relevant to console gamers than PC gamers.

I mean... just compared to it off. I tried it out in Red Dead 2, and simply panning around I noticed a massive difference in how smooth it was; it was immediately obvious. Non-game mode > game mode > game mode + boost. I did all of these in the same spot in the same game, just panned around a bit and ran around a bit with each. Instantly noticeable for me, though I don't know why. I'm not sure if it was all input lag, but the motion clarity felt better for some reason as well, even though it technically should be the same. After trying it out, I couldn't turn it off. I don't know what it does or what magic is there, but I notice it immediately. Unfortunately, I'm still waiting for the day when we have the motion clarity of a CRT, as that's the first thing I gamed on and I think I played a lot better on them lol.
Just to be clear, the gaming TV's OSD brightness setting is a separate thing that you keep at max for content like games and movies. You don't have to keep your desktop as bright as your OSD peaks. I'm pretty sure most of us keeping Windows HDR on are using the HDR/SDR brightness slider in Windows' HDR settings at a very low setting for desktop/app use. Games use their own metadata/curve with the full brightness range, separate from that slider, and it kicks in for the game. The idea isn't to run the desktop/apps at 100% HDR brightness or even 80%, but rather to set this slider much lower in Windows settings. That makes the desktop a lot dimmer compared to the peaks set in the TV's OSD.

Still, it wouldn't reduce the full HDR volume ~ color heights available once you run games and movies with the OSD at full HDR ranges, but I suspect reading by candlelight and then stepping out into the sunlight over and over could cause a contrast issue with pupil dilation ~ eyesight adjusting to conditions. Remember though that even with the full HDR range of a particular screen, the highest-nit colors are mostly isolated to highlights, light sources and bright reflections in smaller parts of the scene, rather than the full field of the screen being bright all the time. (y)
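For anyone curious what that slider actually maps to under the hood: my understanding is that Windows tracks an "SDR white level" per display, where a value of 1000 corresponds to 80 nits, and the slider raises that level. Treat the conversion below as an approximation of the documented scaling, not gospel; the TV's HDR tone curve for games/movies is unaffected by it:

```python
# Approximate conversion from Windows' SDR white level value to nits
# (assumption based on my reading of the scaling: 1000 == 80 nits).

def sdr_white_nits(sdr_white_level: int) -> float:
    """Convert an SDRWhiteLevel value (1000 == 80 nits) to nits."""
    return sdr_white_level / 1000 * 80

for level in (1000, 1500, 3000):
    print(f"SDRWhiteLevel {level} -> {sdr_white_nits(level):5.1f} nits for desktop white")
# A low slider setting keeps the desktop around 80-120 nits while HDR games still
# use the panel's full range via their own metadata/curve.
```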


Does this actually change much? The slider seems to follow some strange, nonlinear curve. The images occasionally exchange brightness and it's weird.

I also had no issues with maxed HDR brightness today. I'm not sure why I have some days that are okay and some that are not, seems to be very on and off. Probably allergy or sleep related, or maybe both. The other thing I noticed is that Empyrion is a bit of a weird game because it takes place out in space, where you have nothing but darkness and black, and then suddenly huge sunlight bursts when you pan over to them... which again were still fine yesterday, though.



Anyway, I had to buy another HDMI cable. It's very occasional, but with this one I keep getting times when the bottom of the screen flickers green and then I have to unplug and plug it back in to get rid of that, mostly while in desktop. That tells me that at 10 feet this cable is simply inadequate. I'm going to try out the Club3D 9.8FT cable and return this one.

After trying out Hogwarts Legacy more and more on this TV, I have to say that it's going to be really hard going back to anything else. The game just looks really good on it, and blown up to 42" size at close range, it's pretty immersive. It would be a lot more immersive if this had a curve, and that's a huge pain point, but unfortunately the 45" OLED monitors just have that frustratingly aggressive matte coating. The coating on this TV, while getting the occasional reflection, just makes the picture look so much clearer when looking up at the sky and stuff. I'm still trying to figure out how the hell I'm going to get a monitor up above it to use as the second monitor, though. It's so tall that no monitor arm I know of will reach that high. I think the easiest solution is if I could find a longer pole to mount the monitor onto. I'll have to try more games on this, but I think I'll probably keep it. Just a question of whether I want to exchange to try to get rid of this dead pixel (while risking another one on the new one) or not, but I've got 90 days to figure that out I suppose.
 
Still no MLA on the C4 😂
 

Beyond 2m (6.6ft) without fiber, cable quality starts to matter a lot more. You are way more likely to get blinks at 10+ ft if it's not a great cable and not an active one, so do a lot of checking before you buy a long passive cable.
Absolutely. I went through a whole pile of cables when I wanted to run an 8-10m cable from the computer to the living room TV in my last apartment. Even some supposedly active ones just had weird issues. In the end the cable that worked was some overpriced AudioQuest HDMI 2.0 cable. With HDMI 2.1 you would be better off going for fiber optic cables for anything longer than 3m.

The issues I saw were stuff like:
  • Resolutions, refresh rates or color spaces going missing randomly; it was not always even the highest ones, which was extra weird.
  • Blinking.
  • No image at all without dropping resolution or refresh rate.
  • HDR not working, either not available or resulting in a blank image.
 
Still no MLA on the C4 😂
Overall the whole TV lineup is just another disappointment with no truly relevant improvements over last year. It's really strange: LG is supposed to release 4K 240Hz monitors this year, so why not have that tech in the TV range as well?
 
Overall the whole TV lineup is just another disappointment with no truly relevant improvements over last year. It's really strange: LG is supposed to release 4K 240Hz monitors this year, so why not have that tech in the TV range as well?
Maybe because of price, as I am guessing that MLA might drive up prices? I have not really made any serious comparisons, but I have a feeling that the C series usually offers the most bang for the buck, even compared to LCD monitors. Perhaps LG also does not have enough manufacturing capacity to go MLA everywhere. What I actually miss more is G-series TVs in smaller sizes.
 
Maybe because of price, as I am guessing that MLA might drive up prices? I have not really made any serious comparisons, but I have a feeling that the C series usually offers the most bang for the buck, even compared to LCD monitors. Perhaps LG also does not have enough manufacturing capacity to go MLA everywhere. What I actually miss more is G-series TVs in smaller sizes.
The C-series is pretty expensive on release and only becomes good bang for the buck by the end of the year, as prices drop and there are sales.

The real bang for the buck deal has been for those of us who bought a CX-C2, as there's still no truly better option in LG's lineup unless you are willing to go for the larger, more expensive G-series.

MLA might be a manufacturing capability thing and to have some product line differentiation.
 
Don't expect huge advancements on TVs; every upgrade from now on will be marginal. That is due to how fast OLED monitors are evolving; TVs and monitors are once again becoming more distinct.
Before OLED monitors, the only way to get the OLED experience on a desktop was to look for smaller TVs that perform well, which is why the CX, C1 and C2 were very popular choices for PC gaming.
But now that OLED monitors are becoming more widespread and offer objectively superior performance numbers, it's time to reexamine priorities and analyze purchasing decisions more carefully. It all comes down to "Do you want a big screen or high performance?" Both won't be possible anymore.
 
Yeah, it was another year of stagnant progression for the C series. I wouldn't mind it so much if you could get the G series features in a 42" version, but no.
The feature I'm most interested in (but can't seem to find much info on yet) is the rumour that LG will use a new RGWB subpixel layout, which will hopefully result in cleaner text.
 
"Do you want a big screen or high+ER performance?". Both wont be possible anymore.

Yes, eventually DP 2.1 displays with the full 80 Gbps could cause a divide. HDMI 2.1's capabilities, harnessed by LG, vs DP 1.4 was probably the biggest thing levelling the playing field. It might not happen again until the next update to the HDMI spec and ports.

Still, there have been large "monitors" with DisplayPort. Version 2 of the 55 inch 4K Samsung Ark has a DP port on it, so future large DP 2.1 displays might still show up in a few models at some point.

The big gains will be Hz, but eventually I'm hoping for 8K resolution too, plus potentially brighter and longer-sustained HDR ranges on OLED. 8K will demand a lot of bandwidth, especially at higher Hz, but it can use DSC to help.
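Rough math on why 8K leans on DSC even with the newest links. The figures below are assumptions, not spec quotes: 10-bit RGB, ~10% added for blanking, and roughly 3:1 DSC compression:

```python
# Ballpark bandwidth for 8K at various refresh rates, with and without DSC.
# Assumed: 10-bit RGB, ~10% blanking overhead, ~3:1 DSC. Ballpark only.

ACTIVE_PIXELS = 7680 * 4320
BITS_PER_PX   = 3 * 10
BLANKING      = 1.10
DSC_RATIO     = 3.0

for hz in (60, 120):
    raw_gbps = ACTIVE_PIXELS * BITS_PER_PX * hz * BLANKING / 1e9
    dsc_gbps = raw_gbps / DSC_RATIO
    print(f"8K{hz:>3}: ~{raw_gbps:5.0f} Gbps uncompressed, ~{dsc_gbps:4.0f} Gbps with DSC")

# For reference, HDMI 2.1 FRL tops out at 48 Gbps and DP 2.1 UHBR20 at 80 Gbps,
# so high-Hz 8K really only fits once DSC is in the picture.
```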

I can adapt, but I'd consider at least a 32-inch 16:9's height as my minimum. I'd prefer a larger curved screen like the Ark but in 8K, and glossy OLED, but I don't see that happening any time soon. I'd opt for a good FALD one, but I do hate an abraded matte layer.

In the years beyond, I'm hoping MR and XR get to a more advanced stage with much higher resolution, to the point where they can truly replace a high-rez multi-monitor setup, and they will probably have 3D holographic gaming scenes, sets and beings at some scale composited into your real-world setting too. That kind of thing will be a huge advancement; higher Hz and rez are just incremental improvements on the same flat games. Smartphones have also hit a wall.
 
IMO, all this talk about HDR brightness is a bit of a bandaid when virtually no PC monitors even support Dolby Vision or HDR10+, that I know of (both offer per-scene/per-frame dynamic metadata, plus the ability to poll a display's capabilities and adjust the tone mapping of the HDR output so it more closely matches the creator's intent).

PC monitors all seem to only support HDR10, which relies on a couple of static parameters meant to somehow work for an entire piece of content. Those parameters can be messed with/messed up by a user, and any tone mapping is factory set, one size fits all. Asus usually gets praised for the best 27 and 42 inch 16:9 OLED monitors, but their tone mapping sucked, and it took them until around October to release a firmware that made it a lot better.
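Just to illustrate the structural difference being described, here's a conceptual sketch. The field groupings are illustrative only (loosely modeled on HDR10's MaxCLL/MaxFALL and mastering display values), not any actual API:

```python
# Conceptual sketch: HDR10 carries one static set of values for the whole title,
# while DV / HDR10+ carry metadata refreshed per scene or per frame so the
# display's tone mapping can adapt. Illustrative field names only.

from dataclasses import dataclass

@dataclass
class StaticHDR10Metadata:
    max_mastering_luminance_nits: float   # mastering display peak, e.g. 1000 or 4000
    min_mastering_luminance_nits: float
    max_cll_nits: float                   # brightest single pixel in the whole title
    max_fall_nits: float                  # brightest frame-average in the whole title

@dataclass
class DynamicSceneMetadata:
    scene_start_frame: int
    scene_peak_nits: float                # lets the display retarget tone mapping
    scene_avg_nits: float                 # per scene instead of one-size-fits-all

title_static = StaticHDR10Metadata(4000, 0.005, 1600, 400)
per_scene = [DynamicSceneMetadata(0, 120, 35),       # dim interior scene
             DynamicSceneMetadata(1450, 3200, 600)]  # bright outdoor scene
```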
 
IMO, all this talk about HDR brightness is a bit of a bandaid when virtually no PC monitors even support Dolby Vision or HDR10+, that I know of (both offer per-scene/per-frame dynamic metadata, plus the ability to poll a display's capabilities and adjust the tone mapping of the HDR output so it more closely matches the creator's intent).

PC monitors all seem to only support HDR10, which relies on a couple of static parameters meant to somehow work for an entire piece of content. Those parameters can be messed with/messed up by a user, and any tone mapping is factory set, one size fits all. Asus usually gets praised for the best 27 and 42 inch 16:9 OLED monitors, but their tone mapping sucked, and it took them until around October to release a firmware that made it a lot better.

The LG CX supported DV and it seemed pretty useless, though maybe that was the games' fault. Not sure if HDR10+ would be any better.
 
The C-series is pretty expensive on release and only becomes good bang for the buck by the end of the year, as prices drop and there are sales.

The real bang for the buck deal has been for those of us who bought a CX-C2, as there's still no truly better option in LG's lineup unless you are willing to go for the larger, more expensive G-series.

MLA might be a manufacturing capability thing and to have some product line differentiation.
Weirdly enough, you seem to both agree and disagree about the C-series being bang for the buck :) Most products are more expensive when they have just been launched than later on; not sure why the C-series should be different.
 
Weirdly enough, you seem to both agree and disagree about the C-series being bang for the buck :) Most products are more expensive when they have just been launched than later on; not sure why the C-series should be different.

Most products do not get discounts as steep as the C series within a year. The C3 went from $1600 to $800 in less than a year; what monitor does that?
 
The LG CX supported DV and it seemed pretty useless, though maybe that was the games' fault. Not sure if HDR10+ would be any better.
CX isn't a monitor.

It's widely agreed that DV and HDR10+, with their per-scene/per-frame metadata and tone mapping, are better than HDR10 and other non-dynamic solutions. To the point that some reviewers recommend the streaming versions of Marvel films over the Blu-rays, because the streaming versions were updated with Dolby Vision.
 
CX isn't a monitor.

It's widely agreed that DV and HDR10+, with their per-scene/per-frame metadata and tone mapping, are better than HDR10 and other non-dynamic solutions. To the point that some reviewers recommend the streaming versions of Marvel films over the Blu-rays, because the streaming versions were updated with Dolby Vision.

So what if it isn't a monitor? Is the implementation of DV vastly different on TVs vs. monitors? For streaming services, yeah, DV has its benefits, but for games I have yet to see a case where DV vs HDR10 is a day and night difference.
 
Does that just make the display brighter, or are there any other benefits? I think this C3 is plenty bright enough for me obviously.

It can make the display brighter, but I believe it can also be used to achieve the same brightness as a non-MLA panel while using less energy, which would help with lifespan/burn-in since the panel isn't being driven as hard. Pretty much all other WOLED panel sizes have MLA, so it was expected to make its way to the 42/48 inch panels in 2024, but looks like that's no dice.
 
It can make the display brighter, but I believe it can also be used to achieve the same brightness as a non-MLA panel while using less energy, which would help with lifespan/burn-in since the panel isn't being driven as hard. Pretty much all other WOLED panel sizes have MLA, so it was expected to make its way to the 42/48 inch panels in 2024, but looks like that's no dice.
Damnit LG, just give us the option to buy the G series features at smaller sizes. Not everyone wants a 55" and up. Everyone I know living in an apartment has a 42" to 48" TV.
 
Damnit LG, just give us the option to buy the G series features at smaller sizes. Not everyone wants a 55" and up. Everyone I know living in an apartment has a 42" to 48" TV.
This will probably depend heavily on where you live. 50+ inches are far more common over here. I currently use the 48" CX as my living room TV because I already have it, but in my previous apartment I had a 65" C9, and for the current one I would like to upgrade to a 55" if the right candidate got released. Otherwise I will just keep using the CX until it dies.

I do agree that it would be nice to have the top tier features on the smaller models.
 
So those 32" 4K 240hz qdoled gen 2 are apparently releasing this Feb-March. This 42C2 served it's purpose amazingly as a stop gap for such displays. I will retire it with full honors, served me so well.
 
This will probably depend heavily on where you live. 50+ inches are far more common over here. I currently use the 48" CX as my living room TV because I already have it, but in my previous apartment I had a 65" C9, and for the current one I would like to upgrade to a 55" if the right candidate got released. Otherwise I will just keep using the CX until it dies.

I do agree that it would be nice to have the top tier features on the smaller models.
As far as actual TV use goes, I don't think I could go back from using a projector. 170" diagonal screens are pretty much unbeatable for immersion and entertaining guests, in my experience. I think I've definitely warmed up to having this 42" TV on my desk, though. I just wish it was curved.

Also turns out Hogwarts Legacy has random issues with shutting off the HDR setting... weird. Last few times I fired it up, I started feeling like it was sort of washed out, and then I realized that it had its own HDR setting. Turning it on instantly made a huge difference. The thing is, Hogwarts Legacy also only runs on Windowed Fullscreen, not exclusive full screen. If I'm turning on HDR in the game, while it's also on in windows, how exactly is that handled...?

I'm also having this weird issue lately, pretty much ever since I turned on HDR, where it doesn't seem like the side monitor actually turns off. It just kind of goes into a lit-up black screen while the TV is off. Then when I move the mouse, it doesn't come on. The day before yesterday, it just wouldn't come on at all and I had to reboot the computer. The computer itself was still reachable via the Samba share, so I know it was working fine. Today, kind of the same issue, but when I actually turned on the TV, everything went back to normal and the displays spun up, so I didn't have to restart. I don't think there are any GPU issues, I'm just kind of confused as to what's going on. I doubt it, but anyone have any ideas?

On the bright side, I do think this Club3D cable is working a lot better. I normally do use Club3D, so I should have just gone with it initially. Oh well.
 

https://www.youtube.com/watch?v=fD3PHmyGSjk

In this video he says the C series does NOT have the 5-year panel warranty (7:23 of the video); only the G4 and M4 series come with the 5-year panel warranty.


If you buy from Best Buy you can purchase a 5-year warranty that, from reports, covers everything including burn-in. The difference in price between those models would make it worth it if you need the peace of mind, but MLA has only been in the top-tier line.
 
Except Best Buy charges $400 for that so-called "extended warranty", which is really insurance. Further, they don't always honor it. A friend bought it on his TV; the LCD TV died, and they said there was nothing they could do, as their extended warranty didn't cover it.
 
The thing is, Hogwarts Legacy also only runs on Windowed Fullscreen, not exclusive full screen. If I'm turning on HDR in the game, while it's also on in windows, how exactly is that handled...?

Anyone have any idea?

Except Best Buy charges $400 for that so-called "extended warranty", which is really insurance. Further, they don't always honor it. A friend bought it on his TV; the LCD TV died, and they said there was nothing they could do, as their extended warranty didn't cover it.

I bought mine from Costco, and they bundled the extended warranty in with it at no extra charge. Apparently from Allstate. Not sure how good it actually is, though. I think Costco and maybe Microcenter are the only retailers that I have a preference for shopping with. Best Buy is at best a third wheel and Amazon has been pretty shitty lately.
 
Except Best Buy charges $400 for that so-called "extended warranty", which is really insurance. Further, they don't always honor it. A friend bought it on his TV; the LCD TV died, and they said there was nothing they could do, as their extended warranty didn't cover it.

Hmm, what country was he in? In the USA, east coast, the Best Buy Geek Squad people are pretty awesome. I've never had an issue like this and have always gotten my store credit back with their protection plans. MicroCenter is even better, albeit slightly more pricey. I pretty much always get a burn-in plan on my OLEDs from either of those 2 stores; it makes for easy upgrades since they do get burn-in and I do get my credit back.
 
I haven't had to use one in years, but I did use a Best Buy warranty back on an old 55" rear projection TV with no problems getting a panel replaced.

From what I've read in threads they do honor the warranty, even for burn-in, which I'd assume is why the price is what it is. They do not cover accidental damage (e.g. something smashed into the screen).

Regarding price, I was comparing it to the included warranty in the G series.

55" C3 at bb = $1499 + $ 339 warranty/insurance ( = $5.67/mo $68.04 a year insurance for 5 yrs).

55" G3 at bb = $1999 and has included warranty, but MLA is the real upgrade for the price difference.


However, terms-wise...

LG warranty is:
"The 5 year panel warranty applies to every size of the 2023/2022 SIGNATURE OLED 8K or 2023/2022 OLED evo G3 TV ranges. The warranty doesn't cover commercial or abnormal use, and is only available to the original purchaser of the product when bought lawfully and used within the country of purchase"

" *In the 1st year of the warranty, panel, parts, and labor costs are covered. In the 2nd - 5th year of the warranty, only panels are covered, and labor will be charged.
**5-year panel warranty covers 88Z3, 77Z3, 83G3, 73G3, 65G3, 55G3, 88Z2, 77Z2, 83G2, 73G2, 65G2, and 55G2. "

. .

The Best Buy warranty is:

"We make house calls for TVs 42" and larger.

No need to lug your screen into a store. If your TV is 42" or larger, we'll come to your home to repair your issue. If we originally installed your TV, we'll also uninstall and reinstall it.

You'll never pay for parts and labor.


We take care of 100% of the costs of parts and labor for covered repairs, with no hidden fees.

If your screen has bad pixels or a shadow image, we'll correct it.

If you have at least three pixels that are always the same color or a ghost image that won't go away, we'll get your picture looking like new.

If your TV won’t turn on because of a power surge, we'll fix it.

If there's a power surge or fluctuation that damages your product, we'll make things right. This includes a surge caused by a lightning strike.
If the remote that came with your TV stops working, we'll replace it.

Get a one-time replacement for the remote control that was included in your TV’s original box.


If there's a failure from normal wear and tear, we'll repair it.

This could be a problem with an internal part or how the product was manufactured. It could also be caused by dust, internal heat or humidity. Accidental damage is not included.

. . . . . . . .

So terms-wise you get a much better warranty with the C3 + Best Buy warranty, and for about $1,838 vs $1,999 for the G3, but the G3 having MLA now makes a much bigger difference in performance.


Also worth noting that the Best Buy warranty is priced as a fraction of the price of the unit, so if you were to score a 42" gaming OLED on sale, the warranty is relatively cheap compared to what it would be on something like a 77" OLED.

A previous reply of mine from a while ago:

"42" LG C2 for $900 + tax currently at best buy. The 5 year best buy warranty on a c2 can be had for around $36 a year. That covers burn in if you are actually concerned about it but I doubt you'd burn in before 4+ years in normal media and gaming usage with some precautions taken. $36 a year insurance , $3 a month, $180 / 5 yr."
 
Hmm, what country was he in? In the USA, east coast, the Best Buy Geek Squad people are pretty awesome. I've never had an issue like this and have always gotten my store credit back with their protection plans. MicroCenter is even better, albeit slightly more pricey. I pretty much always get a burn-in plan on my OLEDs from either of those 2 stores; it makes for easy upgrades since they do get burn-in and I do get my credit back.
You have better luck than my friend. My screen was purchased 11 years ago; he specifically called me and told me what happened to him, so his screen would be from, say, 13 years ago. These incidents are not the kind of thing anyone can forgive or forget.
 
Most products do not get discounts as steep as the C series within a year. The C3 went from $1600 to $800 in less than a year; what monitor does that?
That was my point about the C-series being very much bang for the buck unless you really need to have it day one.
 
IMO, all this talk about HDR brightness is a bit of a bandaid when virtually no PC monitors even support Dolby Vision or HDR10+, that I know of (both offer per-scene/per-frame dynamic metadata, plus the ability to poll a display's capabilities and adjust the tone mapping of the HDR output so it more closely matches the creator's intent).

PC monitors all seem to only support HDR10, which relies on a couple of static parameters meant to somehow work for an entire piece of content. Those parameters can be messed with/messed up by a user, and any tone mapping is factory set, one size fits all. Asus usually gets praised for the best 27 and 42 inch 16:9 OLED monitors, but their tone mapping sucked, and it took them until around October to release a firmware that made it a lot better.
HP revealed an OLED monitor at CES that supports Dolby Vision.

https://tftcentral.co.uk/news/hp-omen-transcend-32-4k-240hz-oled-gaming-monitor-officially-announced
 
IMO, all this talk about HDR brightness is a bit of a bandaid when virtually no PC monitors even support Dolby Vision or HDR10+, that I know of (both offer per-scene/per-frame dynamic metadata, plus the ability to poll a display's capabilities and adjust the tone mapping of the HDR output so it more closely matches the creator's intent).

PC monitors all seem to only support HDR10, which relies on a couple of static parameters meant to somehow work for an entire piece of content. Those parameters can be messed with/messed up by a user, and any tone mapping is factory set, one size fits all. Asus usually gets praised for the best 27 and 42 inch 16:9 OLED monitors, but their tone mapping sucked, and it took them until around October to release a firmware that made it a lot better.


It's going in the opposite direction size-wise from 42" OLED, but there is a 65" M8 FALD LCD from TCL with 5,000 zones, 5,000-nit peak HDR, and 120Hz with a "240Hz game accelerator", in case anyone is interested. According to the marketing it has all of the features of their 7 series plus the additions of the 8 series, and the 7 series has Dolby Vision.

As a true premium TV, QM7 includes a host of other technologies including a native 120Hz Panel Refresh Rate on all screen sizes, Game Accelerator 240, HDR ULTRA with Dolby Vision IQ, a 2.1 Channel Speaker System with built-in subwoofer, and an elegant adjustable height pedestal stand (65- to 85-inch sizes). With IMAX Enhanced Certification and AMD FreeSync Premium Certification, the TCL QM7 is a certified winner. The new QM7 will be available in 55” to 98” screen sizes.


At the flagship level is the QM8 which includes the incredible QD Mini LED ULTRA for Ultra High Zone Dimming with up to 5,000+ Zones. This is more than twice the number of zones of the previous flagship level for truly inky blacks. Also included is the High Brightness ULTIMATE LED Backlight with up to 5,000 peak nits, and QD color ULTRA, for incredibly dynamic images. This is 2.5 times brighter than the current “brightest TV ever.”


The QM8 includes all the great features of the QM7, plus an Anti-Glare Screen to maintain contrast in ambient room light conditions, 2.1.2 Channel Speaker System with Built-in Dolby Atmos Speakers for a wider and higher sound stage, and Next Gen TV and Wi-Fi 6 for “future proofing.” The new QM8 will be available in 65” to 98” screen sizes.

. .

The TCL M8 screens are native 120Hz. They use a game accelerator mode that cuts the vertical resolution in half to hit 240Hz.


From a reddit reply on their previous model:

Apparently it doesn't cut the resolution in half, it just cuts the *vertical* resolution in half. So 3840*2160 becomes a very weird 3840*1080.

TCL have implemented this "motion accelerator" on a few of their native 120hz/144hz panels, too (specifically, I'm looking at the TCL 65C745K, which might be EU/UK-exclusive - I know I had trouble finding any retailers carrying 120hz TCL models widely available in America over here when I was making notes of what was available some time last year. This one does 120hz, 144hz and this weird 3840*1080@240hz).
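A quick way to see why that half-vertical-resolution trick works out: the pixel throughput at 3840x1080 240Hz is exactly the same as native 4K at 120Hz, which is presumably why the panel/timing budget allows it:

```python
# Sanity check on the 'game accelerator' trick: halving only the vertical
# resolution at double the refresh keeps the pixel throughput identical.

native_4k_120 = 3840 * 2160 * 120   # pixels per second at native 4K 120Hz
half_v_at_240 = 3840 * 1080 * 240   # pixels per second at 3840x1080 240Hz

print(native_4k_120, half_v_at_240, native_4k_120 == half_v_at_240)  # same rate
```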
 
I have a quick survey question, as I am no longer planning to buy the 48" 4K OLED to replace my 43" 4K LED LCD:

How many of you owned a 40" or larger LED LCD and ditched it to upgrade to a 42" or 48" OLED just because it's OLED?
 