Alienware AW3225QF 32" 4K 240 Hz OLED

Ah yes. Forgot about brightness. I'm still in SDR land so I don't really care too much about that. Most of my uses don't need HDR but good point.
It's not the be-all, end-all... but it matters to how good HDR looks more than you might think. I was pretty surprised myself. I have an S95B TV and have had it since not long after it came out. That's where I did my HDR gaming. It's about 700-800 nits real scene brightness. It's really good and I like it... but then I got a PG32UQX monitor. That's 1600-1700 nits real scene, and on pretty much ANY scene, as it can pull that even at 50% of the screen. Man, it looks SO MUCH nicer. Despite not having pixel-perfect dimming, despite sometimes noticing the FALD zones, despite the much lower motion clarity, the impact of HDR games is just way more with that higher brightness.

It's still an area that OLED monitors have real trouble in. TVs have always been better and have made bigger gains: the S95C, the successor to my TV, can push as high as 1000-1100 nits real scene, and the LG G3 can do that, maybe even a bit more. But the monitors just can't. They can hit peaks near as high as the TVs, but only at like a 1-2% window; they drop hard (particularly the QD-OLEDs) by the time you get to a 10% window, whereas the TVs can maintain their brightness at 10%.

That and burn-in are the two areas I'm most hoping to see improve in OLED monitors. Better text clarity would be nice too, but really that's on MS to quit being sloths and ship some new anti-aliasing patterns for Windows.
 
No, you're correct. If CRT had kept up with the technological trends, it would be the be-all, end-all, frankly. But that's a different argument, and I probably stirred the hornets' nest with that one.

But damned if this monitor isn't close.
 
I'm excited about the future of OLED. I won't be getting one of these, because of burn-in worries and brightness, but I'm going to watch the improvements with interest. I think they'll get real-scene levels up as time goes on, and then I'll probably grab one.
 
Every day I play a different game and am WOWED by this thing!

Just played Halo Infinite and OMG it looks incredible and plays amazing... I was never a big fan of Infinite multiplayer vs MCC... but now I'm playing it like a crack addict.

Single-player games benefit too because the motion clarity is nearly CRT-like... combined with 4x the resolution of old CRTs and modern HDR... ohhhh muuuhhh gawwww... this is next level
 
Smokes everything except for HDR performance, that is.

View attachment 633085

These QD OLED monitors get their brightness levels nerfed too hard compared to TVs.
This result is absolutely pathetic. Even older WOLEDs blow this out of the water. Comparing them to even last year's QD-OLED TVs would be a joke at best. I think I would sooner deal with a 42" TV than buy one of these dim turds.

The 42" C2 is 37% brighter in HDR "real scene brightness". Mini LED monitors like the X32 are 125% BRIGHTER according to RTINGS measurements.

And before anyone tries making comments about how it's plenty bright or peak brightness doesn't matter or no one uses brightness that high... I calibrate my monitors for 100 nits SDR. That's the brightness I use for everything outside of HDR content. That's almost certainly dimmer than you use on a day-to-day basis on the desktop. Brightness matters for HDR, and these new 32" QD-OLEDs are possibly the most disappointing monitors I have ever seen released in my 25+ years of PC gaming.
 
This monitor is really dim in person. I've been limping along with a PG32UQX waiting for these 32" OLEDs, but after seeing it in person I'm not sure it's really an upgrade.

I dunno if it's a good or a bad thing that SDR doesn't look all that different from HDR on a monitor, but that's the case here.
 

Yeah I don't plan on using one for HDR gaming. I'll stick to my 32M2V for that. 4K + 240Hz + OLED response times is a dream for certain games though so that's what I'll be using it for.
 
I'm sitting on the fence right now - I really want better SDR, but it's hard to let go 1600 nits of HDR... :D
 
It's really annoying switching between two displays based on use case. Even more annoying is getting a setup in terms of space/placement correctly to best accommodate both.

People want an all-around spectacular single-display solution, but this is once again not it. It's literally the inverse of a PG32UQX (super clear in motion, but super dim).
 

Fair enough. For me it is more annoying to constantly deal with the flaws of a single display than to just switch it up based on use case.
 
I gave up on the dream of one display handling all my needs.

MiniLED for productivity, surfing and bright HDR
OLED for FPS and dark space games.

You have to use the AW32 in a dark room, like a projector setup.
Since my office is bright as fuck, I basically only use the AW32 at night.

But HOLEEEE CHITTTT does this thing look GORGEOUS in the dark!
Halo MCC and Infinite look batshit INSANE on this thing, kicking a solid 240fps in 4K HDR1000!

As terrible as the AW32 brightness is during the day, the PG32UQX motion clarity is ULTRA MOAR TERRIBLEZZZZZ and 144Hz is a dog's ass!

$600 for my Innocn 27" 4K 160Hz MiniLED and $1,200 for the AW32 = $1,800,
which is STILL $200 less than what we paid for the PG27UQ back in 2018, and that's in BIDEN BUCKS! In inflation-adjusted terms it's way less than what we paid for the PG27, lol, and you get two displays to meet all your needs like a Thai Bath House Float Girl!
 
I mean, I find for most SDR games I just turn on the FALD dimming and it gets 90% of the way to having good OLED blacks. It doesn't have the motion clarity, of course, but it does well enough at everything that I'm going to stick with it for this generation. I'll watch with interest what the next gen of panels offers. I'm sure they'll get 10% window brightness up enough to make them compelling soon.

Because yeah, having two monitors just doesn't work for me. It wouldn't be impossible, but for various reasons it's just not something I want, so I want the one that does the best all around, and for the moment that's the MiniLEDs. It wouldn't surprise me if the next gen of OLEDs, or the gen after that, sees me switch to them.
 
MiniLED is more versatile for use beyond gaming... but make no mistake, it's not anywhere in the league of what the AW32 can do with gaming.
 
Except HDR, of course.
Not necessarily

In bright games and colorful Avatar-movie-style scenes, MiniLED HDR is king.

But in dark atmospheric games, nighttime scenes, space games, horror games, etc., OLED HDR owns.

I've owned the PG32UQX and know how it rocks 1600+ all the way up to nearly 1800 nits with contrast cranked. But if you look at the night starry sky in a game like Red Dead 2 or Days Gone on MiniLED, it's horrible lol.

I sold my PG32UQX, but I still have an Innocn 4K 160Hz MiniLED which hits about 1200 nits and has considerably better motion clarity than the PG32UQX. I have a hard time playing games on it now that I've spent time with the AW32 in a light-controlled room.
 

Agreed on the splitting tradeoffs between 2 screens thing, though we all may have our own personal preferences on tradeoffs. I've been doing two different monitor types since at least 2006.

Also, a vote for controlled lighting conditions while diving into game worlds, or when digging in and focusing on a directed media experience. I want to be in the game world when playing, or in the theater experience for shows and movies when they aren't just background material... not in a bright IRL room looking at a little diorama screen that happens to be on a table top in it. Desktop/app use is another matter, but that's why I typically use a different screen for that.


Controlled lighting gets the best results.





Except HDR, of course.

Arguable. FALD looks good for what it can do, and it'll definitely go brighter in HDR, especially sustained across 25% or 50% of the screen, but it has some major tradeoffs. Per-pixel emissive displays like OLED don't have hot and cold zone perimeters bleeding like watercolor through regular mixed-contrast content. FALD is a non-uniform tetris brickwork, so the dark areas around brights are lightened, and vice versa. The firmware tends to spread the contrasted zone lighting across more zones so the halo isn't as harsh, but it's still lifting a dark area or dimming a bright one.

There is also the fact that enabling FALD and HDR can make input lag go way up on FALD LCDs. And according to some trusted review sites, in game mode on Samsung FALD gaming TVs (not sure how it compares to the gaming monitors) the local dimming spreads across even more zones and the zone transitions get slower, and game mode has to be on if you don't want added input lag and a much less responsive screen.

Both FALD and OLED have some major tradeoffs. They use a number of tricks/hacks/workarounds in an attempt to hide their shortcomings and squeeze the best picture quality and performance out of the display techs, but their tricks can't compensate completely.
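
To make the zone-lift mechanics concrete, here's a toy sketch (my own illustration, not how any monitor's actual firmware works): carve the screen into a coarse grid of backlight zones, drive each zone's backlight to the brightest pixel it covers, and the black level in that zone rises to the backlight divided by the panel's native contrast. All the numbers are made up for illustration.

// Toy FALD model: per-zone backlight follows the brightest pixel in the
// zone, and every "black" pixel in that zone is lifted to
// backlight / native_contrast. Zone counts, contrast, and the test image
// are invented for illustration only.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int W = 16, H = 9;              // toy "screen" in pixels
    const int ZW = 4, ZH = 3;             // 4x3 = 12 backlight zones
    const double nativeContrast = 4000.0; // ballpark VA-panel native contrast

    // Test image: all black except one bright 1000-nit highlight.
    std::vector<double> img(W * H, 0.0);
    img[4 * W + 7] = 1000.0;

    for (int zy = 0; zy < ZH; ++zy) {
        for (int zx = 0; zx < ZW; ++zx) {
            // Backlight for this zone = brightest pixel it covers.
            double backlight = 0.0;
            for (int y = zy * H / ZH; y < (zy + 1) * H / ZH; ++y)
                for (int x = zx * W / ZW; x < (zx + 1) * W / ZW; ++x)
                    backlight = std::max(backlight, img[y * W + x]);
            // Black level in this zone is lifted by the backlight.
            double blackLevel = backlight / nativeContrast;
            std::printf("zone (%d,%d): backlight %7.1f nits, black %.3f nits\n",
                        zx, zy, backlight, blackLevel);
        }
    }
    // The zone containing the highlight reports black at 0.250 nits while
    // its neighbors stay at 0.000. That step between zones is the halo,
    // and zone-spreading firmware only trades its sharpness for area.
}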

. . . . . . .

The SDR scene screen captures below are from a few videos, including HDTVTest's review of the ProArt UCX. The images are SDR, compressed, and the screenshotting affected them as well. The reviewers had to use different ISO/camera settings to show the effect in SDR, so the images are greatly exaggerated compared to what you'd see in real life, but they highlight where the FALD is lifting the blacks and dark detail.

While FALD can get very high contrast numbers on large fields of dark and large fields of bright/white, in mixed-contrast areas it drops those combined areas back to near the native contrast of the screen, typically 3000:1 to 5000:1. Those were okay numbers for an edge-lit VA screen in previous years, but when viewing dynamic content on a FALD the zones are dynamically elevating and dropping, so the effect on uniformity is bad. The large fields of brights and darks hold their enormously greater brightness/darkness values much more solidly, while the mixed-contrast puddles all over a scene lift and dim down to near native contrast, with fluctuating elevation. The video most of these images are from is linked at the bottom of the quote.

[Screenshots: The Matrix ship command-center scene showing FALD zone lift; HDTVTest zone-count/blooming comparisons; Batman scenes; the DaVinci Resolve app; and an OSD capture captioned "elevated blacks and a distracting amount of fluctuating elevation".]
. . .

From a different review of the UCX, capturing the lifted area around a cursor or other small detail area:


View: https://imgur.com/21tdf1f


..
From a Samsung 90B FALD review (the camera ISO and SDR capture greatly exaggerate the effect, but it shows how larger areas are lifted, and dynamically across the screen in actual viewing):
[Screenshot: Aquaman plane cargo-hold scene with lifted zones.]


. . .

HDTVTEST youtube video

Mini LED Tech Helps Asus Cram 1152 Zones into 32" Monitor, But Is It Enough? (PA32UCX Review)

 
OLEDs already outperform MiniLEDs in HDR content. MiniLED is only a real option for those who cannot afford OLED displays.


View: https://i.imgur.com/9ByjUoC.jpeg


They are both squeezing as much as they can out of their respective techs, using a lot of tricks. I love OLED, but a FALD can do much brighter and longer-sustained bright areas of the screen.

This is from RTINGS. OLEDs are definitely getting better numbers at 10%, 25%, and 50% windows than they were before, especially for sustained window periods (but they are still 300-360 nits). Phosphorescent-blue OLEDs and MLA (micro lens array) models should help increase that some eventually.

Samsung Odyssey Neo G9/G95NA S49AG95 (FALD LCD) VS Samsung Odyssey OLED G9/G95SC S49CG95 (OLED)

[RTINGS screenshots: HDR peak and sustained window brightness tables for the two screens.]


So you can see the peak and sustained differences there, which are very appreciable. Beyond those raw numbers, though, the overall appearance of the HDR content differs, as the text below the numbers within the screenshot describes, plus what I said in my previous reply. So there are other things to take into consideration.
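
As a rough mental model of what those window percentages mean: the panel's brightness limiter throttles peak nits as the lit fraction of the screen grows, so you can read a review table as a curve you interpolate. A minimal C++ sketch below; the two curves are invented placeholder numbers merely shaped like typical QD-OLED-monitor vs FALD results, not real measurements of any product discussed here.

// Toy ABL curve lookup: peak nits for a given lit-window percentage,
// linearly interpolated between tabulated points. Both tables are
// invented placeholders shaped like RTINGS-style results, NOT real
// measurements of any screen.
#include <cstdio>

struct Point { double windowPct, nits; };

double peakNits(const Point* curve, int n, double windowPct) {
    if (windowPct <= curve[0].windowPct) return curve[0].nits;
    for (int i = 1; i < n; ++i) {
        if (windowPct <= curve[i].windowPct) {
            double t = (windowPct - curve[i - 1].windowPct) /
                       (curve[i].windowPct - curve[i - 1].windowPct);
            return curve[i - 1].nits + t * (curve[i].nits - curve[i - 1].nits);
        }
    }
    return curve[n - 1].nits;  // full-field value beyond the last point
}

int main() {
    const Point oled[] = {{2, 1000}, {10, 400}, {25, 330}, {50, 300}, {100, 250}};
    const Point fald[] = {{2, 1400}, {10, 1400}, {25, 1300}, {50, 1200}, {100, 1100}};
    const double windows[] = {2, 5, 10, 25, 50, 100};
    for (double w : windows)
        std::printf("%5.0f%% window: OLED %6.0f nits, FALD %6.0f nits\n",
                    w, peakNits(oled, 5, w), peakNits(fald, 5, w));
}

The shape is the whole story: the OLED-style curve falls off a cliff between the 2% and 10% columns, while the FALD-style curve barely moves.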

Also, in some cases the FALD LCDs are more expensive than an OLED rather than the other way around, depending on type (gaming screen, gaming TV, form factor).

Edit: Added the aw3225qf comparison in quotes below since that is the subject of this thread.

 
This makes absolutely no sense because Mini LED like that Samsung you are showing costs even MORE than an OLED.
It's just fanboy factionalism. Happens with tech all the time for some reason. Some people need their choice to be THE BEST, and anyone who makes a different choice is wrong and must be hated on. This is twice now I've seen someone try to deride MiniLED as being for "the poors", as though that were a bad thing; it's also, of course, incorrect.

We're just in one of those times where there are two different technologies that are both good in their own ways, so different people will want them for different reasons. For those of us who love tech and gaming, it's an exciting time and we want to discuss what we love, and don't, about the stuff we have. For the fanboys, however, it means you must attack anyone who makes a choice different from yours.
 
What Samsung MiniLED are you talking about? The Odyssey Neo G8? That model is already OUTDATED and performs worse than current OLED offerings. Only someone insane would buy it right now, unless it was substantially cheaper than the equivalent OLED models.
MiniLEDs should always be cheaper because they perform worse in most specs.
 

Outdated or not, your point makes no sense. Mini LEDs do not always cost less than OLEDs; sometimes they do and sometimes they don't. Look at the newest Mini LED TVs and you'll find prices ranging from cheaper than OLED to more expensive than OLED. So I'm not sure how you concluded that Mini LED is for poor people when OLEDs can literally be the more affordable option, which only helps make them the better choice. If I had to choose between a $2000 PG32UQX and a $950 321URX, I'm going for the QD-OLED on price alone.
 

The graph you posted was a big highlight of the G95SC OLED, so it was probably pulled from a review of that screen.

RTINGS blurb on the 2023 G95SC you referenced (5120x1440, very mild 1800R curve, 240Hz QD-OLED):

The Samsung Odyssey OLED G9/G95SC S49CG95 is a premium 49-inch QD-OLED monitor. It's a newer model than the Samsung Odyssey Neo G9/G95NA S49AG95, which uses Mini LED backlighting, and it's the second QD-OLED monitor from Samsung, alongside the Samsung Odyssey OLED G8/G85SB S34BG85. It has a 5120x1440 resolution and super ultrawide 32:9 aspect ratio with a 1800R curve, so while it has a very wide screen, the edges of the screen are brought closer to your field of view. It's designed as a gaming monitor with a 240Hz refresh rate, and thanks to its DisplayPort 1.4 and HDMI 2.1 inputs, you can take full advantage of its max refresh rate with any graphics card that supports Display Stream Compression. It supports all common variable refresh rate (VRR) formats, like HDMI Forum VRR, FreeSync, and G-SYNC compatibility.


In my reply I posted the RTINGS comparisons of the 5120x1440, 1000R, 240Hz, 49-inch s-uw G95NA (2021) against that G95SC OLED:
Odyssey Neo G9/G95NA S49AG95 is a super ultrawide gaming monitor with a 49 inch screen and 32:9 aspect ratio. It's an upgraded model of the Samsung Odyssey G9 that features Mini LED backlighting, allowing it to get brighter and have greater control over the local dimming. In fact, it has the best local dimming we've seen on any LED-backlit monitor as it rivals that of TVs and helps it display deep blacks. Since it's a gaming monitor, it has a high 240Hz refresh rate with native support for FreeSync variable refresh rate (VRR) technology and G-SYNC compatibility to reduce screen tearing. It's even future-proof as it has HDMI 2.1 inputs


And the 2023 G95NC 57" 7680x2160 240Hz s-uw FALD vs the G95SC OLED
"Odyssey Neo G9/G95NC S57CG95 is a premium 57-inch super ultrawide monitor with a 1000R curve. With a 32:9 aspect ratio and 7680x2160 resolution, it's the equivalent of two 32-inch, 4k monitors side-by-side, one of the first displays of this size. It features a 240Hz refresh rate with variable refresh rate (VRR) support for gaming, and it supports DisplayPort 2.1 bandwidth, which lets you achieve its high refresh rate and resolution with a DisplayPort 2.1 graphics card. Like the smaller Samsung Odyssey Neo G9/G95NA S49AG95, it includes Mini LED backlighting with 2,392 dimming zones.




Then I added the RTINGS comparisons of both of those FALD s-uw screens to this Alienware AW3225QF, which is 16:9 QD-OLED:
The Dell AW3225QF is a 4k, 240Hz QD-OLED gaming monitor with a curved screen. It's the first model available in North America featuring this high-resolution QD-OLED panel, competing with other monitors with the same panel, like the ASUS ROG Swift PG32UCDM and the Samsung Odyssey OLED G8/G80SD. It has typical gaming features like support for all common variable refresh rate (VRR) formats and HDMI 2.1 bandwidth. However, what makes it different from most monitors is that it also supports Dolby Vision and has an eARC port to connect a compatible soundbar easily.




============================================

The HDR comparisons in the previous reply show much lower 50%, 25%, and 10% windows and sustained windows on the OLEDs compared to those three FALD screens. Those windows on the OLEDs are typically 1/4 to 1/3 of the FALD numbers, which is a huge difference in nits, but the overall HDR presentation has more to it than just the raw nits. Both techs use workarounds/hacks/tricks to try to squeeze out the best presentation they can, but they both have pretty big tradeoffs.

If you want to go by the prices linked off of those RTINGS pages:

G95NA (2021) s-uw FALD ----------> $1050 usd (amazon)

G95NC (2023) 57" s-uw FALD ---> $1800 usd (amazon, bestbuy)

G95SC (2023) s-uw OLED ----------> $1300 usd (amazon, bestbuy)

aw3225qf (2023) 16:9 OLED ---------> $1200 usd (dell, bestbuy)

====================================================

The Alienware isn't a super ultrawide though, so it's not exactly the same category. You could make a similar argument about the 57" s-uw: it's a much higher resolution and a larger screen, which bumps its price up, so it's not really apples to apples either. The G95NA is from 2021, so that also isn't a good price comparison.

The best comparisons are probably 4K gaming TVs.

65" Samsung S90C QD OLED ~~> $1600

65" LG C3 OLED ~~~> $1600

65" Sony X93L FALD ~~> $1600
 

MiniLED like the PG32UQX and Innocn MV3V looks amazing in bright-scene HDR... far better than dimly gimped OLED panels like the AW32.

With that said, OLED can look just as good, if not better, in bright-scene HDR as long as you crank the brightness and saturation, like that amazing Samsung OLED TV did last year... omg, that thing got so bright I forgot all about MiniLED... at 55" it was too big for me to use on my desk, but my god, it had the most beautiful HDR I have ever seen... it was like MiniLED brightness and colors matched with OLED infinite contrast and inky blacks.
 
Tekken 8 on this monitor is a blast. Colors pop and everything looks amazing (UE5 game).

Is there a firmware update for this monitor yet? Also, is anyone else having that refresh-rate issue where it resets to 120Hz when using DP 1.4?
 
Has anyone gone from the AW34 to the AW32? I was planning on making the switch, but after RTINGS came back showing the AW32 is a little bit dimmer, I'm on the fence now.

I don't play a lot of competitive games so the refresh rate is a bit of a wash, and I love me some ultrawide, but damnnnnn do I miss that 4K crispnessss.

Anyone make the switch and, after using it for a while, feel it was a worthwhile upgrade?
 

I think it's this one from reddit. Interesting info in the replies I pasted below. That utility sounds neat in general.

https://www.reddit.com/r/OLED_Gaming/comments/1akc7v1/fyi_aw322qf_firmware_update_fixed_the_hdmi_21/

FYI - AW322QF Firmware Update Fixed the HDMI 2.1 Gsync Flickering​


Just tried it with Darktide 40k and Valheim. Both had some pretty awful tearing before, but now all seems to be good with HDMI 2.1 + G-sync at 240 Hz. I'll do some more experiments later, but if things look good I'll stick with HDMI for the greater bandwidth and the fact that it allows for 12-bit color.
Running a 4090.
EDIT/UPDATE: I ran the G-sync flicker test (thanks u/sixstringmonk and u/born-out-of-a-ball). There appears to be a little flickering in the dark gray gradients but the rest of the screen looks fine. I compared DP and HDMI and both look the same in this regard. So if you have rapidly fluctuating frametimes there may still be some flickering, but in practice it's likely a non-issue.
UPDATE 2: the original flickering (maybe "tearing" is more accurate) I was referring to in my post was related to the fact that G-sync with HDMI was completely broken at lower refresh rates and unusable with the original firmware. This is completely fixed now. There's still the gamma-related VRR flickering as mentioned above. Apologies for any confusion.
. .
Here's a utility for testing G-Sync flicker: https://github.com/MattTS01/VRR_Flicker_Test_OpenGL/releases/tag/v1_release
Edit: updated link to the one provided by /u/born-out-of-a-ball. Thanks!

https://github.com/MattTS01/VRR_Flicker_Test_OpenGL (Use at your own risk)

VRR_Flicker_Test_OpenGL​


Test application to demonstrate VRR flicker and gamma shift. The program creates a fullscreen OpenGL context on the primary monitor in the desktop resolution, renders a gradient from mid grey to black and then varies the frametime up and down between 1/120th and 1/40th of a second.


The unstable frame rate should trigger flickering on an LG CX OLED display.


It was just put together quickly in an afternoon based on an idea I had to create a reproducible way of demonstrating the issue. Given the simplicity of the graphics, I just used basic OpenGL without any need for shaders.


GLFW is used for handling the OpenGL context and window creation. GLEW was added in case of future requirements.


Instructions​


Download the zip, extract and run the exe. It will launch a full screen gradient display that is static. With VRR on the LG CX you should see this flicker.


Press Escape to exit.
.
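
For the curious, the core of a tool like that fits in a few dozen lines. Here's a minimal reconstruction of the same idea from the README above (my own sketch, not the actual source; assumes GLFW and a desktop OpenGL driver are installed):

// Minimal VRR-flicker tester in the spirit of the utility above: draw a
// static mid-grey-to-black gradient fullscreen, then deliberately swing
// the frametime between 1/120 s and 1/40 s. On displays with VRR gamma
// shift, the static gradient will visibly flicker. Reconstructed from
// the README; not the actual source code.
#include <GLFW/glfw3.h>
#include <chrono>
#include <cmath>
#include <thread>

int main() {
    if (!glfwInit()) return 1;
    GLFWmonitor* mon = glfwGetPrimaryMonitor();
    const GLFWvidmode* mode = glfwGetVideoMode(mon);
    GLFWwindow* win = glfwCreateWindow(mode->width, mode->height,
                                       "VRR flicker test", mon, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(0);  // no vsync; we pace frames ourselves

    double t = 0.0;
    while (!glfwWindowShouldClose(win)) {
        if (glfwGetKey(win, GLFW_KEY_ESCAPE) == GLFW_PRESS) break;

        // Static gradient: mid grey (0.5) on the left fading to black.
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_QUADS);
        glColor3f(0.5f, 0.5f, 0.5f); glVertex2f(-1, -1); glVertex2f(-1, 1);
        glColor3f(0.0f, 0.0f, 0.0f); glVertex2f(1, 1);  glVertex2f(1, -1);
        glEnd();
        glfwSwapBuffers(win);
        glfwPollEvents();

        // Sweep the frametime sinusoidally between 1/120 s and 1/40 s.
        t += 0.05;
        double frameTime = 1.0 / 120.0 +
            (1.0 / 40.0 - 1.0 / 120.0) * (0.5 + 0.5 * std::sin(t));
        std::this_thread::sleep_for(std::chrono::duration<double>(frameTime));
    }
    glfwTerminate();
}

Build with something like g++ flicker.cpp -lglfw -lGL (library names vary by platform). The image never changes; only the frame pacing does, which is exactly what isolates VRR gamma flicker from ordinary content flicker.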
 
I just now found out that LG has a WOLED variant of a 32" 240Hz panel coming. If that's the case, and a manufacturer offers a glossy version, it makes these QD-OLED ones kind of a hard sell.

The current LG WOLED panels also hit 250 nits full-field, but they aren't limited to only 400 nits at a 10% window and can actually do 700+.

Also, IMO, WRGB > triangular pixel structure for text.
 
Yep and they will have no curve. I'm holding out myself.
 

Yea but WOLEDs gross though 😝
 
Triangle pixel pattern is grosser in my opinion :p.

I won't deny that triangle pattern blows ass for text/productivity....but that's why I bought Vegas spare chynee27 for a dollar ninety nine.

But using a QD-OLED for productivity is like being friend-zoned by a sexy female.
 
Updated firmware but the problem still persists. After shutting down the PC and then restarting it, it defaults to 120Hz and I have to do that "Turn off HDR and then back on" gimmick to get it to 240Hz again when using the DP cable. Is anyone else having this issue or are all of you using only HDMI 2.1?
 
I'm honestly on the fence on this one. The pattern itself is a bit weird, with green being bigger than red and blue, so I get that. But for the most part it reminds me of the old shadow-mask CRTs. As long as the pixel density is high enough, it shouldn't be too bad. And then there's WRGB with that white subpixel. Wut? Get that out of here.

If only OLEDs were bright/emissive enough to do the classic RGB stripe setup that we all know and love. But for obvious reasons, that's not feasible.
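
For anyone wondering why the layout matters so much for text specifically: ClearType-style subpixel rendering treats each pixel's R, G, B stripes as three separately addressable columns of luminance, which only lines up on a true vertical RGB stripe. A toy sketch of the packing step (my own illustration with made-up coverage values, not Microsoft's actual filter, which also applies color-fringe filtering on top of this):

// Why subpixel layout matters for text: pack glyph coverage sampled at
// 1/3-pixel resolution into R,G,B, assuming the panel's subpixels are
// ordered R,G,B left to right. On a triangular or WRGB layout the same
// buffer lands on subpixels in the wrong places, which is the fringing
// people complain about. Coverage values are invented for illustration.
#include <cstdio>

int main() {
    // Horizontal coverage of a glyph edge sampled at subpixel (1/3 px)
    // resolution: the edge falls two subpixels into pixel 1.
    double coverage[9] = {0, 0, 0, 0, 0, 1, 1, 1, 1};  // 3 pixels x 3

    for (int px = 0; px < 3; ++px) {
        // Three consecutive coverage samples become one pixel's R,G,B.
        double r = coverage[px * 3 + 0];
        double g = coverage[px * 3 + 1];
        double b = coverage[px * 3 + 2];
        std::printf("pixel %d: R=%.0f G=%.0f B=%.0f\n", px, r, g, b);
    }
    // Pixel 1 comes out as (0,0,1): physically correct on an RGB stripe
    // (only the rightmost third of the pixel lights up), but on any other
    // subpixel arrangement that "blue only" pixel is just a blue fringe.
}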
 
Have you tried changing the DP cable? I got the monitor today, updated the firmware, played some games, and shut down my PC. Turned it back on and it had stayed at 240Hz. This is with HDR on in Win 11 and on the monitor.
 
I wish this monitor was 40-45". I want to try a 240Hz large screen. There's no way I would ever go back to a 32", but I would settle for a 43" minimum. Also only if the price is right, given that it will likely burn in. Still, it would be interesting to try the difference between 144Hz and 240Hz.
 