The 32 inch 4K IPS 144Hz's... (Update - this party is started) (wait for it...)

It's something I think we are going to have to get used to for HDR monitors. More brightness means more power, which means more heat. As we push peak brightness more and more, this is just going to be more true. For TVs, maybe they are big enough to have large passive heatsinks, but I think for monitors we just have to accept fans, just like we do for our PCs themselves. I don't love it, but I don't see an alternative.
Large televisions, at least, do include active cooling. The difference is people generally don't sit 2-3 feet away from a 65" TV, so you're less likely to hear or be bothered by the fans.
 
It's something I think we are going to have to get used to for HDR monitors. More brightness means more power, which means more heat. As we push peak brightness more and more, this is just going to be more true. For TVs, maybe they are big enough to have large passive heatsinks, but I think for monitors we just have to accept fans, just like we do for our PCs themselves. I don't love it, but I don't see an alternative.
The HSF assembly in my PG32UQX is damn near silent, which is more than can be said for my old PG27UQ. Either they improved the HSF design, or it has not aged enough to make noise... lol.
 
The HSF assembly in my PG32UQX is damn near silent, which is more than can be said for my old PG27UQ. Either they improved the HSF design, or it has not aged enough to make noise... lol.
Ya I haven't had any issues with mine. I can hear it, but only if I quiet down the room and get my ear right up next to the display. Likewise my previous AW3821DW was nice n' silent. Fans can be done well, they just aren't always. I'd prefer fanless just because it means they can't fuck up a fan, but I'm realistic that we are going to need them to cool ever brighter HDR displays. I mean the PG32UQX runs on like a 240W laptop power supply... Now unlike a CPU/GPU where all the power becomes heat, here plenty of it becomes light, but most still becomes heat. While lights are WAY more efficient today than in the past, they are still mostly heat-generating units.

The very best LEDs that I've seen, the Philips Ultra Efficient, output about 1 watt as light (around 800 lumens) while drawing 4.5 watts of power. That's about half the draw of a normal LED bulb, which is usually about 1 watt of light for 8 watts drawn, and way better than incandescent, which is about 1 watt of light for 60 watts drawn for a normal bulb or 1 watt for 85-ish watts for a long-life bulb.

...but it is still only about 22% efficient. That means almost 80% of the energy going in goes back out as heat.

Well same shit in our monitors. They are NOT as efficient as those ultra efficient bulbs, so the light-to-heat ratio isn't going to be as good. So if your monitor is drawing 100 watts of power to crank out a nice bright HDR picture, it could easily be giving off 90 watts or more of heat. That needs to get dealt with, and the temperature needs to be kept low since high temps can wreck the TFTs that drive the LCD and OLED pixels.
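
To put rough numbers on it, here's a quick back-of-envelope sketch (Python). The efficacy figures are just taken from the bulb examples above, and the 100 W / 10% efficient monitor is a made-up illustration, not a measured spec:

```python
# Rough light-vs-heat split for a display: assume essentially all electrical
# power that doesn't leave as visible light ends up as heat in the chassis.
def heat_watts(power_draw_w: float, efficiency: float) -> float:
    """Watts of waste heat given total draw and light-output efficiency."""
    return power_draw_w * (1.0 - efficiency)

# Bulb examples from above (watts of light / watts drawn):
print(heat_watts(4.5, 1 / 4.5))    # Philips Ultra Efficient: ~3.5 W heat (~22% efficient)
print(heat_watts(8.0, 1 / 8.0))    # typical LED bulb: ~7 W heat (~12.5% efficient)
print(heat_watts(60.0, 1 / 60.0))  # incandescent: ~59 W heat (~1.7% efficient)

# Hypothetical HDR monitor drawing 100 W at an assumed 10% light efficiency:
print(heat_watts(100.0, 0.10))     # -> ~90 W of heat that has to be shed
```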

Thus I think fans are just going to be a thing in HDR displays. Eventually hopefully new technology will again make our light output more efficient and we'll get to the point where they release a minimal amount of heat, but until that time, gotta deal with it.
 


Why is DP 2.1 a big selling point for 4K 240Hz when HDMI can run it? Seems like some people are waiting for the Aorus just for DP 2.1
 
Why is DP 2.1 a big selling point for 4K 240Hz when HDMI can run it? Seems like some people are waiting for the Aorus just for DP 2.1

Because people hear "compressed" and get all worked up about that being evil. They are convinced that, despite legit scientific research from VESA, they'll be able to see a difference because clearly their eyes are special and it'll just suck.

Wait until they find out what gamma and the PQ curve are vs linear light space :D.
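
For anyone curious what that actually means: both gamma and PQ are just mappings from signal values to linear light, with PQ (SMPTE ST 2084) being the HDR curve. A minimal sketch in Python; the PQ constants are the standard ST 2084 values, while the 2.2 gamma and 100-nit SDR peak are assumptions for illustration:

```python
# Decode a normalized signal value (0..1) to linear light two ways:
# classic SDR power-law gamma vs the HDR PQ (SMPTE ST 2084) EOTF.

def gamma_to_nits(v: float, gamma: float = 2.2, peak_nits: float = 100.0) -> float:
    """Simple power-law gamma decode, scaled to an assumed SDR peak."""
    return peak_nits * (v ** gamma)

def pq_to_nits(v: float) -> float:
    """PQ EOTF (ST 2084): maps a 0..1 signal to 0..10,000 nits, perceptually spaced."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for v in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:.2f}: gamma -> {gamma_to_nits(v):7.2f} nits, PQ -> {pq_to_nits(v):8.1f} nits")
```

The point being that neither of these is "linear light" either, and nobody complains that gamma encoding ruins their picture.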
 
Because people hear "compressed" and get all worked up about that being evil. They are convinced that, despite legit scientific research from VESA, they'll be able to see a difference because clearly their eyes are special and it'll just suck.

Wait until they find out what gamma and the PQ curve are vs linear light space :D.

Yeah I personally couldn't give a crap about DP 1.4 or DP 2.1, I just want my OLED to be glossy and flat so it looks like MSI will be the winner for me since they will be first to market with what I want.
 
You need to keep the monitor cool to slow the wear of the OLEDs. I don't know if we've had any testing to see if active cooling is any better than passive cooling, though.
Monitors have a massive amount of surface area available at the back. Covering that with a finned aluminium heatsink instead of insulating plastic would keep it cool passively. Of course large sheets of aluminium cost money, so no one is going to make one as long as we continue to buy whirring plastic monitors.
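As a rough sanity check on that claim, here's a back-of-envelope free-convection estimate (Python). The panel size, heat-transfer coefficient, temperature rise, and 4x fin-area multiplier are all assumptions, and it ignores fin efficiency and enclosure effects, so treat it as ballpark only:

```python
# Crude natural-convection estimate for a finned monitor back:
# Q = h * A * dT, with h for free convection plus radiation very roughly 5-10 W/m^2.K.

def passive_watts(area_m2: float, h_w_per_m2k: float, delta_t_k: float) -> float:
    """Approximate heat shed passively from a surface delta_t_k above ambient."""
    return area_m2 * h_w_per_m2k * delta_t_k

flat_back = 0.71 * 0.42        # ~32" 16:9 panel footprint, ~0.30 m^2 (assumed)
finned_back = flat_back * 4    # fins multiplying effective area ~4x (assumed)

print(passive_watts(flat_back, 8, 20))    # bare flat back: ~48 W
print(passive_watts(finned_back, 8, 20))  # finned aluminium back: ~190 W
```

Under those assumptions a finned back could plausibly shed the 90-ish watts discussed above without a fan; whether a manufacturer wants to pay for and ship that much aluminium is another question.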
 
Because people hear "compressed" and get all worked up about that being evil. They are convinced that, despite legit scientific research from VESA, they'll be able to see a difference because clearly their eyes are special and it'll just suck.

Wait until they find out what gamma and the PQ curve are vs linear light space :D.
I don't think a single person here has said that DSC is visible. Seriously, this is supposed to be [H], and yet this thread is full of people advocating for older, slower tech. Next thing you know, people will start saying that we don't need faster CPUs because 400fps is already good enough.
 
I don't think a single person here has said that DSC is visible. Seriously, this is supposed to be [H], and yet this thread is full of people advocating for older, slower tech. Next thing you know, people will start saying that we don't need faster CPUs because 400fps is already good enough.

Because the older, slower tech can do 4K 240Hz just fine. Obviously once we get monitors that simply cannot run off DP 1.4 no matter how much DSC is being used then of course we are going to want DP 2.1 because we will actually NEED DP 2.1. And if every game in existence had an fps cap of 400fps then yeah, we actually don't need faster CPUs.
 
I don't think a single person here has said that DSC is visible. Seriously, this is supposed to be [H], and yet this thread is full of people advocating for older, slower tech. Next thing you know, people will start saying that we don't need faster CPUs because 400fps is already good enough.
It's not advocating, it's just being realistic about whether it matters. Nobody is saying, at least I don't think, "Fuck DP 2, I won't buy a product with it; we should stick with 1.4!" What we are saying is "Who really cares if these monitors have DP 2, they work fine with 1.4 and many GPUs don't support DP 2 yet anyhow." It just isn't a purchasing consideration for people at this point; they aren't saying no to it.

It isn't a situation where if you get one with DP 2 it'll last longer than one with DP 1.4. The link either can or cannot support the display resolution, and with DSC, it can. So if I were shopping for one of these monitors, I just wouldn't give any thought to whether it had DP 2 or not. I'd look at the design of the monitor, firmware, EOTF tracking, warranty support, etc., but not really care if it had 1.4 or 2.
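
The arithmetic behind "the link either can or cannot support it" is pretty simple. A rough sketch in Python; it ignores blanking/protocol overhead, and the 12 bpp DSC figure is just a typical compressed target rather than any specific monitor's setting:

```python
# Ballpark link-budget check: can a given DP link carry 4K 240Hz 10-bit RGB?
def stream_gbps(width: int, height: int, hz: int, bits_per_pixel: float) -> float:
    """Video payload in Gbit/s, ignoring blanking and protocol overhead."""
    return width * height * hz * bits_per_pixel / 1e9

uncompressed = stream_gbps(3840, 2160, 240, 30)   # 10-bit RGB = 30 bpp -> ~59.7 Gbps
with_dsc     = stream_gbps(3840, 2160, 240, 12)   # DSC down to ~12 bpp -> ~23.9 Gbps

dp14_hbr3   = 4 * 8.1 * 8 / 10     # 4 lanes x 8.1 Gbps, 8b/10b coding   -> ~25.9 Gbps payload
dp21_uhbr20 = 4 * 20 * 128 / 132   # 4 lanes x 20 Gbps, 128b/132b coding -> ~77.6 Gbps payload

print(f"uncompressed: {uncompressed:.1f} Gbps vs DP1.4 {dp14_hbr3:.1f} / DP2.1 {dp21_uhbr20:.1f}")
print(f"with DSC:     {with_dsc:.1f} Gbps, fits DP1.4 with room to spare")
```

So DP 1.4 with DSC has headroom for 4K 240Hz 10-bit, and DP 2.1 UHBR20 can carry it uncompressed; neither changes what the panel itself can show.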
 
MAYBE Gigabyte, since they are the only ones who confirmed DP 2.1 UHBR20, so that separates them from the rest; it's also glossy and flat. But then again, their previous monitors have had firmware issues, so who knows how the QD-OLED will turn out. HP also looks good, being glossy + flat + DP 2.1, but with unknown bandwidth.
All models look to be glossy/semi-glossy; it sucks that only one of the lineup features UHBR20.
 
All WOLEDs are matte, all QD-OLEDs are semi-glossy with the exception of the Samsung model, which seems to be matte (but since it's still not polarized, it will have the same issues with external light as other QD-OLEDs).
Thank god this year's opening went huge on QD-OLEDs. Normally the usual lineups are a mix of either W-QDs or W-FALDs; that was always the trend.
 
TFTCentral said there was upcoming news on glossy WOLED, but that's for the 2nd half. In combination with MLA and a proper pixel structure, I'm still interested in what the WOLEDs look like in the 2nd half.
 
WOLEDs look really good on specs; we'll have to see the tests, though, to compare them to other options.
Myself, I definitely prefer the anti-glare coating of WOLEDs to the QD-OLED coating.
 
Problem with WOLED is their color volume is crap compared to QD-OLED.
 
It's something I think we are going to have to get used to for HDR monitors. More brightness means more power, which means more heat. As we push peak brightness more and more, this is just going to be more true. For TVs, maybe they are big enough to have large passive heatsinks, but I think for monitors we just have to accept fans, just like we do for our PCs themselves. I don't love it, but I don't see an alternative.

Large televisions, at least, do include active cooling. The difference is people generally don't sit 2-3 feet away from a 65" TV, so you're less likely to hear or be bothered by the fans.

Monitors have a massive amount of surface area available at the back. Covering that with a finned aluminium heatsink instead of insulating plastic would keep it cool passively. Of course large sheets of aluminium cost money, so no one is going to make one as long as we continue to buy whirring plastic monitors.


This might be a positive to using a larger 42"+ screen decoupled from a desk and set back farther away as opposed to a smaller screen right up in front of your face.

I do agree that the slim design choice is overrated. I'd prefer function over form, with a thicker chassis that has grill vents. The ProArt displays, for example, have a slightly boxier, grille-vented housing plus an active cooling fan profile, and they don't suffer aggressive ABL. Some of the brightest 4K and 8K Samsung FALD LCD TVs that can do 2000 nits or so are slim chassis designs with no fan that I'm aware of, and they suffer aggressive ABL.

Avoiding heatsinks is probably a big cost-saving measure for the manufacturer. A big metal sheet costs more, makes it heavier to ship, and also breaks that slim Apple aesthetic (which, like I said, I'm not a fan of over function). Panasonic had, and maybe still has, a full heatsink on some of its OLED TVs in the UK according to HDTVTest, so it can be done.

If we do have to use fans, at least for max performance, I'd wish that they were modular so that you could hot-swap them for cleaning or replacement, slapping them back in like cartridges/external drive bays. Also that you could set your own performance profiles with different peak brightness, etc. depending on what you were doing at any given time, e.g. low, medium, and high profiles for heat, HDR range, and fan dBA.

.
 

https://www.youtube.com/watch?v=bxMgHbMhdw8

Not much exciting honestly. LCD panel manufacturers seem to be downsizing their efforts here, probably thinking that they won't be able to compete with OLEDs.

OLED is great for manufacturers because once that warranty expires and people start getting burn-in, consumers will upgrade. Why sell something that lasts forever? It's like how clothes washing machines are made using pot metal components now; 20 years ago the machines were made to last. Then manufacturers realized that if they make things designed to fail, they make more money. Although OLED isn't designed to fail, from their perspective it would be wise to never make MicroLED.
 
OLED is great for manufacturers because once that warranty expires and people start getting burn-in, consumers will upgrade. Why sell something that lasts forever? It's like how clothes washing machines are made using pot metal components now; 20 years ago the machines were made to last.

Unlike a clothes washing machine or a stove (or even a living room TV, traditionally at least), a lot of PC gamers would upgrade their display within or after 4 or 5 years for better performance advancements (higher HDR capability, VRR, higher Hz, higher rez, maybe even a different aspect ratio, curve, etc.). Lifespan-wise, with desktop-sized LCD screens in the past you could have kept the one you were replacing as a side monitor, or it would still be nice to be able to use older displays on other rigs or as hand-me-downs for other usage or people. Some OLEDs are still going strong after years now, though. The burn-in wear-evening buffer can probably last a long time if you use common-sense OLED usage practices, and especially longer if you use an OLED for dynamic media and gaming rather than static desktop/apps.

. .

I think in the years ahead screens are almost certainly going to go virtual on high-rez lightweight glasses anyway. And with that, not just virtual flat screens but the capability for games that are like scaled 3D/holographic scenes on tables and floors, in virtual break-aways of walls, via binocular-screen 3D like MR headsets are starting to try to do in baby steps currently. Displays, phones, and gaming genres have all kind of hit an iterative wall in many respects for a while now, with incremental upgrades (outside of maybe the appearance of HDR, depending how you want to consider that).

There are some XR glasses now, but they are low rez and in the early stages, so not a true desktop replacement yet. There are micro OLEDs in glasses (OLED on a silicon wafer); Apple's bulkier MR headset is using them. Apple had a roadmap for sunglass form-factor XR/MR glasses, but they pushed it back to 2027.

In my opinion they look like they are still in the early stages. Most are only 1080p and have other tradeoffs. They seem more useful for watching movies on a plane or something. They can be used otherwise but with that kind of rez it's hardly a desktop replacement for me. They do try to market them and some MR headsets as a desktop replacement already but I think it still has a ways to go.

The marketing makes it look like this but at 1080p, it really isn't like that (not yet):




Apple pushed their sunglass form-factor glasses back to 2027, so I'd think the XR tech should be a lot better by then. There were some interesting advancements in MR display tech shown at CES, but the more advanced stuff won't be in products for several years yet.

I think it is likely the way of the future overall as virtual screens, and also with binocular-view games like holographic characters in real space, plus virtual assistants, virtual parts and tools, and arrows and animations for how-to/instructional things, etc. They are still in the early stages, but they'll get there.

Afaik most of them are like a 1080p 55" screen, depending how you figure the viewing angle vs rez, PPD, etc., though some can do 120Hz at least. Some of the 3DoF stuff (pinning the screen in place) and the head, eye, and even hand tracking is lacking or non-existent. They will get better and higher rez in the years ahead.
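
If you want to put numbers on that "like a 1080p 55-inch screen" feel, pixels-per-degree is roughly horizontal pixels divided by the horizontal FOV the screen spans. A quick sketch in Python; the 45 degree glasses FOV and the 32" monitor at 0.8 m are assumptions for illustration, not specs of any particular product:

```python
import math

def ppd_from_fov(horizontal_pixels: int, fov_degrees: float) -> float:
    """Average pixels per degree for a display spanning a given horizontal FOV."""
    return horizontal_pixels / fov_degrees

def monitor_fov_degrees(width_m: float, distance_m: float) -> float:
    """Horizontal FOV a flat monitor subtends at a given viewing distance."""
    return math.degrees(2 * math.atan((width_m / 2) / distance_m))

# Hypothetical XR glasses: 1920 pixels spread over an assumed ~45 degree FOV
print(ppd_from_fov(1920, 45))            # ~43 PPD

# A 32" 16:9 4K monitor (~0.71 m wide) viewed from an assumed ~0.8 m
fov = monitor_fov_degrees(0.71, 0.8)
print(fov, ppd_from_fov(3840, fov))      # ~48 degrees, ~80 PPD
```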
 
I think in the years ahead screens are almost certainly going to go virtual on high-rez lightweight glasses anyway. And with that, not just virtual flat screens but the capability for games that are like scaled 3D/holographic scenes on tables and floors, in virtual break-aways of walls, via binocular-screen 3D like MR headsets are starting to try to do in baby steps currently. Displays, phones, and gaming genres have all kind of hit an iterative wall in many respects for a while now, with incremental upgrades (outside of maybe the appearance of HDR, depending how you want to consider that).

There are some XR glasses now, but they are low rez and in the early stages, so not a true desktop replacement yet. There are micro OLEDs in glasses (OLED on a silicon wafer); Apple's bulkier MR headset is using them. Apple had a roadmap for sunglass form-factor XR/MR glasses, but they pushed it back to 2027.
Doubt. If there's one thing the recent VR market has taught us, it's that people do not like wearing things on their face for long periods of time, even if they are not much larger than glasses. I wear prescription glasses, so I'm used to it, but I still don't want to add AR capabilities to my glasses.
 
Doubt. If there's one thing the recent VR market has taught us, it's that people do not like wearing things on their face for long periods of time, even if they are not much larger than glasses. I wear prescription glasses, so I'm used to it, but I still don't want to add AR capabilities to my glasses.

People said the same thing about idiots staring at phones in their hands and people texting by pressing phone buttons, or the people with a BT earpiece hanging on their head, and now earbuds are ubiquitous. 😎 Time will tell, and people will adopt, adapt, or be left behind if something takes off.

VR is a big sweaty, relatively heavy enclosed shoebox on your head and is like wearing a blind mask. That's a big difference.
 
People said the same thing about idiots staring at phones in their hands and people texting by pressing phone buttons, or the people with a BT earpiece hanging on their head, and now earbuds are ubiquitous. 😎 Time will tell, and people will adopt, adapt, or be left behind if something takes off.

VR is a big sweaty, relatively heavy enclosed shoebox on your head and is like wearing a blind mask. That's a big difference.

So why hasn't Magic Leap (which Google backed) or Microsoft's HoloLens taken off yet then? You can talk about how people said things were stupid and then they took off, but the exact opposite has happened, where people said "this is the next big thing" and it was a total flop.
 
Judging by other things, I'd say because Apple hasn't streamlined it enough yet and made it sexy enough.

That's kind of like comparing one of those '80s antenna phones to today's smartphones, or giant headphone cans with antennas on them, or a cassette tape Walkman. Things can take a long time for better tech to arrive that can be incorporated into the device genre, and at relatively affordable prices.

Micro OLEDs, eye tracking, gestures, etc., all in a much smaller, near-sunglass form factor, are starting to happen. They are a little clunky at the moment and only 1080p 120Hz. 3DoF, eye tracking, and hand tracking are actually lacking or nonexistent in most of the current models too, but it'll get better. Apple decided not to jump in until 2027, according to reports.
 
So why hasn't Magic Leap (which Google backed) or Microsoft's HoloLens taken off yet then? You can talk about how people said things were stupid and then they took off, but the exact opposite has happened, where people said "this is the next big thing" and it was a total flop.
Because the technology is still too expensive. Both are more than $3,000. I'll give them the benefit of the doubt until they can lower the cost. I personally don't see the benefit such a device would add to my life at any price point, though.
 
New firmware has been released. The changelog is not that clear to me.
https://www.acer.com/us-en/support/product-support/X32FP/downloads

1. Fix ADW customer complaint issue
2. Change OD to be adjustable when FreeSync is on
3. Change HDR behavior
4. Fix customer complaint issue: fuzzy screen in the PC game Witcher 3

I am trying to understand what they fixed in point 1.
Point 2: OD was already adjustable when FreeSync is on, so I don't understand this point.
Point 3: I see no change in HDR but will try to understand it.
Point 4 is pretty self-explanatory, even if I had no problem in TW3.
 
OK, I found some photos from the old firmware update in June 2023 and it's V02.00.015.
I flashed the firmware that I downloaded today from the Acer website, which reports a new release date of 2024/01/04,
but it's the same version, V02.00.015.

I think that they only updated the release date on the website but the firmware is the same. -_-
 
New firmware has been released. The changelog is not that clear to me.
https://www.acer.com/us-en/support/product-support/X32FP/downloads

1. Fix ADW customer complaint issue
2. Change OD to be adjustable when FreeSync is on
3. Change HDR behavior
4. Fix customer complaint issue: fuzzy screen in the PC game Witcher 3

I am trying to understand what they fixed in point 1.
Point 2: OD was already adjustable when FreeSync is on, so I don't understand this point.
Point 3: I see no change in HDR but will try to understand it.
Point 4 is pretty self-explanatory, even if I had no problem in TW3.
I was excited to see more X32FP updates, but it looks like this is the same firmware that came out last year (v02.00.015).

All they've done is add some PDFs to the .zip file with a guide to flashing and re-upload it, hence the date saying 2024 (correct me if I'm wrong). Everything else looks the same.

A bit worried Acer might abandon it this year while we know many things can still be improved, and it's not a cheap monitor either. Would be nice to see some improvements to the DDC input options for switching inputs that I mentioned here before. Maybe I'll try to reach out to their support team? Anyway, hopefully there are more improvements to come.
 
I was excited to see more X32FP updates, but it looks like this is the same firmware that came out last year (v02.00.015).

All they've done is add some PDFs to the .zip file with a guide to flashing and re-upload it, hence the date saying 2024 (correct me if I'm wrong). Everything else looks the same.

A bit worried Acer might abandon it this year while we know many things can still be improved, and it's not a cheap monitor either. Would be nice to see some improvements to the DDC input options for switching inputs that I mentioned here before. Maybe I'll try to reach out to their support team? Anyway, hopefully there are more improvements to come.

Writing to support is a waste of time; they answer with pre-built answers.
You need to write to the CEO and to the display boss if you want to get an answer.

Search the internet for "Acer leadership".
Take their names, then write to [email protected]
 