Samsung Odyssey Neo G9 57" 7680x2160 super ultrawide (mini-LED)

So I got 5120x2160 working on the monitor. It works well, but I feel like it's too much space sacrificed on the sides.

Is there a middle ground between 5120 and the native 7680 that I can try?

I thought it did not allow custom resolutions because of DSC, or are you perhaps using an AMD GPU?
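If custom resolutions do work on your setup, a quick way to shortlist in-between sizes is to keep the 2160 pixel height and step the width between 5120 and the native 7680. A rough sketch of that arithmetic is below; the candidate widths are just examples I picked, not modes the monitor is guaranteed to accept, and whether they stick will depend on the driver, CRU and DSC behavior:

```python
# Rough helper to shortlist custom-resolution widths between 5120x2160 and the
# native 7680x2160. The candidate widths are arbitrary examples, not modes the
# Neo G9 57" is guaranteed to accept; the driver/CRU and DSC decide in practice.

NATIVE_W, HEIGHT = 7680, 2160

candidates = [5760, 6144, 6400, 6720, 6880, 7040]  # assumed examples only

for w in candidates:
    aspect = w / HEIGHT
    print(f"{w}x{HEIGHT}  aspect {aspect:.2f}:1  "
          f"({w / NATIVE_W:.0%} of native width, divisible by 8: {w % 8 == 0})")
```

Widths divisible by 8 are commonly recommended for custom modes, which is why the sketch flags that, but that is a rule of thumb rather than a guarantee.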
 
From info like that, and from watching some of the clipping videos HDTVTest has published on YouTube, my understanding is that HDR metadata gets sent to your screen.
HDR10 static metadata is reliably sent only by Blu-ray players. I'm not aware of any other sources which can be trusted to always send HDR10 static metadata. LG TVs will use the 4000-nit curve for these sources, wasting most of the peak brightness. Most Dolby Vision sources correctly send dynamic metadata. HGIG mode is always the correct mode for all content without exception. It's how the content was mastered, and how the image is supposed to look, up to whatever your display's peak brightness is. Any kind of tone mapping is a compromise and inaccurate, even if it subjectively looks better.
 
Last edited:
HDR10 static metadata is reliably sent only by Blu-ray players. I'm not aware of any other sources which can be trusted to always send HDR10 static metadata. LG TVs will use the 4000-nit curve for these sources, wasting most of the peak brightness. Most Dolby Vision sources correctly send dynamic metadata. HGIG mode is always the correct mode for all content without exception. It's how the content was mastered, and how the image is supposed to look, up to whatever your display's peak brightness is. Any kind of tone mapping is a compromise and inaccurate, even if it subjectively looks better.

Thanks. Well, what I was going by was that the games, according to the HDR analysis tool in Reshade, were trying to push 10,000 nits as the max CLL for static tone mapping, and the screen's peak brightness wasn't being correctly assigned to the game via its graphics settings menus either. So I was calling what the game's HDR output was attempting to send "metadata" (perhaps erroneously, considering your reply). Maybe I should just say the game's HDR output, or the game's HDR curve or profile. HGiG will certainly look best for games that aren't delivered with broken HDR. The YouTube channel's host showed in his videos that some games nevertheless push a 10,000-nit max CLL for static tone mapping and some push 4,000, and that you can determine which by running that HDR analysis tool, kind of like an fps meter with a readout in the upper-left corner. I think God of War and Horizon Zero Dawn were both showing 10,000, for example.

In the Reshade filter, he set the Max CLL for Static Tone Mapping value to what the analysis tool showed for the game (10,000 or 4,000), then set the Target Peak Luminance to that of his screen. In HGiG mode, of course.

Without doing that, the in-game graphics settings sliders (e.g. the peak brightness setting) in broken HDR games were not working correctly, so the HDR in games like those is broken by default, with a 10,000-nit or 4,000-nit curve. Leaving such broken HDR games in their default HDR behavior results in the HDR range being stretched upward and squashed downward from the middle on the analysis graph. So things like bright skies with clouds in those games are clipped to white blobs without detail on the player's screen, and the detail in darks is muddy and much less discernible. After applying the Reshade HDR filter with the proper values entered, it shows a lot more detail on the top end and in the darks rather than hard-clipping the top and muddying the lows (though not as perfectly as if the devs had done a better job with the HDR in the first place, like some other games that have good HDR).
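To make the relationship between the Max CLL for Static Tone Mapping and the Target Peak Luminance concrete, here is a toy sketch of a static tone-mapping curve in linear nits. It is not Lilium's actual Reshade shader math, just my own illustration of why both numbers matter: the source max (10,000 or 4,000 nits) defines where the curve tops out, and the display peak (say 800 nits) defines where highlights have to be rolled off instead of hard-clipped.

```python
# Toy static tone-mapping sketch (NOT the Lilium Reshade shader math).
# Below a knee the signal passes through 1:1; above it, the rest of the range
# up to the source's max CLL is rolled off smoothly into the headroom left
# below the display's peak, instead of hard-clipping at the peak.

def tone_map(nits, source_max=10000.0, display_peak=800.0, knee=0.5):
    """Map scene luminance (nits) to display luminance (nits)."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits                              # pass low levels through untouched
    x = nits - knee_nits                         # how far above the knee we are
    x_max = source_max - knee_nits               # worst case the source can send
    y_max = display_peak - knee_nits             # headroom left on the display
    t = min(x / x_max, 1.0)                      # 0..1 over the compressible range
    return knee_nits + y_max * t * (2.0 - t)     # simple smooth ease-out, no hard clip

# Why the source max matters: the same 1,500-nit highlight lands differently
# depending on whether the game's curve tops out at 10,000 or 4,000 nits.
for source_max in (10000.0, 4000.0):
    out = tone_map(1500.0, source_max=source_max)
    print(f"1500 nits in, source max {source_max:>7.0f} -> {out:6.1f} nits out")
```

Real shaders do this in PQ space with better-behaved curves, but even this crude version shows that entering the wrong source max shifts where every highlight lands.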



For people unaware of the Reshade EndlesslyFlowering (Lilium) HDR shader method, or unwilling or unable to use it for whatever reason, or running consoles - if they have a broken HDR game, that game is probably already suffering the cons of running Dynamic Tone Mapping anyway, like clipped highs and lost detail, so I don't blame people for trying DTM out of desperation. Since the Reshade method is available and I'm aware of it, I'd personally try that as a workaround for such broken HDR games, since by comparison it avoids clipping for the most part and delivers more detail on the top and bottom end than the default broken HDR in those games.

It's a shame that games are released in such a state, but it is what it is. I usually try to stick with games that have good HDR, but some big titles are broken, so it's frustrating to a lot of people.
 
Yes, fixing it at the source with Reshade is the correct approach. The display should be left in HGIG mode whenever possible.

Actually, even if the game outputs up to 10000 nits, it should still look correct in HGIG mode on an 800-nit display. You should lose detail only in the extreme highlights. When it looks broken, the issue is the rendering is incorrect for the mid-range, not the peak brightness target.
 
Yes, fixing it at the source with Reshade is the correct approach. The display should be left in HGIG mode whenever possible.

Actually, even if the game outputs up to 10000 nits, it should still look correct in HGIG mode on an 800-nit display. You should lose detail only in the extreme highlights. When it looks broken, the issue is the rendering is incorrect for the mid-range, not the peak brightness target.

It would be great if that were true, but the games really are screwed up in their HDR output between those two settings somehow, and as shown in my previous replies in this thread and the YouTube HDR video of Cyberpunk, they clip the highs and muddy the lows. Other games that don't have "broken HDR" do HDR great after applying their regular in-game HDR settings. (E.g. Elden Ring's HDR settings have a middle brightness, peak brightness, and saturation slider that all seem to function properly.)

I understand what you mean though: if the 4K Blade Runner 10,000-nit HDR disc/rip were sent, the TV should be able to static tone map it down to 800 nits on an 800-nit display, according to how LG, for example, set it up in the firmware of an LG OLED, breaking the range down and compressing it intelligently. These particular HDR games are screwed up though.

The Max CLL for Static Tone Mapping is outlined as necessary to set in that plasmatvforgaming channel's instructions, but the other side of it is that the Target Peak Luminance of the display needs to be set for the filter too. There are some other tweaks you can do for the mid range in Reshade to taste per game if desired, but I think those two settings are the most important for preventing the clipping to white of very bright skies and their clouds, bright light sources and nearby areas, etc., and the lost detail in the darks.

The before/after toggling of the Reshade filter via hotkey in his videos shows how bad the default broken HDR games look and how much better they look with the filter applied. (Obviously he changed his camera settings in order to show it in SDR; it would look different overall in actual HDR.)


Clipped detail in the sky


horizon.zero.dawn.HDR.plasmatvforgaming_before-1.png



After he set the proper range (way better, but it would still have even more detail if the devs hadn't screwed it up so badly)


horizon.zero.dawn.HDR.plasmatvforgaming_after_1.png

He also showed some before/after of the filter in a Cyberpunk video that was published in HDR.


I don't know how much different the result would be if you left the Max CLL for Static Tone Mapping at 10,000 and let the TV attempt the compression after just setting the Target Peak Luminance in the Reshade filter to that of your screen. It could be interesting to see the difference on that, if any. Both of the aforementioned settings may need to be set in order to get the right curve/range and the best end result in detail via Reshade on the broken HDR games, though (that's the way to do it according to the instructions anyway).
 
Last edited:
Decided to give the 57" another try after having returned it a few months ago (as mentioned in this thread), and maybe they have improved things, as I find the overall experience better this time around. I don't find the grain as noticeable as last time, and the overall stability of the monitor also seems better. Of course, this time around I know some of the workarounds needed to get it to work (like connecting it twice to the same PC in order to work around some DP/HDMI limitations, etc.). This unit is going back no matter what, as Samsung's horrible QC showed its face again with 2 out of 4 screw holes for the stand broken on the back, but hopefully that is something that can get sorted with a new unit (perhaps a few of them, based on experience).

From a strict PQ perspective OLED is still better in most regards, including actual PQ as well as being glossy, having almost perfect viewing angles, and being flat or almost flat (as they don't need to be curved), but there are definitely some practical advantages to the Neo G9 57" when you, like me, do perhaps 95% work and 5% gaming. Of course, there are also blooming, haloing, etc., but that is to be expected with only 2,000 dimming zones. With the new 32" panels coming out now with OLED, it will be interesting to see if the Neo G9 57" is short-lived, as I imagine there will soon be an OLED G9 57", but only time will tell.
 
With the new 32" panels coming out now with OLED, it will be interesting to see if the Neo G9 57" is short-lived, as I imagine there will soon be an OLED G9 57", but only time will tell.
My fear is that a future OLED G9 57" will use the smart TV crap from the current OLED G9. The current one, for example, is missing the 21:9 + 11:9 PbP split, and every setting is buried deep in the smart TV options.

I don't think the Neo G9 57" is necessarily short-lived. Samsung still sells the CRG9 despite having the G9, Neo G9 and OLED G9. People are still buying those because they get cheaper each year.

I'll probably wait for those 45" 5120x2160 models or more significant sales on the G95NC.
 
Decided to buy one of these and one of the 32-inch Alienware 240 Hz panels that just came out. Both should be delivered next week, and I'm curious to see if the praise of OLED over the last few years can really compete with the brightness of mini-LED and how much I have liked it lately.

Absolutely dumb that this thing can only run at 120 Hz with my 4090 though.
 
Decided to buy one of these and one of the 32-inch Alienware 240 Hz panels that just came out. Both should be delivered next week, and I'm curious to see if the praise of OLED over the last few years can really compete with the brightness of mini-LED and how much I have liked it lately.

Absolutely dumb that this thing can only run at 120 Hz with my 4090 though.
Really interested to hear your thoughts on the comparison! This monitor sounds like a work productivity dream, and I had the G8 NEO, so despite not liking AG coating, it was tolerable on the G8 for work productivity. The lack of DP 2.1 on the 4090 and horror stories of people's failing G9-57s have scared me away for now.
 
Decided to buy one of these and one of the 32-inch Alienware 240 Hz panels that just came out. Both should be delivered next week, and I'm curious to see if the praise of OLED over the last few years can really compete with the brightness of mini-LED and how much I have liked it lately.

Absolutely dumb that this thing can only run at 120 Hz with my 4090 though.

I'd recommend controlled (dim) lighting conditions in order for its HDR to scale properly to your eyes/brain. If you put it in a bright room, it would need 2x or more brightness just to reach the same SDR brightness levels to your eyes. (A matte screen's abraded layer also gets "activated" by bright enough ambient lighting and raises blacks to grey-blacks.)

That's because we see everything relatively, and our pupils dilate and adjust to different conditions as part of that too. So putting screens right next to each other with one brighter than the other would similarly skew your impression. You'd have to try them separately (on/off, with time to adjust to each).

Cameras also have their own biases and adjust things relative to the brightest thing in a captured image, so those pictures you see online of monitors right next to each other in the same room are pretty meaningless. You'd have to take separate pictures with the camera settings adjusted to get as close as you could to what you see in real life. Most pictures and videos are still not posted in HDR either. Then there is the fact that picture compression can do things, online forums can alter things, and everyone viewing the picture on their own device has their own device limitations, calibrations/tweaks, room lighting, etc.
 
Decided to buy one of these and one of the 32-inch Alienware 240 Hz panels that just came out. Both should be delivered next week, and I'm curious to see if the praise of OLED over the last few years can really compete with the brightness of mini-LED and how much I have liked it lately.

Absolutely dumb that this thing can only run at 120 Hz with my 4090 though.
As has been mentioned over and over (and probably what you meant), it can run at 240 Hz, just not at native resolution.

Having compared the Neo G9 57" to both the OLED C2/C3 and the OLED G9, my conclusion is that... it is just as you would expect. Which is the best choice mainly depends on what you prefer and what you prioritize.
 
Last edited:
Biggest difference is that the 57" will look like washed out trash in comparison due to the huge color volume difference. It also has horrible vertical viewing angles so if you use dark mode in browser or do work in an IDE, it's a horrible eye sore with the bottom or top washing out and losing gamma. Also, overall uniformity is really bad like all of these super curved Samsung VA's.

Besides that the next biggest thing is the 120hz vs 240hz which is a huge jump in motion clarity from LCD to OLED.

When I had it I used 16:9 and 21:9 res to accommodate games based on performance and rarely used the full 32:9. I was hoping it could be a nice jack of all trades type display but the panel quality itself is just really poor especially if you've used a nice IPS or OLED before.

I would grab it at $1299 since for productivity there really is nothing like it but when it comes to gaming, image quality > aspect ratio all day and that's where the OLED blows it out the water.

Edit: its HDR performance is both better and worse than my Neo G8's. It is much brighter in larger window sizes but the colors are worse.
 
Last edited:
Biggest difference is that the 57" will look like washed out trash in comparison due to the huge color volume difference. It also has horrible vertical viewing angles so if you use dark mode in browser or do work in an IDE, it's a horrible eye sore with the bottom or top washing out and losing gamma. Also, overall uniformity is really bad like all of these super curved Samsung VA's.

Besides that the next biggest thing is the 120hz vs 240hz which is a huge jump in motion clarity from LCD to OLED.

When I had it I used 16:9 and 21:9 res to accommodate games based on performance and rarely used the full 32:9. I was hoping it could be a nice jack of all trades type display but the panel quality itself is just really poor especially if you've used a nice IPS or OLED before.

I would grab it at $1299 since for productivity there really is nothing like it but when it comes to gaming, image quality > aspect ratio all day and that's where the OLED blows it out the water.

Edit: its HDR performance is both better and worse than my Neo G8's. It is much brighter in larger window sizes but the colors are worse.

Ironically, although a curved screen could theoretically have practically all of its pixels on axis and pointed directly at you compared to a flat screen, the uniformity, and also the geometric distortion, is usually worse on curved screens. That's because none of these curved screens are designed so that you'd be sitting at the center of the curvature, especially ultrawides, where if you sat at the center of curvature you'd shrink the screen height, to your perspective, to a short narrow belt.

If you could sit at the center of curvature, all of the pixels would instead be pointed directly at you, "on axis", but you really can't with most of the curved screens.

1000R(adius) = 1000mm = ~40" to the center of curvature. You could probably do that on a 55" 16:9 Ark if you decoupled it from your desk on its own stand and set it that far away, but for most curved screens it's not really doable. 1800R(adius) = 1800mm = ~71 inches, so even worse. 1800R is not really curved much at all in the first place though.


Think of the pixels on the screen like small laser pointers. In a room with a fog machine you'd see the shafts of laser light. When sitting at the center point of the curve, all of the lasers would be on axis and pointed directly at you, so for the most part you'd be seeing the points of light. The nearer you sat than that, the more sidelong you'd see the shafts of the light beams. From your nearer position, the farther the pixels were from the center of the screen, the more of the side of the laser beams you'd see. In a graduated fashion, the pixels would be more and more off axis the farther they were from the center and towards the outer ends of the screen.

This will make the screen distorted. Practically all ultrawides and super ultrawides are designed without an aggressive enough curve and/or a long enough semicircle-segment screen length to let you sit at the center point of their curve without making the screen look short and belt-like. (Outside of maybe the adjustable-curve monitor that could do up to 750R ~> 30" center point, but I didn't like the overall specs of that screen.) So practically everyone is sitting with the center point way behind them with current curved screens.
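To put rough numbers on the "laser pointer" picture, here is a small geometry sketch (my own back-of-envelope math, not manufacturer data): it treats the screen as an arc of radius R, puts the viewer on the axis at some distance from the middle of the screen, and reports how far off-axis the edge pixels end up. The half-width used is an assumed approximation for a 57" 32:9 panel.

```python
import math

def off_axis_angle_deg(radius_mm, arc_angle_deg, view_dist_mm):
    """Angle between a pixel's surface normal and the line to the viewer.

    The screen is an arc of radius `radius_mm` centred on the origin, with its
    middle pixel at (0, -radius). The viewer sits on the axis `view_dist_mm`
    in front of that middle pixel. `arc_angle_deg` is how far along the arc
    (from the middle) the pixel of interest sits. 0 deg = perfectly on-axis.
    """
    a = math.radians(arc_angle_deg)
    px, py = radius_mm * math.sin(a), -radius_mm * math.cos(a)   # pixel position
    vx, vy = 0.0, -(radius_mm - view_dist_mm)                    # viewer position
    nx, ny = -math.sin(a), math.cos(a)                           # pixel normal (towards centre)
    dx, dy = vx - px, vy - py                                    # pixel -> viewer
    cos_t = (nx * dx + ny * dy) / math.hypot(dx, dy)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

R = 1000.0                 # 1000R curve (mm)
half_width_mm = 697.0      # assumed: roughly half the arc width of a 57" 32:9 panel
edge_deg = math.degrees(half_width_mm / R)   # arc angle of the edge pixel

for dist in (600, 800, 1000):   # viewing distances in mm (1000 = at the centre of curvature)
    print(f"view distance {dist:4d} mm: edge pixel is "
          f"{off_axis_angle_deg(R, edge_deg, dist):4.1f} deg off-axis")
```

At the center of curvature the edge pixel comes out at 0 deg off-axis; the closer you sit inside the radius, the larger that angle gets, which is where the VA viewing-angle and uniformity complaints come from.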


724204_monitor-curve-radus-_small-schematic_nearer-A_1.png



The Ark is a big 16:9 so it is actually tall enough that you could mount it on a floor TV stand and get enough distance to be near the center point without the screen being shrunk to a narrow belt to your perspective. Its resolution is too low for its size imo though. Perhaps some year we'll get an 8K available in that format, for higher PPD and quads of something near 4K of real estate (maybe a little less if scaled slightly for clarity/visibility).

Theoretically, they could design a UW or super-UW screen better so that you still get immersion on the sides when sitting at the center point of the curve. For example, the red line being a 120-degree arc of a semicircle in the image below. The aggressiveness of the curve/center point would have to be appropriate in order to provide enough height to the screen though, whatever the screen's actual height dimension might be.



monitor_curved.120deg.red.line_1.png
 
Last edited:
Biggest difference is that the 57" will look like washed out trash in comparison due to the huge color volume difference. It also has horrible vertical viewing angles so if you use dark mode in browser or do work in an IDE, it's a horrible eye sore with the bottom or top washing out and losing gamma. Also, overall uniformity is really bad like all of these super curved Samsung VA's.

Besides that the next biggest thing is the 120hz vs 240hz which is a huge jump in motion clarity from LCD to OLED.

When I had it I used 16:9 and 21:9 res to accommodate games based on performance and rarely used the full 32:9. I was hoping it could be a nice jack of all trades type display but the panel quality itself is just really poor especially if you've used a nice IPS or OLED before.

I would grab it at $1299 since for productivity there really is nothing like it but when it comes to gaming, image quality > aspect ratio all day and that's where the OLED blows it out the water.

Edit: its HDR performance is both better and worse than my Neo G8's. It is much brighter in larger window sizes but the colors are worse.
Agreed. Dark mode on the Neo G9 57" is a no-go; it is like watching one of those horrible monitors we had in the 90s. It is funny that a massive curve only seems to be important to Samsung when it is a monitor with really poor viewing angles. For mainly entertainment, I would never consider this monitor. For mostly work and occasional entertainment, things might change though.
 
Agreed. Dark mode on the Neo G9 57" is a no-go; it is like watching one of those horrible monitors we had in the 90s. It is funny that a massive curve only seems to be important to Samsung when it is a monitor with really poor viewing angles. For mainly entertainment, I would never consider this monitor. For mostly work and occasional entertainment, things might change though.
This is why I bought the 49" OLED G9 instead. It has an 1800R curve instead of that insane wrap around curve. It's curved enough to be immersive without being absolutely ridiculous.
 
This is why I bought the 49" OLED G9 instead. It has an 1800R curve instead of that insane wrap around curve. It's curved enough to be immersive without being absolutely ridiculous.
The only usage where I could personally see something with such a massive 1000R curve being preferable is some kind of simulator, like a space sim, racing sim, etc. This goes for productivity too, where many people still seem to think that the curve makes the edges easier to focus on, completely forgetting that the curve also pushes the edges forward even though it makes the screen a bit less wide.
 
For me, it would be "all" or nothing. If I can't sit at the center of curvature, then the pixels farther from center will be off axis, causing non-uniformity and geometric distortion.

700R(adius) = 700mm = ~ 28" view distance

800R(adius) = 800mm = ~ 32" view distance

1000R(adius) = 1000mm = ~ 40" view distance


the curve also pushes the edges forward even though it makes the screen a bit less wide.

If you were able to sit at the center of curvature, then all of the points on the screen would be the exact same distance away from you. That's because in that scenario you are sitting at the center of the circle that the screen is an arc of, one radius away from every point on it. That's what curves could do if they were done in the way I consider the right way to do it.

Unfortunately, practically none of the curved screens to date have been designed to be used from the center of curvature (you maybe could on the 55" 16:9 Ark at ~40" away if you decoupled it from the desk, at around 62 PPD though, since it has the height to spare).



monitor_curved.screen_center.of.circle.curvature_facing-screen_1.png


.
When you sit at the center of curvature, all of the pixels are equidistant from you and pointed directly at you, like the bottom dot.
When you sit nearer, like the top dot location, the pixels farther from center are more and more off axis from you the farther from center they are, pointing at a location far behind you.
902903_reflection-light_facing-monitor_1.gif


.

The top translucent example is sitting far inside, away from the solid example at the center of curvature.
1000R_sitting.far.inside.of.focal.point_1.png


.
Sitting near to and at the center of curvature
1000R.Curve.Schematic_1.png


.
theoretical screen with added degrees of width for greater immersion even when sitting at center of curvature (red line).
1000R.Curve_120deg-curve_A.png


.
theoretical screen with a more encompassing wide viewing angle (red line)

1000R.Curve_180deg-curve_A.png


.
 
The Xeneon Flex was one of the few curved gaming screens with a higher possible curvature, but it had some other tradeoffs, like being 1440p and having a high price, among other things.
It could do up to 800R(adius) = 800mm = ~32" view distance to the center of curvature. 700mm (28") to 800mm (32") would allow people with a deep enough desk to sit at or near the center of curvature. I'd prefer the screens were taller and even wider/longer though.

edit: the LG 45GR85QE-B was also 800R (800mm, ~32"), but it has a fixed curve where the Xeneon was adjustable all the way to flat. Both screens are OLED, 3440x1440, 240 Hz.

from: https://videocardz.com/press-release/corsair-xeneon-flex-is-the-first-bendable-oled-gaming-monitor
XENEON-FLX-2.jpg






It looks distorted from where the picture was taken with the camera, but it wouldn't be from near where she is standing. This was a prototype OLED at CES years ago. An 8K version of this kind of thing, even if not quite as tall as that one, could be great.
646907_417264_IlB5Ect.png



I'm looking forward to really high-res XR/MR glasses (separate screens per eye providing binocular 3D "holographic" elements on screen) within the next 10 years, for things like this where the virtual screen curvature can be equidistant from you. But also for large virtual screens floor to ceiling, virtual break-aways of walls and into "space" to game/media worlds, characters and things moving around in real space, and scaled "holographic" fields of games on tables and floors, etc. Imo phones and games have been at an iterative wall for years now.

XR.glasses.sunglass.style.form.factor_1.jpg

*marketing image. Most XR glasses are only 1080p and somewhat clunky in usability so far, but they should improve over the coming years.
 
Last edited:
Her neck position, while she is still not able to actually focus on the edges of the screen, is the main reason why I don't see a massive curve as good in the long run for work etc., i.e. when you would actually place something near the edges that you are supposed to look at more than occasionally. For gaming, where "immersion" is a thing, it all changes of course. I should also add that if you, for whatever reason, cannot use virtual desktops, things might change even for productivity.

This is of course a mix of personal opinions and experience over many years. If we actually had desks to match our curved monitors, and thus started turning our chairs rather than our heads, things might also change. But at least for me, that has never happened, and I often find myself sitting with my neck like the woman in the image above, and have been for way too long.
 
Her neck position, while she is still not able to actually focus on the edges of the screen, is the main reason why I don't see a massive curve as good in the long run for work etc., i.e. when you would actually place something near the edges that you are supposed to look at more than occasionally. For gaming, where "immersion" is a thing, it all changes of course. I should also add that if you, for whatever reason, cannot use virtual desktops, things might change even for productivity.

This is of course a mix of personal opinions and experience over many years. If we actually had desks to match our curved monitors, and thus started turning our chairs rather than our heads, things might also change. But at least for me, that has never happened, and I often find myself sitting with my neck like the woman in the image above, and have been for way too long.

The central viewing angle is around 50 to 60 degrees, so you can see everything within that, more or less, with subtle eye movement. Any screen length out to the sides beyond that would be more like multi-monitor usage where you tilt the side screens toward you, but instead it would be one uniform screen with all of the pixels pointed at you and with no bezels in the middle, of course.

When you are using the desktop/apps, just like in a multi-monitor array, you are usually staring at a window for a much longer period of time, so the ergonomics of looking aside at a slight angle aren't as bad as if you were rapidly ping-ponging back and forth while playing a game, though that could depend on the individual of course. Like you said, depending on the game, the sides for immersion wouldn't necessarily be looked at directly and could remain in your peripheral vision.

. . . . . . .

That said, you could still sit at the center point of curvature of a curved screen without pushing much of the screen outside of your central viewing angle, if the curvature were appropriate for doing that.

The central human viewing angle is the green 30 + 30 deg = 60 deg.


............
CORSAIR-XENEON-overhead.facing.downward_1.jpg

Field-of-view-comparisons-The-field-of-vision-of-a-human-showing-the-binocular.png



. . . . ... . . . . . . . . . . . . . . . . . ..
horizontal.field.of.view_55inch.curved.screen_A.png





If we actually had desks to match our curved monitors, and thus started turning our chairs rather than our heads

I do use a separate kidney-shaped desk on caster wheels, which looks kind of like that graph above but with a round cutout for the seating area in the middle bottom. It's decoupled from my monitor array so I can move it forward/back, or push it right up against the monitors to stow it when not in use. However, I can also rotate it slightly toward one side monitor + the middle one, or the far side monitor + the middle one (P L P setup currently), if I want to focus on something for a longer time once in a while. So it is possible to do. Usually I don't mind turning my head a little to the side monitor on one side or the other though, for static material, something I'm looking at for more than a short glance. That is, if sitting far enough back from the array that it's not a huge bend to look.

But that is in regard to a more encompassing screen length far into your periphery, or an exceptionally wide triple array of large monitors/gaming TVs.


As shown in the first three pictures at the beginning of this reply, a shorter curved screen can fit closer to within the 60 deg viewing angle while you still sit at the center point of the curvature. Anything beyond that could be immersion in games. On a longer screen at that same curvature, window management software could keep most or all of your app windows within your central viewing angle, or even swap them dynamically using Stream Deck buttons, etc., so what you are focusing on is in your central ~60 deg view.

This example also shows the solid field being your central viewing angle, at ~ 40" view distance for a 1000R curve. It would be 28" to 32" view distance for 700R to 800R curves.
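As a quick sanity check of those numbers (my own arithmetic, not manufacturer specs): when you sit exactly at the center of curvature, the angle a stretch of screen subtends at your eye equals its arc angle, so the screen length that stays inside a ~60 deg central view is simply radius times angle.

```python
import math

MM_PER_INCH = 25.4

for radius_mm in (700, 800, 1000):               # 700R / 800R / 1000R curves
    view_dist_in = radius_mm / MM_PER_INCH       # sitting at the centre of curvature
    arc_60deg_mm = radius_mm * math.radians(60)  # screen arc inside a 60 deg central view
    print(f"{radius_mm}R: sit ~{view_dist_in:.0f} in away; "
          f"~{arc_60deg_mm:.0f} mm ({arc_60deg_mm / MM_PER_INCH:.0f} in) of screen arc "
          f"fits inside a 60 deg central view")
```

That works out to roughly 28", 32" and 40" seating distances, with roughly 29", 33" and 41" of screen arc inside the central view, so a screen of that length or shorter stays in comfortable view, and anything longer spills into the periphery for immersion.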


screen_1000R.curved_sitting.close.to.optimal_mini.schematic_1.png


Same curve, still sitting at the center of curvature so all of the pixels are on-axis pointed directly at you - but representing a longer screen with the red line, for immersion into the periphery for games.
screen_1000R.curved_optimal.view.distance_longer.90deg.screen_mini.schematic_1.png


I'd prefer the 2nd example, with a sufficiently tall screen. I could use window management with a Stream Deck to corral my window position(s) into my central viewing angle if I don't want to look to the sides for desktop/apps, yet I'd still have the extra length of the arc of the circle into my periphery on each side for immersive games. The main thing is that the pixels are all on axis, pointed directly at you. When they aren't, it causes geometric distortion and non-uniformity issues.

With good window management software, the Stream Deck buttons can do a lot (Stream Deck even has its own window management plugins too). With the DisplayFusion app's hotkeys tied to a Stream Deck, you can shuffle window positions, set up buttons that teleport apps to locations, set up home window positions for apps, or trigger one of several global saved window position profiles of multiple app windows with a single hotkey/button. You could probably even work out a way to "rotate" window positions by x degrees left or right, or just cycle between portions of the screen on button presses. So you wouldn't need to rotate your desk with a single long curved screen; you could just move the contents/windows quite easily once it's set up to do that.

There are a few other window management apps, and the built-in Windows 11 snap/drag features, but they don't manage quite as much or in the same ways. I use my Stream Deck's buttons all the time to move windows/apps around.
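As an aside, for anyone who wants the "saved window position profile" idea without DisplayFusion, here's a bare-bones sketch using the raw Win32 API via ctypes. This is not DisplayFusion's API or plugin system, just a minimal illustration; the window titles and coordinates are made-up placeholders, and it only runs on Windows.

```python
# Bare-bones "window position profile" sketch using the raw Win32 API (Windows only).
# Titles and coordinates below are placeholder examples; FindWindowW needs an
# exact window-title match, which dedicated tools handle far more robustly.
import ctypes

user32 = ctypes.windll.user32

# One saved profile: window title -> (x, y, width, height)
PROFILE = {
    "Untitled - Notepad": (0, 0, 1720, 2160),
    "Calculator":         (1720, 0, 1700, 2160),
}

def apply_profile(profile: dict) -> None:
    """Move each listed window to its saved position, if it exists."""
    for title, (x, y, w, h) in profile.items():
        hwnd = user32.FindWindowW(None, title)   # 0 if no exact-title match
        if hwnd:
            user32.MoveWindow(hwnd, x, y, w, h, True)
        else:
            print(f"window not found: {title!r}")

if __name__ == "__main__":
    apply_profile(PROFILE)
```

A Stream Deck button can simply launch a script like this, one per profile, which is essentially what the commercial tools wrap in a nicer UI.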
 
Last edited:
I recall reading somewhere that our eyes are actually much "stronger" in the vertical orientation compared to the horizontal, i.e. we find it easier to look up and down using our eyes than left and right, as for the latter we use head movement much more. Perhaps that is the reason I often find that I seem inclined to move my neck rather than my eyes in the horizontal orientation. Now, I can't vouch for the science here, but at least in my case, when having a vertical setup I tend to move my eyes much more to flick between the screens than when I have a horizontal arrangement.
 
I recall reading somewhere that our eyes are actually much "stronger" in the vertical orientation compared to the horizontal, i.e. we find it easier to look up and down using our eyes than left and right, as for the latter we use head movement much more. Perhaps that is the reason I often find that I seem inclined to move my neck rather than my eyes in the horizontal orientation. Now, I can't vouch for the science here, but at least in my case, when having a vertical setup I tend to move my eyes much more to flick between the screens than when I have a horizontal arrangement.


For all the examples I've shown of how it could work well, like I said before, nearly all of the curved screens available do not provide the ability to sit at the center of curvature in a viable, usable way as outlined, so I can understand how people might be turned off by the available screens. Sitting far inside of the center of curvature, closer than the radius of the arc of the screen, is not what I'm championing here.

.
edit: My setup has side portrait screens that are about 1/3 taller than what I'd consider a neutral comfort zone, but I tend to tilt my chair w/ headrest back slightly to compensate (and that angle is across a 40" or so distance). I don't put the most-watched stuff up on the top shelves, and I have a lot of ability to swap window positions around on the fly with the way I set up my Stream Deck buttons. Overall I find it easier to focus on each separate portrait screen, changing my view horizontally for desktop/apps, etc., one at a time, and to use the center landscape OLED for media and gaming. When desired I can drop my desk back a little farther to incorporate more of the side screens into my viewpoint, since it's an island desk on caster wheels, decoupled from the screens. While gaming, I usually just angle my gaze to either side screen to text people or browse, or change videos or whatever. I don't mind turning my head and maybe spinning my chair slightly, maybe 15 to 20 degrees, as long as I'm focusing on something rather than ping-ponging. To me, between my chair and a little head turn, it's less of a stretch than looking upward in my particular layout, but they both have their limits.
 
Last edited:

Nice video. I'm glad I held off on this one.

I can't believe that it's been 6 months since its release already. Nvidia has still not acknowledged that their HDMI 2.1 can't do 240 Hz with this display while AMD's can. I bet the 5090 will silently work at 240 Hz over both DP 2.1 and HDMI 2.1, yet the HDMI 2.1 support never comes to the 4090.
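For context on why 240 Hz at this resolution is so finicky (my own back-of-envelope numbers, not anything Nvidia or Samsung have published): even ignoring blanking overhead, the uncompressed pixel data rate at 7680x2160 @ 240 Hz with 10-bit RGB is far beyond what HDMI 2.1 or any current DP 2.1 link can carry, so DSC is mandatory and presumably any quirk in the DSC handshake is enough to break the mode.

```python
# Rough uncompressed video data-rate estimate (ignores blanking/timing overhead,
# so real link requirements are even higher).  Link rates are nominal maximums.
def data_rate_gbps(width, height, refresh_hz, bits_per_component=10, components=3):
    return width * height * refresh_hz * bits_per_component * components / 1e9

modes = {
    "7680x2160 @ 240 Hz": (7680, 2160, 240),
    "7680x2160 @ 120 Hz": (7680, 2160, 120),
    "3840x2160 @ 240 Hz": (3840, 2160, 240),
}
links = {
    "HDMI 2.1 FRL (48G)":    48.0,
    "DP 2.1 UHBR13.5 (54G)": 54.0,
    "DP 2.1 UHBR20 (80G)":   80.0,
}

for name, mode in modes.items():
    rate = data_rate_gbps(*mode)
    fits = [ln for ln, cap in links.items() if rate <= cap]
    print(f"{name}: ~{rate:5.1f} Gbit/s uncompressed -> "
          f"{'fits ' + ', '.join(fits) if fits else 'needs DSC on every link'}")
```

The native 240 Hz mode comes out around 119 Gbit/s before overhead, which is why there is no DSC-free path on any of these links.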

The macOS support is even more ridiculous. Apple just refuses to state the actual capabilities of their ports and frames everything around their own 5K/6K displays, yet the bandwidth does not seem to be there even on the latest and greatest M3 Max. It can do 7680x2160, but apparently not with scaling. Apple scaling is a naive "2x target res", so e.g. "looks like 6400x1800" (120% scale) would be rendered at 12800x3600, and I believe by default Apple's framebuffer tops out at 7680x2160, though hacks like BetterDisplay can get around that.
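To illustrate the scaling arithmetic described above (the 7680x2160 framebuffer cap is as reported above, and the "looks like" sizes below are just example choices): a HiDPI mode renders at exactly 2x the chosen logical size and then downsamples, so any "looks like" size above half of 7680x2160 needs a backing buffer bigger than the panel itself.

```python
# "Looks like" scaling arithmetic as described above: the backing buffer is
# rendered at 2x the chosen logical size, then scaled down to the panel.
PANEL = (7680, 2160)
FRAMEBUFFER_CAP = (7680, 2160)   # default cap mentioned above (BetterDisplay can bypass it)

looks_like_options = [(6400, 1800), (5120, 1440), (3840, 1080)]  # example sizes

for lw, lh in looks_like_options:
    bw, bh = lw * 2, lh * 2                     # 2x HiDPI backing buffer
    over_cap = bw > FRAMEBUFFER_CAP[0] or bh > FRAMEBUFFER_CAP[1]
    scale_pct = PANEL[0] / lw * 100             # how much larger UI elements appear
    print(f"looks like {lw}x{lh} ({scale_pct:.0f}%): renders at {bw}x{bh}"
          f"{'  -> exceeds default framebuffer cap' if over_cap else ''}")
```

Only the "looks like 3840x1080" (200%) option lands exactly on the 7680x2160 cap; the in-between sizes like 6400x1800 all blow past it, which matches the behavior described.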

I'm also not looking forward to a future OLED version of this having that Tizen bullshit. It looks awful in that video and it doesn't even support the same PbP options.

This year I've gone with my M2 Max Macbook Pro as my main computer with 2x 28" Samsung G70A 4K 144 Hz screens. Works alright. I've set my ITX size desktop PC as a gaming system with the living room LG CX 48" OLED TV.

I'm probably going to just wait for those 5120x2160 to come to market next year and use a 4K + 5120x2160 ultrawide setup.
 
Nice video. I'm glad I held up on this one.

I can't believe that it's been 6 months since its release already. Nvidia has still not acknowledged that their HDMI 2.1 can't do 240 Hz with this display while AMD's can. I bet the 5090 will silently work at 240 Hz over both DP 2.1 and HDMI 2.1, yet the HDMI 2.1 support never comes to the 4090.

The macOS support is even more ridiculous. Apple just refuses to state the actual capabilities of their ports and frames everything around their own 5K/6K displays, yet the bandwidth does not seem to be there even on the latest and greatest M3 Max. It can do 7680x2160, but apparently not with scaling. Apple scaling is a naive "2x target res", so e.g. "looks like 6400x1800" (120% scale) would be rendered at 12800x3600, and I believe by default Apple's framebuffer tops out at 7680x2160, though hacks like BetterDisplay can get around that.

I'm also not looking forward to a future OLED version of this having that Tizen bullshit. It looks awful in that video and it doesn't even support the same PbP options.

This year I've gone with my M2 Max Macbook Pro as my main computer with 2x 28" Samsung G70A 4K 144 Hz screens. Works alright. I've set my ITX size desktop PC as a gaming system with the living room LG CX 48" OLED TV.

I'm probably going to just wait for those 5120x2160 to come to market next year and use a 4K + 5120x2160 ultrawide setup.
The main interest in this monitor is, at least for me, that it is one of a kind. If it had been glossy, I might have kept it (assuming Samsung actually managed to build one without obvious faults) even though it is a FALD LCD.

The 240 Hz limitation actually never bothered me, as for anything that would really benefit from 240 Hz, mostly fast-paced gaming, I find even 32" to be too big. Then of course it would be a nice bonus if we could both have 240 Hz at native resolution and have GPUs that could give us 240+ FPS at that resolution. For productivity you can do 120 Hz at native resolution, which is good enough for me. But I would still have preferred a normal 16:9 8K monitor instead.

If it had been a brighter OLED version, it would be a no-brainer. Considering my real usage, rather than my intended one, is probably 95% productivity and 5% gaming, the actual tradeoffs with OLED, which is much superior for anything entertainment related, might not be worth that much.
 
Last edited:
My friend is offering to straight trade his 57" for my 32" MSI QD-OLED. I think I'm going to do it.

My usage is also heavily productivity leaning (75/25) and as much as I'm not a fan of the VA panel used, it's the only real 1 monitor solution that does it all.

The unit I had at launch was horrific in terms of uniformity with DSE, vertical bands and splotches all over. At $2000 I don't think it's at all worth it but for under $1000 the choice is obvious.
 
My friend is offering to straight trade his 57" for my 32" MSI QD-OLED. I think I'm going to do it.

My usage is also heavily productivity leaning (75/25) and as much as I'm not a fan of the VA panel used, it's the only real 1 monitor solution that does it all.

The unit I had at launch was horrific in terms of uniformity with DSE, vertical bands and splotches all over. At $2000 I don't think it's at all worth it but for under $1000 the choice is obvious.
I guess someone has to ask the obvious question - what is the main reason you want to trade away your MSI? I assume that, being a member here, you probably knew what you bought before you got it (I assume it is the new 4K one; I don't remember MSI having made any other QD-OLEDs, but I could be wrong).

Just be prepared that, while good for an LCD, when it comes to colors etc. the 57" will be noticeably worse, while still not bad of course. But you probably already know that.
 
I guess someone has to ask the obvious question - what is the main reason you want to trade away your MSI? I assume that, being a member here, you probably knew what you bought before you got it (I assume it is the new 4K one; I don't remember MSI having made any other QD-OLEDs, but I could be wrong).

Just be prepared that, while good for an LCD, when it comes to colors etc. the 57" will be noticeably worse, while still not bad of course. But you probably already know that.
Main reasons are that I hate OLED for productivity, and the more games I play, the more I dislike how dead HDR looks compared to mini-LED monitors.

Until OLED gets brighter and I can keep a taskbar up 8 hours a day without paranoia, I think I'll stick with LCD.
 
Yeah, OLED has to be the secondary monitor until it gets brighter, more robust, and gets more standard RGB subpixel patterns.

I have been tempted about a million times to ditch my two 27" 4K 160 Hz monitors for this UW, but they just raised the prices again so I'll go back to waiting lol

I'd trade the OLED to your buddy... then buy another OLED to use on the side for gaming lol

Me personally, I'm just having a hard time with the concept of giving up the high PPI and sweet AG coating on my 27s.
With the AW32 & Chyne27 I have a dual 4K workspace at 160 Hz minimum, the second chynee gives me a full 4K120 dedicated to videos, and the lappy on the side is good for lower-res twitch/chatterbate streams!
1709426230234.png
 
Last edited:
The 240 Hz limitation actually never bothered me, as for anything that would really benefit from 240 Hz, mostly fast-paced gaming, I find even 32" to be too big. Then of course it would be a nice bonus if we could both have 240 Hz at native resolution and have GPUs that could give us 240+ FPS at that resolution. For productivity you can do 120 Hz at native resolution, which is good enough for me. But I would still have preferred a normal 16:9 8K monitor instead.
I'd love to have the 240 Hz capability if I want to play at e.g. 4K 16:9, which I know can run above the 120-144 Hz that my current displays have, in the right game. It just sucks big time that Nvidia is full of shit with their HDMI capabilities.

If it had been a brighter OLED version, it would be a no-brainer. Considering my real usage, rather than my intended one, is probably 95% productivity and 5% gaming, the actual tradeoffs with OLED, which is much superior for anything entertainment related, might not be worth that much.
Yeah, for me productivity would also be a big reason, but at the same time I see a lot of the same quirks that I had on my CRG9, like messing with custom resolutions for less-than-32:9 aspect ratios, but with the added complications of DSC thrown in.

For work I'd run it in PbP mode, and that might work on macOS with scaling as it uses two display outputs, but I'm not sure if my preferred 21:9 + 11:9 setup would work with scaling because macOS is a real turd in this regard. HDR/VRR support is of course lost in this scenario.

The price has come down to 2099 euros here in Finland so that's more acceptable at least, but I've got about 800 euros in dual 28" 4K 144 Hz screens that do largely the same thing for work purposes. I'll need to give one back to my wife later this year when she comes back from abroad but by that time it's getting closer to end of 2024 so Black Friday sales etc might drive the price down further.
 
I'd love to have the 240 Hz capability if I want to play at e.g. 4K 16:9, which I know can run above the 120-144 Hz that my current displays have, in the right game. It just sucks big time that Nvidia is full of shit with their HDMI capabilities.
I guess you mean in the context of this monitor, as it can in fact do 4K@240 Hz even with my 3090 (so I assume the 4090 could as well), on both DP and HDMI from what I remember.

Yeah, for me productivity would also be a big reason, but at the same time I see a lot of the same quirks that I had on my CRG9, like messing with custom resolutions for less-than-32:9 aspect ratios, but with the added complications of DSC thrown in.
4K is built in already, and I was able to find one other resolution below that which didn't upscale (GPU scaling did not work with 240 Hz from what I remember), as I really think that 32" is way too big for fast-paced gaming like Doom, CS, etc. (at least at normal distances).

For work I'd run it in PbP mode, and that might work on macOS with scaling as it uses two display outputs, but I'm not sure if my preferred 21:9 + 11:9 setup would work with scaling because macOS is a real turd in this regard. HDR/VRR support is of course lost in this scenario.

The price has come down to 2099 euros here in Finland so that's more acceptable at least, but I've got about 800 euros in dual 28" 4K 144 Hz screens that do largely the same thing for work purposes. I'll need to give one back to my wife later this year when she comes back from abroad but by that time it's getting closer to end of 2024 so Black Friday sales etc might drive the price down further.
I agree that this is way too expensive for what it offers, considering that we can probably all agree on OLED actually having better PQ in general for entertainment. It is not like there is too much new tech in there.
 
I guess you mean in the context of this monitor, as it can in fact do 4K@240 Hz even with my 3090 (so I assume the 4090 could as well), on both DP and HDMI from what I remember.
I thought a lot of people had difficulties getting 240 Hz at any res working with an Nvidia card on these? I've tried it in a store on a miserable 3050 (enough for desktop use) and remember I couldn't get 240 Hz to activate at 4K either.
 
I thought a lot of people had difficulties getting 240 Hz at any res working with an Nvidia card on these? I've tried it in a store on a miserable 3050 (enough for desktop use) and remember I couldn't get 240 Hz to activate at 4K either.

There seem to be quite a few people who say it does not work and who also experience problems, but I got it working on both DP and HDMI at 4K@240 Hz with my 3090. Not sure if the 4090 would have limitations the 3090 does not, but that would seem strange.
 
There seem to be quite a few people who say it does not work and who also experience problems, but I got it working on both DP and HDMI at 4K@240 Hz with my 3090. Not sure if the 4090 would have limitations the 3090 does not, but that would seem strange.
But you still have to toggle it to 240 Hz, then set to 4K res for it to work? And going back to full space, drop down to 120 Hz from the OSD?
 
But you still have to toggle it to 240 Hz, then set to 4K res for it to work? And going back to full space, drop down to 120 Hz from the OSD?
Well, yes and no. This is the reason I decided to connect the monitor twice to my PC, one input for native resolution@120 Hz and one for 4K@240 Hz. For some strange reason Samsung does not allow saving settings per input, but by running the "gaming" connection in HDR and the other one in SDR, you can get around that somewhat, like if you want different local dimming settings for work and play. Even weirder is that the 120/240 Hz setting/toggle is in fact per input.

Of course, having to do all this is in itself a failure on Samsung's part, but it is still doable.
 