Samsung Odyssey Neo G9 57" 7680x2160 super ultrawide (mini-LED)

This vid seems to have some info about it if you can parse the closed captions. Might be worth checking out anyway.


View: https://www.youtube.com/watch?v=NgE9pJ-OnXc


Oh wow that's the exact same setup I'd be doing - 49" Neo G9 on top and 57" Neo G9 on the bottom with that stand. It looks pretty hideous tbh... hmm. Wondering if I should just sell the 49". The floor to ceiling pole is out of the question as my game room wouldn't accommodate it.

Would it make sense to wall mount both monitors? But the freedom of movement/tilt would be severely limited. Yikes... will have to try it out and see. thanks for the Reddit link btw.
 
For those that have one, buy a Tobii for foveated rendering and let us know how it affects framerates… Should work with this width :)
 
Why is it that in HDR games the blacks do not look as inky black as in an HDR video viewed on this 57" monitor? I had the same problem with my old mini-LED Neo G9 49" when I had it a while ago.

Take a look below.
In an HDR video on YouTube, the blacks are very inky, literally almost like an OLED. Very impressive to me.
20230921-195858.jpg


But in an HDR game, the blacks don't look anywhere near as inky; see the difference below.
20230921-201507.jpg


On my OLED C242, the black areas are inky black in both HDR games and HDR videos, but on this Neo G9 57 they only seem to be that way in HDR videos (literally looks OLED black).

I tried with and without Contrast Enhancer.
 
That video I re-linked said that game optimizer gives the screen a blue tone so it might be turning the blacks to blue/grey blacks.

I'd mess with the game optimizer on/off, and otherwise I'd probably use ReShade to change parameters: increase the contrast/black level in it and maybe shift the tone away from blue. Nvidia Freestyle is similar to ReShade but only supports certain games and requires GeForce Experience to be running; it works similarly, just with fewer parameters/sliders. Some games' anti-cheat can potentially be triggered by ReShade, but as long as you aren't playing online it probably won't be. I haven't really had a problem. Use at your own risk.
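Just to illustrate the kind of math a black-level/contrast/tint tweak like that applies per pixel, here's a minimal NumPy sketch. It is not an actual ReShade shader, and the black_level, contrast and blue_gain values are made-up example numbers:

```python
import numpy as np

def adjust(rgb: np.ndarray,
           black_level: float = 0.05,   # values below this get crushed toward 0
           contrast: float = 1.15,      # >1 steepens the curve around mid-grey
           blue_gain: float = 0.92) -> np.ndarray:
    """rgb is a float array in [0, 1] with shape (H, W, 3)."""
    out = np.clip((rgb - black_level) / (1.0 - black_level), 0.0, 1.0)  # re-anchor the black point
    out = np.clip((out - 0.5) * contrast + 0.5, 0.0, 1.0)               # add contrast around mid-grey
    out[..., 2] = out[..., 2] * blue_gain                                # pull the blue channel down
    return np.clip(out, 0.0, 1.0)

# Example: a dark bluish-grey pixel gets pushed to true black.
print(adjust(np.array([[[0.06, 0.06, 0.09]]])))
```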
 
That video I re-linked said that game optimizer gives the screen a blue tone so it might be turning the blacks to blue/grey blacks.

I'd mess with the game optimizer on/off, and otherwise I'd probably use ReShade to change parameters: increase the contrast/black level in it and maybe shift the tone away from blue. Nvidia Freestyle is similar to ReShade but only supports certain games and requires GeForce Experience to be running; it works similarly, just with fewer parameters/sliders. Some games' anti-cheat can potentially be triggered by ReShade, but as long as you aren't playing online it probably won't be. I haven't really had a problem. Use at your own risk.

But I don't get why the HDR videos are inky black with the same monitor settings, while in-game they're not.
 
Also, I realised that with Contrast Enhancer set to High, HDR games look way better. It has that OLED-like vibrance and makes me love the monitor so much more now!
 
The part of the video I was referring to is where he said, at least in the closed-caption translation:

"HDR gaming, ECO mode - excellent color reproduction. Highlight details are also well represented, but the dark details are average. "

"After switching to game mode, dark areas are significantly improved but the overall tone will be bluish."
 
Actually, in regards to my post above about the blacks not being deep in HDR games: it might just be a Resident Evil 4 thing. That game might just have a crap HDR implementation. Because I just tried out Dying Light 2, which uses Windows Auto HDR as it doesn't have native HDR, and the blacks looked very deep and inky in it, just like on my OLED C242. On top of that, it had very punchy contrast and brightness.

This monitor is absolutely gorgeous in HDR now after a week of messing with the settings. I think this is actually the first time since I started using an OLED display for gaming in 2016 that I can say something beats OLED picture quality in HDR games.

I have used many gaming displays, and for a long time, I would always go back to an OLED display because it couldn't be beat for image quality. I think this is the first time I will say it has been beaten and I won't go back to my C242.

Wait till you guys try it, you won't be disappointed!
 
Is the desktop washed out with HDR enabled? For the love of God I hope that's fixed because it's remained a problem with FALD Samsung monitors since the previous Neo G9 and breaks auto HDR.
 
Is the desktop washed out with HDR enabled? For the love of God I hope that's fixed because it's remained a problem with FALD Samsung monitors since the previous Neo G9 and breaks auto HDR.
No, it is not washed out.

Like I mentioned, I was playing Dying Light 2 (Windows Auto HDR) and it looked phenomenal in HDR picture quality. It actually looks way better than some natively supported HDR titles! So this monitor definitely handles Windows Auto HDR very well!
 
Actually, in regards to my post above about the blacks not being deep in HDR games: it might just be a Resident Evil 4 thing. That game might just have a crap HDR implementation. Because I just tried out Dying Light 2, which uses Windows Auto HDR as it doesn't have native HDR, and the blacks looked very deep and inky in it, just like on my OLED C242. On top of that, it had very punchy contrast and brightness.

This monitor is absolutely gorgeous in HDR now after a week of messing with the settings. I think this is actually the first time since I started using an OLED display for gaming in 2016 that I can say something beats OLED picture quality in HDR games.

I have used many gaming displays, and for a long time, I would always go back to an OLED display because it couldn't be beat for image quality. I think this is the first time I will say it has been beaten and I won't go back to my C242.

Wait till you guys try it, you won't be disappointed!

RE4 actually has one of the best HDR implementations I've seen.
View: https://www.youtube.com/watch?v=BciStFDR9eY
 
Isn't broken HDR at launch kind of a yearly tradition with the Neo G9s, to be, if not fixed, at least improved after 2-3 FW updates?
 
I have my "dual Samsung G70A" setup on my desk again so I did some retesting at 7680x2160 @ 120 Hz Nvidia surround.

Setup: Intel 13600K + 32 GB RAM + 4090

Cyberpunk 2077 v2.0

Using Path Tracing.

Outside V's apartment building, at the crosswalk.

Resolution  | Settings               | Framerate (average)
7680x2160   | DLSS Quality + FG      | 41
7680x2160   | DLSS Balanced + FG     | 48
7680x2160   | DLSS Performance + FG  | 57
7680x2160   | DLSS Ultra Perf. + FG  | 62
7680x2160   | Native                 | Unplayable, the game can't handle 7680x2160 + path tracing
3840x2160   | DLSS Quality + FG      | 72
3840x2160   | DLSS Balanced + FG     | 85
3840x2160   | DLSS Performance + FG  | 104
3840x2160   | DLSS Ultra Perf. + FG  | 135
3840x2160   | Native                 | 21

The just-released Cyberpunk 2077 v2.0 looks pretty damn incredible.

But man that performance hit at 8Kx2K is rough, and the FOV distortion is pretty bad too. Even with a bezel in the middle it does look very immersive but I'd rather try to get custom resolutions working and play this at 5120x2160.
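For a sense of scale, here's a quick pixel-count comparison showing why a 5120x2160 custom resolution would be so much lighter to render than the full panel (render cost scales roughly with pixel count; no other assumptions here):

```python
resolutions = {
    "7680x2160 (full panel)": 7680 * 2160,
    "5120x2160 (centered 21:9)": 5120 * 2160,
    "3840x2160 (4K / one half)": 3840 * 2160,
}
base = resolutions["3840x2160 (4K / one half)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.2f}x the pixels of 4K")
```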

If any of you have the display and the game, please test if you get better framerates than I do.
 
I have my "dual Samsung G70A" setup on my desk again so I did some retesting at 7680x2160 @ 120 Hz Nvidia surround.

Setup: Intel 13600K + 32 GB RAM + 4090

Cyberpunk 2077 v2.0

Using Path Tracing.

Outside V's apartment building, at the crosswalk.

Resolution  | Settings               | Framerate (average)
7680x2160   | DLSS Quality + FG      | 41
7680x2160   | DLSS Balanced + FG     | 48
7680x2160   | DLSS Performance + FG  | 57
7680x2160   | DLSS Ultra Perf. + FG  | 62
7680x2160   | Native                 | Unplayable, the game can't handle 7680x2160 + path tracing
3840x2160   | DLSS Quality + FG      | 72
3840x2160   | DLSS Balanced + FG     | 85
3840x2160   | DLSS Performance + FG  | 104
3840x2160   | DLSS Ultra Perf. + FG  | 135
3840x2160   | Native                 | 21

The just-released Cyberpunk 2077 v2.0 looks pretty damn incredible.

But man that performance hit at 8Kx2K is rough, and the FOV distortion is pretty bad too. Even with a bezel in the middle it does look very immersive but I'd rather try to get custom resolutions working and play this at 5120x2160.

If any of you have the display and the game, please test if you get better framerates than I do.
I already did, with pretty much the same specs as you, and that is spot on the fps that I get too at those various settings.

Unfortunately that's what you will have to live with at full res.

Well, at least until next-gen GPUs.
 
I already did, with pretty much the same specs as you, and that is spot on the fps that I get too at those various settings.

Unfortunately that's what you will have to live with at full res.

Well, at least until next-gen GPUs.
Thanks for confirming!

Cyberpunk is the "new Crysis" really, so not being able to run it well on a freakin' 8Kx2K display is pretty expected. I think it's still pretty incredible you can have a playable experience as long as you either drop down to DLSS Performance for 50-60 fps or are OK with a console-like 30-40 fps range. Frame generation is really a must for this title.
 
Thanks for confirming!

Cyberpunk is the "new Crysis" really, so not being able to run it well on a freakin' 8Kx2K display is pretty expected. I think it's still pretty incredible you can have a playable experience as long as you either drop down to DLSS Performance for 50-60 fps or are OK with a console-like 30-40 fps range. Frame generation is really a must for this title.

The fact that you can't even hit 60fps with FG means your base fps before FG must be in the 30s or 40s. FG might make the game appear smoother but it's going to play awfully sluggish with such a low base fps.
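As a rough back-of-envelope (assuming frame generation lifts presented fps by somewhere around 1.5x-2x; the real factor varies per game and scene), the rendered base framerate behind those numbers would look something like this:

```python
def base_fps_range(displayed_fps: float) -> tuple[float, float]:
    # assumed FG uplift window of roughly 1.5x-2x, not a measured figure
    return displayed_fps / 2.0, displayed_fps / 1.5

for displayed in (41, 48, 57, 62):  # the 7680x2160 path-tracing numbers from the table above
    lo, hi = base_fps_range(displayed)
    print(f"{displayed} fps with FG -> roughly {lo:.0f}-{hi:.0f} fps rendered before FG")
```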
 
I'm actually surprised a 4090 can even manage 41 FPS with frame gen + path tracing using DLSS. I was expecting low 20's.
 
I played CP2077 with Performance DLSS + FG, and even though I only hit 55-62 fps, it actually felt relatively smooth and completely playable.

In Jedi Survivor, with Performance DLSS + FG, I get roughly 70-100 fps, but it feels far more sluggish in comparison to CP2077.
 
I played CP2077 with Performance DLSS + FG, and even though I only hit 55-62 fps, it actually felt relatively smooth and completely playable.

In Jedi Survivor, with Performance DLSS + FG, I get roughly 70-100 fps, but it feels far more sluggish in comparison to CP2077.
Jedi Survivor is a frame-pacing mess, which is probably why it feels so bad.

I would rather play at 4K with black bars on the sides before resorting to DLSS Performance because it's such a huge hit to image quality.
 
Jedi Survivor is a frame-pacing mess, which is probably why it feels so bad.

I would rather play at 4K with black bars on the sides before resorting to DLSS Performance because it's such a huge hit to image quality.
At first I too disliked DLSS performance but after a short time I didn't even notice it and actually felt like it looked good lol
 
Holy smokes, the high contrast and high sustained brightness in this part of Jedi Survivor was quite a delightful experience.

20230922-213723.jpg


Jumping through the cliffs and floating balloons in this part of the game shows off amazing picture quality on this monitor.
 
There has not been a 2x performance gain gen-on-gen for over 10 years. Even the 3090 to 4090, which is considered the biggest leap of the last decade, was not 2x. There is no way the 5090 will be 2x a 4090, sadly.

Screenshot-20230922-224539-Chrome.jpg


See, I told you there will be something special about the next-gen GPUs; they will leap like never before. Everyone is now getting into high-refresh + high-resolution monitors, and Nvidia knows this.

I know, I know, its just a rumor....
 
View attachment 600436

See, I told you there will be something special about the next-gen GPUs; they will leap like never before. Everyone is now getting into high-refresh + high-resolution monitors, and Nvidia knows this.

I know, I know, its just a rumor....

I'm sure it can be twice as fast... probably in certain AI workloads or something lol. Or perhaps when using an exclusive DLSS 4.0 version that only works on the RTX 50 series? Because that's literally how Nvidia makes performance claims now with the RTX 40 series, by using DLSS FG which the 30 series cannot even do... Anyway, I'm going to stick to the more realistic outcome of perhaps 1.5x-1.7x when compared apples to apples in gaming workloads. And since we're in the mood for posting rumor stories now... here's one for you lol:

1695403640693.png


Notice how both your rumor and my rumor story contain the words "UP TO" and "NEARLY", not "FASTER ON AVERAGE".
 
Guys, can this 57" Neo G9 do 4K UHD (i.e. 3840x2160) in PIP/PBP mode @ 240hz? In other words, can I hook up one rig to one "side" at 4K 240Hz and the other at the same resolution & refresh rate?

Also, is there a KVM switch that I can use for this monitor that can do the full resolution 7680x2160 @ 240Hz with two PCs?
 
Guys, can this 57" Neo G9 do 4K UHD (i.e. 3840x2160) in PIP/PBP mode @ 240hz? In other words, can I hook up one rig to one "side" at 4K 240Hz and the other at the same resolution & refresh rate?

Also, is there a KVM switch that I can use for this monitor that can do the full resolution 7680x2160 @ 240Hz with two PCs?
I assume that you mean a docking station and not a KVM-switch? You would have a hard time even finding a GPU that could handle this monitor at max spec.
 
Is anyone else seeing no $500 gift card on Amazon US for preordering? Wonder why it was removed.
 
Guys, can this 57" Neo G9 do 4K UHD (i.e. 3840x2160) in PIP/PBP mode @ 240hz? In other words, can I hook up one rig to one "side" at 4K 240Hz and the other at the same resolution & refresh rate?

Also, is there a KVM switch that I can use for this monitor that can do the full resolution 7680x2160 @ 240Hz with two PCs?
PbP mode is limited to 120 Hz afaik.
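A rough, hedged back-of-envelope shows why the full spec is such a stretch. The link payload figures and the ~5% blanking overhead below are approximations (assuming 10-bit RGB), not the monitor's actual timings, but they make clear that full-spec signals need DSC on every current connection:

```python
# Approximate usable payload per link; real-world figures vary with encoding/FEC.
LINK_PAYLOAD_GBPS = {
    "DP 1.4 (HBR3)": 25.9,
    "HDMI 2.1 (48G FRL)": 42.0,
    "DP 2.1 (UHBR13.5)": 52.0,
    "DP 2.1 (UHBR20)": 77.0,
}

def uncompressed_gbps(width: int, height: int, hz: int,
                      bits_per_pixel: int = 30, blanking: float = 1.05) -> float:
    # bits_per_pixel=30 assumes 10-bit RGB; blanking=1.05 is a rough timing-overhead guess
    return width * height * hz * bits_per_pixel * blanking / 1e9

modes = {
    "full 7680x2160 @ 240 Hz": (7680, 2160, 240),
    "full 7680x2160 @ 120 Hz": (7680, 2160, 120),
    "one PBP half, 3840x2160 @ 240 Hz": (3840, 2160, 240),
}
for label, (w, h, hz) in modes.items():
    need = uncompressed_gbps(w, h, hz)
    fits = [name for name, cap in LINK_PAYLOAD_GBPS.items() if cap >= need]
    verdict = ("fits " + ", ".join(fits)) if fits else "needs DSC on every current link"
    print(f"{label}: ~{need:.0f} Gbit/s uncompressed -> {verdict}")
```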
 
View attachment 600436

See, I told you there will be something special about the next-gen GPUs; they will leap like never before. Everyone is now getting into high-refresh + high-resolution monitors, and Nvidia knows this.

I know, I know, its just a rumor....
We are likely nearly two years away from release so any rumor that comes out atm is IMO equivalent to bullshit.
 
PbP mode is limited to 120 Hz afaik.
I assume that you mean a docking station and not a KVM-switch? You would have a hard time even finding a GPU that could handle this monitor at max spec.

Yes, a docking station. I am planning on using the 57" Neo G9 as my "main" display with the 49" Neo G9 on top as the accessory display. I'm also planning to use my 3rd rig (X99 Rig) with these displays so I'd like to just use the same displays, keyboard/mouse, and gamepad for both PCs.

Both the main rig (UberRig) and the 3rd PC (X99 Rig) have RTX 4090s - the question is, what docking station can I use to use the same peripherals for both PCs?

kasakka not too concerned with PBP - though it's nice to know 4K 120Hz is doable.

I mainly want to use this monitor setup with two PCs without having to switch cables physically between the computers even though the cases will be right next to each other.
 
We are likely nearly two years away from release so any rumor that comes out atm is IMO equivalent to bullshit.
Still another 2 years!?

Every Nvidia flagship generation update has been on a 2-year cycle. Why do you think it is now all of a sudden going to be 3-4 years for the next-gen update?
 
Still another 2 years!?

Every Nvidia flagship generation update has been on a 2-year cycle. Why do you think it is now all of a sudden going to be 3-4 years for the next-gen update?

It's actually not 3 years. The RTX 4090 launched in Oct 2022, so if the 5090 comes out at CES 2025, which is in January, that's only about 2 years and 3 months. This is supposedly the new roadmap from Nvidia:


1695446314967.png
 
.....

At least according to some articles. You never know for sure, but from what they are saying, if it ends up true, I wouldn't expect availability until early 2025 unless that changes:

https://www.extremetech.com/gaming/nvidia-confirms-geforce-rtx-50-series-launching-in-2025

If you've been watching the parade of lackluster RTX 40-series reviews and thinking, "I'll just get the 50-series in 2024," that is not going to happen. The company has released a new roadmap that clearly states that whatever comes after its current Ada Lovelace architecture won't appear until 2025. This is a clear deviation from its two-year cadence between new architectures. Though that might be bad news for folks not impressed with the 40-series, it does open the door to new "Ada" GPUs in 2024, possibly with Super or Ti branding.

Nvidia made the announcement at a recent event highlighting its AI achievements, according to Tweaktown. The company displayed a roadmap for the first time showing how its architectures would scale over time, with them all getting a "next" version in the future. For Ada Lovelace, that translates to "Lovelace Next" in 2025, which is reportedly code-named Blackwell. This is a surprising development simply because Nvidia has been able to hit its two-year cadence for all of its past launches, and given how rapidly technology is progressing, anything longer than that seems like a gamble. Still, it launched Ada in 2022, Ampere in 2020, and Turing in 2018, so the pattern is quite clear.

. . .

original article:

https://www.tweaktown.com/news/9212...tx-50-series-set-to-launch-in-2025/index.html

The roadmap refers to the technology as "Ada Lovelace Next" alongside "Hopper Next" and "Grace Next," so there are no clues as to what architecture will be called. It highlights a 2025 release, so we might have to wait longer than usual before seeing the GeForce RTX 5080 and GeForce RTX 5090.

Traditionally new GeForce hardware and architecture arrived every two years, give or take a few months. The Pascal-based GeForce GTX 10 Series arrived in 2016, the Turing-based GeForce RTX 20 Series in 2018, the Ampere-based GeForce RTX 30 Series in 2020, and the Ada Lovelace-powered GeForce RTX 40 Series in 2022.

But this doesn't mean there will be a three-year wait, as the placement of the existing GPU architectures on the roadmap aligns with the time of year they launched. "Ada Lovelace Next" looks to be on track for early 2025, with "Hopper Next" set to arrive in 2024. That said, this also points to NVIDIA focusing on getting its next-generation AI hardware out before the release of its new line-up of GPUs for PC gaming.

Read more:
https://www.tweaktown.com/news/92120/nvidias-ada-lovelace-successor-the-geforce-rtx-50-series-set-to-launch-in-2025/index.html


. . . .


https://www.techspot.com/news/99225-nvidia-roadmap-shows-rtx-5000-cards-set-2025.html


View attachment 593682

HardwareLuxx editor Andreas Schilling shared a roadmap revealed by Nvidia during a presentation on its H100 GPUs. It shows the company's future products and their launch dates.

Previous reports claimed that the Hopper Next architecture, which the roadmap shows arriving next year, could be used in both enterprise and gaming cards. But the arrival of Ada Lovelace-Next the following year kills off those claims.

Nvidia didn't reveal the official name of the Lovelace successor. A lot of people claim it will be Blackwell, but that could be the successor to Hopper.

Delaying the release of the RTX 5000 series by a year likely makes sense from Nvidia's point of view. Lovelace has been heavily criticized for its high prices, especially during these times of economic hardship. Nvidia might be hoping that the economy will have improved by 2025, making an expensive graphics card less of a luxury that few can justify buying.

.........
 
I mainly want to use this monitor setup with two PCs without having to switch cables physically between the computers even though the cases will be right next to each other.
Then the easiest way would be to simply use the built-in KVM functionality and not use PbP mode, where you have to pick which USB-B port is in use and it's afaik not that fast to toggle. Without PbP you can assign the PC1/PC2 USB-B ports per HDMI input and it should switch when you switch inputs.
 

Dude, we cannot wait till then for the 5090. We can barely keep up today with the 4090 with all these high refresh and high res monitors. We need the next gen now, not in another 1.5 years. Lol

Bloody Jensen Huang, already enjoying his 57" G95NC at 8K at 240hz because he already has a prototype 6090.
 
The alternative is to wait for the Q3 2025 version of this monitor. It will probably have double the zone count, be brighter and get full bandwidth DP 2.0 port. There will probably be an OLED version announced by then too.

It's crazy to think after all the stagnation we have a monitor that is ahead of the GPU curve.
 
The alternative is to wait for the Q3 2025 version of this monitor. It will probably have double the zone count, be brighter and get full bandwidth DP 2.0 port. There will probably be an OLED version announced by then too.


Back2theFuture_I.Have.To.Tell.You.gif



You can always, at any point, wait for something else for 2 - 3 years heh. Or buy something in the meantime, or be happy with what you have now.


In 2025 I'll hopefully be looking at multiple manufacturers bringing 8K gaming TVs to market in competition with each other. If I was going to wait, that's probably what I'd wait for.

Hopefully 120 Hz 8K on a large 55" to 65" panel, for a ton of desktop/app real estate. If I'm making a wish list: with luck one might be capable of higher Hz at 4K and 4K+ ultrawide resolutions (similar to how you can run higher-Hz 1440p on a 4K screen), and able to get higher fps than full 8K in some games at 5K and 6K, if not higher Hz as well. All of those running 1:1 pixel on the 8K screen, if a manufacturer can pull that off, would be great. A 1000R curve like the Ark would be a big plus.

Sure, some GPUs and monitors might have full DP 2.1 80 Gbps by then too, like you indicated, but probably not gaming TVs, unfortunately, since they stick with HDMI versions (even the original 1000R Ark lacked any DisplayPort, though the refresh version will have a single DP 1.4). Apparently DSC can muddle non-native resolution capability if the manufacturer doesn't take steps to develop around it for other resolutions, so that could continue to be a problem for many who want that, including me.

I'd honestly rather have a full 8K version of this kind of screen in an Ark-like form factor that could do high Hz on the bottom half in ultrawide "letterboxed" and other non-native resolutions at 1:1 pixel. But for now this seems like the best step toward 8K/fractional-8K resolutions with high-Hz gaming capability that we are probably going to get for a year or two, so it has kept me intrigued so far.

. . . . . . . . .

It's crazy to think after all the stagnation we have a monitor that is ahead of the GPU curve.

Seems like that, but I think 1440p was ahead of the curve for getting over 100 fps for a while there, even though we had 144 Hz 1440p screens. Even with a 1080 Ti SC I ended up using two in SLI for Witcher 3, GTA5, Shadow of Mordor, Dishonored, etc. back then just to get over 100 fps. 4K was definitely ahead of GPUs and ports. I'd never bother, but people were using 4K TVs at 30 Hz with no VRR/G-Sync at first, then 60 Hz 4K similarly. We only fairly recently got to where you could run 4K well at 120 Hz or so on the 3000 and 4000 series, ports/bandwidth and GPU power wise (especially considering availability during the pandemic for a lot of people for much of the 3000 series).

Manufacturers for the most part also hit pause on 8K screens for a year or two; otherwise there would probably be some other very demanding 8K gaming screens/TVs like this 57" UW right now, at 8K 60 Hz and eventually 120 Hz. It seemed more stagnant before we started getting 1080p 120 Hz (~2009), 1440p 144 Hz G-Sync (~2014) and onward. That history was mostly jumps in resolution and port bandwidth though, plus G-Sync. Contrast (FALD/OLED), the ubiquity of VRR, then the jump to HDR, and even low input lag on gaming TVs compared to historically, etc. were big leaps more recently.
 
The alternative is to wait for the Q3 2025 version of this monitor. It will probably have double the zone count, be brighter and get full bandwidth DP 2.0 port. There will probably be an OLED version announced by then too.

It's crazy to think after all the stagnation we have a monitor that is ahead of the GPU curve.
Just last year we were in a "4K 120 Hz is more than enough" situation. Then the 4090 dropped and changed the whole thing. Now I want 4K 240 Hz OLEDs, more 4K+ 240 Hz LCD models etc.

I expect DP 2.1 will be a similar situation to HDMI 2.1. Manufacturers are skimping on including the full speed 48 Gbps HDMI 2.1 port if they can get by with a 40 Gbps controller which is cheaper. So I fully expect DP 2.1 UHBR13.5 displays to dominate the market eventually. Full speed UHBR20 will probably have slow adoption considering there are no GPUs that support it atm, next gen at the earliest.
 
Do we know that AMD might not surprise us with a GPU that can actually compete with Nvidia's high-end cards and be available before the 5090?
 