Where are the 8K Monitors?

Or on the flip side, it ends up being something almost no company implements, just like a lot of features. We still barely have any physics-based simulations in most games; it's messed up that the most advanced physics-based gameplay is in Tears of the Kingdom on the Nintendo Switch. AI has huge potential, but whether companies keep utilizing it after the hype dies down in a few years is another question. I expect it will be more of a thing in content creation than in gameplay.


Knowing TV companies, it's either disabled for game mode, or it's on for everything and causes higher input lag because lag is not really a concern for movies and TV.

This sort of tech might also work best when paired with either streaming apps (as an API or middleware layer) or Blu-ray players, where you could analyze both past and following frames to figure out what to do with the current frame, and also do the processing ahead of time before the viewer sees it. E.g. if a streaming service has buffered 25% of a film by the time it starts playing, there's a ton of footage the processor could handle in the background, then just present the enhanced resulting frames. This would remove any realtime demands.
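The buffered, non-realtime idea above can be sketched roughly in code. This is just an illustration, not any real streaming API; `enhance_with_lookahead` and `process` are hypothetical names:

```python
from collections import deque
from itertools import islice

def enhance_with_lookahead(frames, process, window=2):
    """Illustrative only: feed each frame to `process` along with its
    already-buffered past and future neighbors, the way a non-realtime
    enhancer working on buffered content could. `process(current, past,
    future)` is a hypothetical per-frame enhancer."""
    it = iter(frames)
    future = deque(islice(it, window + 1))   # current frame + lookahead
    past = deque(maxlen=window)              # trailing context
    while future:
        current = future.popleft()
        yield process(current, list(past), list(future))
        past.append(current)
        nxt = next(it, None)                 # top the lookahead back up
        if nxt is not None:
            future.append(nxt)
```

The point is simply that with content buffered ahead, each frame can be enhanced using both directions of context, with no realtime deadline.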

I don't know if that will be true on this model. I think tv's default upscaling is being replaced by AI. So if the default upscaling of the tv in game mode is better than traditional non-AI upscaling methods, even if obviously not as good as the full-featured media processing, then it will be a gain and a win. Just like you turn a lot of processing off in game mode on regular tvs but are still able to upscale (in fact, upscaling of 4k is forced on samsung 8k tvs; there is no 1:1 pixel letterboxed 4k signal capability).


. . . .

The thing is, there are bandwidth limitations even using DSC (display stream compression) - so getting a high hz signal, for example a 240hz 4k 10bit 444(rgb) HDR signal to the tv first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale could be more efficient and get higher performing results than sending a pre-baked (4k upscaled to) 8k signal from the nvidia gpu at lower hz. Unless they started putting nvidia upscaling hardware on gaming monitors/tvs someday perhaps.
So for pc gaming on a gaming tv, putting AI upscaling hardware on the tv sounds like it could be very useful if done right.

I use a 2019 shield regularly which has a 2018 AI upscaling chip in it, but that's not the same as having a more modern AI chip in the display itself, due to the bandwidth limitations of ports and cables in regard to PC gaming on a gaming TV. More modern AI upscaling may be faster, cleaner, and provide more detail as the generations progress.

Also, in regard to media, the shield does an ok job to 4k but it is a big leap to do 8k fast enough, clean enough, and with high enough detail gained.

For PC gaming, the bandwidth savings by upscaling on the display side would be important, especially with the way nvidia has allotted bandwidth on their ports up until now, (60hz 8k max, 7680x2160 at 120hz max, limits on multiple 240hz 4k screens) - though that could change to where a single port could get full hdmi 2.1 bandwidth with the 5000 series potentially, I hope so at least.
If it's better than tv's traditional non-AI upscaling overall, then it's a win. Especially 4k to 8k local content, including 4k resolution output gaming, (and even variable bit rate "4k" streaming stuff) - where it's working from fairly high detail to start with.

These are my thoughts as well, as mentioned above, and the reason why I think that as a PC monitor the difference to the QN900C might not be all that big, unless the actual panel is changed or the OCB had real updates to support 240 hz, if that is not fake.

For various reasons, I guess quite few people would actually consider either as a PC monitor, so one of us might have to bite the bullet here and get one to test ourselves :)

Someone in the avsforum thread brought one home last night. He said he's setting it up tonight. It's an 85" one no less lol. 💰💰💰💸 So I'm assuming he's using it as a living room tv, but I'm interested in his take on it once he has it up and running. Vega also has one on order, I believe.

Personally I'm waiting out the big samsung price drops that happen even at the 4-month mark, let alone 6 - 8 months out. They really charge a massive early adopter fee on stuff like this. Very curious about people's experience with them in the meantime though, and looking forward to a detailed RTings review among others.


Samsung price history talk:

I'm eagerly waiting on some more detailed reviews where they at least put it through its paces on a powerful pc gaming rig, but I'm in no hurry at that price tag, plus the 5000 series gpus won't be out until 2025.

Since they always ask a bigger early adopter price, it's worth waiting it out, kind of like buying a car at the end of the model year, b/c by then they'll have dropped a lot. It usually doesn't even take samsung that long to drop the price. Plus some people qualify for a samsung discount on top of that (though that removes the ability to add a best buy warranty by buying from them, even though you can pick up the samsung purchase at best buy, ironically).

https://pangoly.com/en/browse/monitor/samsung

 
I don't know if that will be true on this model. I think tv's default upscaling is being replaced by AI. So if the default upscaling of the tv in game mode is better than traditional non-AI upscaling methods, even if obviously not as good as the full-featured media processing, then it will be a gain and a win. Just like you turn a lot of processing off in game mode on regular tvs but are still able to upscale (in fact, upscaling of 4k is forced on samsung 8k tvs; there is no 1:1 pixel letterboxed 4k signal capability).
AI was basically the main thing Samsung pushed for with the QN900C though.
 
Yeah they pushed multi input type stuff on the ark gen 1 too, but they didn't really deliver on it functionally until gen 2. 🤷‍♂️
 
"I may eventually run my pc rig from a basement storage room again someday, which I did years ago to a dual monitor setup using older gens of hdmi/dvi and mini-dp along with a usb ext. cable. They do also make fiber usb-c cables, so it would be easy to do a usb-c fiber run to a hub at my peripherals desk remotely from the pc along with the display cables being on the same remote run. As quiet as PC fans and case hardware can be, they will never be as silent as when they are in a different room 25' - 35' away - unless you turn the pc off of course. ;)
Also heat in the display+peripherals room, and in some cases air quality, hot components blowing hot air over and out of a hot box full of pcb/capacitors/plastic/insulation/rubber/dust."
Yes !!
Exactly this !

I run my main computer like a thin client. The only things in the work area are monitors+peripherals, totally silent.
I have the rig itself in my attached garage (North facing). It's always cold, for the last 15 years I have actually heated this garage by running folding ( https://stats.foldingathome.org/donor/id/3325) :ROFLMAO:
I only need 15 foot runs for the displayport cables and usb cables so no problems there, but the TV location is another 20 feet.
-------

I have a lot of homework to do, it looks like. There's an 8k receiver in this mix too.....

Thanks for the input everyone !
I'm certainly excited to play with 8k.
 
Owner's thread for the 900D is up on avsforum with some early takes. Owners so far are more living-room/media leaning and not setting a PC up on them, but the OP said even game mode is benefiting from the AI upscaling chip/tech.

https://www.avsforum.com/threads/20...ssions-qn800d-owners-welcome.3298856/#replies


This was linked there recently:

https://www.whathifi.com/advice/samsung-qn900d-vs-lg-z3-which-8k-tv-is-the-most-stunning

That review seems all over the place with feature comparisons. A very brief, glancing review. I don't know how accurate it is. They made no mention at all of the 4k 240hz gaming mode; interpolated or not, it's a marketed feature and should be covered or at least mentioned in the article. I don't see any comparison of 25% and 50% screen nits in HDR, or sustained, even just a viewer-perception take on it without numbers. 25% - 50% and higher screen brightness/color volume on OLEDs is poor, so it's one of the major tradeoffs between FALD and OLED tech currently. No mention of the screen's outer surface, glossiness, etc. There was also no mention of ABL (on either tv). They did say that the 900D is 20% to 30% brighter than the 900C.


They mentioned the AI upscaling detail gained on the 900D at least, and in game mode too, so that was nice to hear:


The QN900D is one of the best upscaling TVs we've ever seen. Textures like hair and grass look sharper, denser and more three-dimensional, with nary a hint of processing or artificiality. It's a genuinely next-gen experience the likes of which you won't get on any other TV.


The extra resolution breathes new life into old games too. Again, it's handled without adding any noise or elements that look artificial.


And if you are fortunate enough to watch any native 8K content on it, you're in for a treat. Our test reel looks amazingly lifelike – it's more like looking through a window than watching a TV.


Contrast is nothing short of remarkable for an LCD TV – it creates almost OLED-like black colours along with OLED-beating brightness at the other end of the spectrum. The fact that it can also control its light down to impressively localised levels for an LCD TV helps to inject even more definition into the minutiae that makes 8K special.


Thanks to its Quantum Dots, its colours can go incredibly bright without losing saturation, but it always maintains its subtle shading abilities.


Downsides? The backlight on the Standard setting can be a little inconsistent, and the 8K upscaler can be a little too aggressive at times. But selecting the Movie preset will solve both issues.


- - - - - - - - - - - -

Link to their focused review of the 900D again: https://www.whathifi.com/reviews/samsung-qe75qn900d


We have seen good 8K upscalers before, from Sony, LG and, of course, Samsung. But thanks, presumably, to the massive contribution of AI, the precision with which the QN900D adds the tens of millions of extra pixels required to turn 4K and HD into 8K feels almost mystically impressive.


This is especially true in highly textured areas such as hair, grass, trees, the fabric of clothing, face details and so on. It’s not just that this sort of image content looks sharper, either. It also looks denser and more three-dimensional – more organic and natural, even. All with little or no hint of processing unpleasantness or artificiality to disrupt the extra connection with the picture that such image density creates. Provided, anyway, you make a couple of small tweaks to the Standard and Movie picture presets that we will get to later.


As well as being one heck of a satisfying party trick in its own right that delivers the most detailed, sharp-looking pictures we’ve seen on a TV, this unprecedented level of 8K upscaling performance also makes a clear-cut case for 8K being not only a worthy part of the AV landscape despite the lack of 8K content, but a feature genuinely capable of delivering a ‘next-gen’ experience.
This isn’t just true with video sources, either. Good 4K-level gaming graphics also look dazzlingly great after passing through the QN900D’s brainbox, enjoying new levels of sharpness, detail and depth that breathe new life into old favourites. The way Samsung’s set does this without making the resulting pictures look noisy or full of processing side effects genuinely makes you feel more immersed in what you’re playing.
 
I have read those, looks very promising. I'm pulling the trigger on the 8k change.
This is gonna be expensive.
I still run 1070's on my 4k video edit system, they run flawlessly. I guess I'm buying new video cards....haha !!!!
My whole environment is spec'd for 4k60, everything has to change, even the wiring.
------
I have many questions now but I think I'll start a new thread for it. We need to center on 8k displays again here !
I happen to need a pair of 32"s

:ROFLMAO:
 
He suspects that the 240hz involves processing. However, he's not sure exactly what it's doing with the frame rate in 4k 240. Will require a deep dive at some point.


https://www.youtube.com/watch?v=STyLUgh4L9M

Not sure if it is the camera or YT, but that gaming footage looks quite "flat" to me with regards to colors and contrast, just like MiniLED tends to look compared with OLED. That of course is perhaps not so strange, but we are still talking about "prime MiniLED" here.

So I guess the real news here is that there really is no news, at least not in this regard. We are still talking about 30+ million pixels handled by 2000 zones (for the 75"). A QN900C with an updated One Connect Box.

Would imagine that this is in game mode also, which usually cuts down on processing.

Edit:

Those slow panning "nature shots" at the start and end of the video look better, making me even more convinced that we are again seeing game mode cutting down on image processing / PQ. Which is kind of expected.
 
From the comments, the parameter output is great, and supposedly helped by the AI chip along with the FALD shaping. Some are saying it's 20% to 30% brighter than the 900c, but I'd have to see some real testing of that. I expect it to be a little brighter, but if it's by that much I'd be pleasantly surprised. The AI upscaling detail added, especially in media, "to make 8k worth it", particularly at nearer viewing distances, is the big upgrade selling point (and price point) on these, with some other performance enhancements brought by the newer gen AI chip as well, along with some kind of 240hz 4k tech, potentially interpolated in some way. 8k worth of desktop/app real-estate is a desired feature for pc, but yes, the 900c has that already.

The game mode won't get all of the image enhancements, just like on any gaming tv, but the reviewer did say it increased the detail in 4k when upscaled to 8k by some amount, just not as much overall pq quality/increase as the media modes, of course. Will have to be confirmed in a more in-depth review somewhere. The jury is still out on the exact 4k 240hz methodology and performance too. The reviewer seemed confused by it. He said it felt smoother and more responsive, but he couldn't get over the fact that it broke the fps meter he relies on.

I found my 48cx's game mode dull compared to other named media picture modes too when I got it, in SDR games that is, HDR games looked great. I ended up bumping the saturation up a little more than I'd like, then using reshade to tone it back down on a per game basis to where I wanted. Bumping the saturation in SDR game mode doesn't affect HDR game mode's settings on LG OLEDs, they are two different named picture modes.

. . . .


I don't think you can go by youtube videos, or even side-by-side comparison shots of tvs posted on forums. Every time I see side-by-side pictures of screens in the same frame, I roll my eyes. Cameras are biased by things with different output values, so they will often show one thing as the baseline and the other as more pale or overly bright/blooming. Pictures will always be inaccurate, but sbs is the worst way to compare screens in pictures. Also, a lot of pictures and videos of HDR gaming TVs are shown in SDR, even when the person posting them is playing or watching HDR material on the screen. There are way too many camera biases and differences, and between different types of cameras/phone cameras... a lot of variations in the whole chain between your eyes in person, the camera's biases and limitations based on what it is capturing in the frame, file formats/compression (dynamic compression on youtube... plus people post things in 8k on youtube just to get a higher bitrate 4k since the default 4k is so watered down, which should tell you something), browsers, and the end user's display (type, model, calibration/levels, viewing environment). You have to go by the impressions relayed by someone you think you can trust, and better yet, by actual numbers derived from testing hardware, which we will hopefully get from RTings in the long run.
 
Samsung Motion Rate 120, Sony MotionFlow 960, LG TruMotion 240, etc.


Now Samsung "Motion Xcelerator 240Hz" (Xcelerator was not a typo lol).

From samsung's 900D product page: "⁴240Hz is limited to 4K resolution and requires compatible content connection from compatible PCs. Motion Xcelerator 240Hz is sometimes called Motion Xcelerator Turbo 8K Pro."

Though it's named that for the 144hz mode on the 900c too, and that is native 120hz/144hz supposedly. Same kind of naming convention as the 1080p/1440p 120hz capability on 4k 60hz tvs in the past.



. . Samsung could be using VRR/free-sync to cap the fpsHz of the tv to 120 and then double it, showing that as a 240Hz signal to the nvidia gpu and windows. If so, you'd only get 120fpsHz of motion definition rather than 240fpsHz worth (vs 144fpsHz 4k on other gaming tvs). Frame doubling should still cut the sample-and-hold blur (due to the way our eyes work vs refreshes) down more like native 240fpsHz does though, and your inputs might feel a bit more responsive, depending. Not saying that is what is happening.

. . . Another possibility is that it could be using some kind of in-between "tweening" interpolation, like the soap-opera interpolation on tvs, to manufacture a frame based on buffered frame(s). If so, it could possibly be done better by samsung's 3rd gen AI chip. Nvidia frame gen does something similar with AI, but nvidia's gpus are way more powerful than a TV chip, I'd think. Just guesswork, but considering the fake-hz naming conventions in previous gens of different manufacturers' tvs, I'd think this was some kind of interpolation method.

If it is some kind of interpolation applied (to say 120fpsHz), the questions would still remain -
. .How well does it work with the modern hardware and AI chip the 900D has?.
. . What exactly is it doing?
. . How well is it performing, and what quality and added detail, if any, is in the 4k to 8k upscaling in 4k 240hz mode? How does 4k 240hz upscaled to 8k on the 900D look compared to a high performing 4k 144hz gaming tv, or to a native 4k 240hz gaming display using DLDSR to downsample from 8k via nvidia's AI? And compared to those without DLDSR? What are the performance differences between those scenarios? Does using DLSS (+ frame gen) or not affect the quality of the 4k to 8k upscaling done by the 900D? If so, how? Looking forward to a deeper dive on 4k 240hz if/when he gets a review sample. Eventually the truth will come out from sources. I'm very curious about the answers.
 
===================================

there are some remaining questions about the 900D that hopefully will come out in the wash as more people get these, and eventually once respected review sites do more fleshed out reviews of them:

================================


. . 1. . For pc gaming, can the 900D do 120fpsHz 8k (off of a capable/future gpu via DSC), and, perhaps more importantly, is it 240fpsHz 4k capable (now)?
Unless it specifically has modes stating the hz, you'd probably need a 5000 series nvidia gpu in 2025 to find out about 8k 120hz. Nvidia's bandwidth allocation on their 3000 - 4000 series gpus only allows for 8k 60hz, so you won't be triggering anything past that using a nvidia gpu currently. Even if they gave the nvidia 5000 series gpu full hdmi 2.1 bandwidth on one port, at 8k 10 bit 444/rgb using DSC 3x (3:1), you'd top out at around 115fpsHz max, unless they used DSC 3.25 (or you chose to run 8bit or 4:2:2 chroma or something).
Question is, can it do 4k 240Hz like they are claiming, or is it some kind of interpolation? Would need an hdmi 2.1 PC/laptop to test that. If it's not true 4k 240hz + upscaling, what is it doing and how well does it perform?
Samsung does have a 7680 x 2160 G95NC super-ultrawide that can do 240hz using DSC, so it's not crazy to think they might be able to do one twice as tall at 8k 115 Hz - 120 Hz via DSC (off of a capable gpu). The current gens of nvidia gpus can't get 240hz off of that super-ultrawide either, though some amd gpus can.
Even if it is using some kind of AI / TV interpolation, how exactly is it doing it, and what is the performance and picture quality like when using it? How does its 4k 240hz upscaled to 8k look compared to 8k native, and how does it look and perform compared to a native 4k 240hz screen, or a native 4k 144hz tv? (Including frame rate gains/losses, detail gained/lost, PQ/performance trade-offs.)
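For reference, the ~115fpsHz ceiling mentioned above falls out of rough arithmetic: HDMI 2.1 FRL carries 48 Gbps raw, about 42.67 Gbps of payload after 16b/18b coding, and 8k 10-bit RGB compressed 3:1 by DSC needs about 10 bits per pixel. A back-of-the-envelope sketch (the ~10% blanking overhead is my assumption, not a spec value, so treat the exact number loosely):

```python
def max_refresh_hz(h_px, v_px, bpp_uncompressed, dsc_ratio,
                   link_gbps=42.67, blanking_overhead=1.10):
    """Rough refresh-rate ceiling for a display link.
    link_gbps: usable HDMI 2.1 FRL payload (~42.67 of 48 Gbps after
    16b/18b coding). blanking_overhead: assumed ~10% extra for
    reduced-blanking timings (an assumption, not a spec value)."""
    bpp = bpp_uncompressed / dsc_ratio                 # bits per pixel after DSC
    bits_per_frame = h_px * v_px * blanking_overhead * bpp
    return link_gbps * 1e9 / bits_per_frame

# 8k, 10-bit RGB (30 bpp) compressed 3:1 by DSC
print(round(max_refresh_hz(7680, 4320, 30, 3)))   # ≈ 117 Hz with these assumptions

# 4k under the same assumptions has headroom well past 240 Hz
print(round(max_refresh_hz(3840, 2160, 30, 3)))   # ≈ 468 Hz
```

With slightly heavier blanking assumptions the 8k figure lands right around the ~115fpsHz quoted above, which is why a 240hz 4k signal fits over the port while a high-hz 8k one doesn't.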

.. 2 . . Is the AI upscaling chip being utilized at all in game mode to upscale 4k to 8k?
If so, at nearer 60 to 50 deg viewing angles, is it adding detail and doing cleaner upscaling (in game mode) compared to 4k gaming tvs that lack the new gen 3 AI chip? Is the FALD performance worse (wider zone spread and/or slower transitions) in game mode? Does it lose black detail or clip/blow out the top brightness detail in game mode? Specifically, in regard to well designed PC HDR games with good HDR implementations.

.. 3.. How is the uniformity in PC desktop/app usage as a giant 4x 4k desktop space? How do solid fields of color, backgrounds, workspaces and interfaces look? Text + text sub-sampling quality? How well is media displayed in a window on a pc desktop as compared to full screen media modes?

.. 4... When near-viewing full screen media in media modes, where it fills your central viewing angle at 60 to 50 degrees horizontal, is the 900D getting increased detail via its AI chip compared to native 4k gaming TVs?
How is the AI-gained detail of 4k content at ~4' away for a 65", or ~5' away for an 85", where any gain would be more appreciable, compared to a native 4k resolution screen? The primary marketing push for the 900D (and its price point) is that it finally makes 8k "worth it" due to the detail gained by the advanced AI upscaling / machine learning filling the gap between 4k native content and 8k resolution.

That and the detail provided to lower/dynamic bit rate streamed media, where it is less than even the 4k uhd disc quality.

. . 5 . . Is tiled multi input working? Specifics?
Is the one connect box allowing multiple inputs? What resolutions can be input, and at what resolution are the tiles/layouts on the 8k 900D? E.g. can it do a quad of 4k inputs 1:1, or a 7680x2160 space on the bottom with two 4k tiles on top? What layouts/arrangements is it limited to specifically? What Hz are the multi-inputs capable of or limited to?

. . 6 . . . Is forced upscaling to full 8k defeatable? Can you run native 4k, 5k, 6k, or ultrawide resolutions letterboxed at higher fpsHz than 8k 60? Previous samsung 8k gens can't do that.
. . 6a . . Is DLDSR (nvidia supersampling) functional on the 900D? Can you disable DSC in any way if you had to? DLDSR is dynamic supersampling, downsampling a higher rez to your screen's native resolution, but using AI machine learning to do it more efficiently now.

. . 7 . . General HDR media performance with AI upscaling
How clean is it? Artifacting in complex, fine detail swirling patterns? Moving chain link fences, camera on rails shots running through trees/foliage, etc. ? Can it clean up noise and other grain that appears in some content? Does it have good detail in blacks and unclipped detail in high volume color and whites? How do subtitles look? Does scene content bleed/glow into black bar letterboxing? How do AI enhanced 4k HDR10 movies upscaled to 8k look parameter wise compared to a comparable 4k + dolby vision capable screen running the same movies in DV?

. . 8 . . Does it have ABL? If so, how aggressive is it?

. . 9 . . Is there noticeable dithering on the 900D?
Specifically, is dithering visible when using a 900D nearer at a 60 to 50 deg horizontal viewing angle where it fills your central view without being pushed into your periphery? If so, is it less visible with VRR active (which was a work-around on previous samsung 8k models)? If it's there, how visible is it in each of those modes? (Though using VRR mode might result in disabling some or all of the other AI/media picture improvements so might not be a good work-around overall).


==================================


That's all I can think of at the moment. Some big things like those have not all been confirmed yet. I'm not trying to push anyone in threads to answer these questions. They will come out in a RTings or other review eventually. I'm just very curious about those answers if any owners happen upon them at some point in the meantime. Not all of them are deal-breakers but I'd like to know what it can do and how it performs. Lmk if you think of any other major questions about it. If I see anyone giving some solid answers, I'll post them here.
 
Only 1200 nits when the old 4K was able to do 1600 :bored: I guess it's just not possible to maintain the same brightness or increase it when you are going from 4K to 8K. It's the same case on the Samsung 8K TVs where they are dimmer than the 4K counterparts.
You probably could, with a better cooling solution, but I bet they haven't figured out a cost effective one which doesn't involve loud fans and forced air.
 
Some of the glancing reviews so far are saying the 900D is 20% to 30% brighter than the 900c. Won't know if that is true until RTings and others do a fully fleshed out review showing real world performance.

The 900C's HDR movie mode (8.7) . . Game Mode (8.8)





To your point though, the brighter samsungs suffer aggressive ABL, most likely because they stick with the slim design look. I'd rather have a backplane heatsink, a slightly boxier vented grill housing, and a few user-selectable active cooling profiles, but it is what it is. Pros and cons.


Sony might also release an 8k tv, but idk if it will be available in 65 inch, and it'll probably be even more cost prohibitive than the 900D.

Some of the desktop monitors seem to adopt lower brightness limits compared to gaming tvs, with some notable exceptions like the ProArt ones, but they do have boxier vented housings, heatsinks, and internal fans on a set active cooling profile.
 
My TV guy says yes the successor to the 2023 Sony Z9K is coming, but not out yet.
8K miniLED , but 75" is the smallest.
Expect $10k
 
32" way too small for 8K

Depends on your usage scenario. Very high PPD is nice for art and images, and graphics work.

Besides, if you keep any same-resolution screens filling your central viewing angle at around 60 degree to 50 degree wide, they will all fill the exact same amount of your FoV and have the exact same perceived pixel sizes/density.

Any 8k screen of any size at a 64 deg viewing angle = ~ 119 PPD.
Any 8k screen of any size at a 60 deg viewing angle = ~ 129 PPD
Any 8k screen of any size at a 50 deg viewing angle = ~ 154 PPD
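Those PPD figures come from simply dividing the horizontal pixel count by the horizontal viewing angle (a flat-screen average; the results land within a point of the numbers above, the small differences being rounding):

```python
def avg_ppd(horizontal_pixels, viewing_angle_deg):
    """Average pixels per degree when a flat screen of any size fills
    the given horizontal viewing angle (simple linear average)."""
    return horizontal_pixels / viewing_angle_deg

# 8k horizontal resolution at the viewing angles quoted above
for angle in (64, 60, 50):
    print(angle, round(avg_ppd(7680, angle)))   # 64 -> 120, 60 -> 128, 50 -> 154
```

Since the formula only depends on pixels and angle, not physical size, any 8k screen filling the same angle gives the same perceived density, which is the whole point.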


I agree that it would be nicer to have a larger one so that you could sit a little closer like a 4x 4k array of screen space without being physically near to the screen surface, but lacking something like a 1000R curvature I wouldn't want to sit ~too~ close b/c the pixels at the sides would be way off axis from you compared to a real multi-monitor setup where you would typically turn the side monitors inward so they are on axis to your eyes.


. . . .

A 65" 8k like the 900D viewed at 45" away, screen surface to eyeballs (a 21" or less gap behind a 24" deep desk), would give you a 64 deg viewing angle width and ~ 119 PPD.

A 32" 8k like the PA32KCX viewed at ~ 23" away, screen surface to eyeballs, would give you the same viewing angle and the same perceived pixel sizes (~ 119 PPD), same perceived screen size filling your personal FoV.

. . .

Maybe you could sit at something like a 75 deg to 80 deg wide viewing angle with some of the screen split into your peripheral for multi-monitor like desktop/app use though.

10 - 15 deg <<<(50 - 60 deg central)>>> 10 - 15 deg peripheral

and do a little head turning when using one in a multi-monitor style scenario, but the pixels on the last 10 - 15 degrees of each end of the screen would be more off axis.
Sitting nearer than that would be too aslant and off axis in the periphery imo on a screen w/o a decent curvature.

These would look pretty much the same:

On a 65" 8k, 75deg to 80 deg wide viewing angle = 37" view distance to 34" view distance.
On a 32" 8k, 75 deg to 80 deg wide viewing angle = 18 " view distance to 17" view distance
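Those distances are basic trigonometry: take the width of a 16:9 screen of a given diagonal, then find the distance at which that width subtends the chosen angle. A quick sketch:

```python
import math

def view_distance_in(diag_in, angle_deg, aspect=16 / 9):
    """Distance (inches) at which a flat screen of the given diagonal
    and aspect ratio fills the given horizontal viewing angle."""
    width = diag_in * aspect / math.hypot(aspect, 1)            # screen width
    return (width / 2) / math.tan(math.radians(angle_deg / 2))  # eye-to-screen

print(round(view_distance_in(65, 75)))   # ≈ 37 in
print(round(view_distance_in(65, 80)))   # ≈ 34 in
print(round(view_distance_in(32, 75)))   # ≈ 18 in
print(round(view_distance_in(32, 80)))   # ≈ 17 in
```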


I wouldn't watch full screen media and games sitting that close. It would be nice to have a larger screen just for the spacious aspect in the room though too, rather than sitting right up near a panel, but that's my personal preference.
 
32" way too small for 8K

For gaming, probably? Q3 only needs 1024x768
----
I'm doing video, specifically ski videos. I currently shoot 4k60 gopro and 5k60 DSLR.
I'm more than likely getting the QN900D next month, but as everyone said, there's no content.
That means I need to upgrade my entire process flow to 8K.
Camera
Go-Pro's (soon)
Video Cards
I need some 32-34" 8K monitors for the video edit machine.

:ROFLMAO:
 
Perceived size is relative to viewing distance. A 32" 8k filling your central 60 to 50 degree viewing angle looks perceptually the same size, pixel density, and width from your perspective as a larger 8k at a 60 to 50 degree viewing angle. The difference is how much space is between you and the face of the screen (and perhaps how well you can focus near or far if you have astigmatism/aged eyes and aren't wearing corrective lenses for whatever reason).

You could sit a little closer to a larger 8k without pushing it farther into your peripheral and being aslant and off-axis on the far sides if it had an appreciable curvature, but these screens are flat.

Modern games benefit from 4k+ resolution if you have a flagship gpu to drive them (with dlss + frame gen as necessary). 4k upscaled well enough with good performance would be great. 8k at 60hz isn't interesting to me. If a nvidia 5000 series allows 8k 10bit 444/rgb at 115hz or more off of a single hdmi 2.1 port via DSC, and the 900D then allows 115hz or more with DSC functionality on the screen, then I'd mess with 8k for certain games, but otherwise, no.
 
As the QN900C had its price lowered substantially (with some additional edu discounts etc) and the QN900D, as expected, did not turn out to be the revolution it was initially touted to be by some "reviews", I now find myself again with a 65" QN900C on my desk. Will be interesting to see how long it will survive this time :D If only Samsung had made a bit more effort with regards to using it as a PC monitor and not just a TV. There are definitely things that are really best in class here, but also other things that really aren't. To be continued :)

One interesting thing to note, as I have evaluated this TV before, is that the "packaging" seems to have changed, and maybe other things as well, as I can now do 4K@144hz with no problems at all so far using the same PC that last time had all the One Connect Box problems, and I am even using the long cable. Maybe Samsung actually changed some things here from the early QN900Cs. From memory, I believe the cables for the One Connect Box have changed as well, now looking a bit different and also a bit thicker. That is from memory though.
 

Might depend on your expectations, and your usage scenario needs in regard to the 900D. As you know from following the AVSforum thread, the owners there said the 900D's AI upscaling upscales 1080p material with an appreciable detail increase; there's also been some word on gaming. Still no word on the quasi 4k 240hz gaming capability in any great detail yet, however.

It's way overpriced at release though. Definitely worth waiting it out. Even if you wait 4 months or so, flagship samsung monitors and tvs drop a considerable amount, let alone if you wait toward year end, black friday, etc. It's tough with samsung b/c they have no real competition in their top tier monitor formats, like each time a majorly different new ultrawide format gets released, and obviously the 8k gaming tv space.
 
Seems like few have compared it with the QN900C so far from what I have seen, while it is naturally more of an upgrade from the QN900A, as even the QN900B was that. My intended usage is also only as a PC monitor, perhaps flanked by a 240 hz OLED, even though my actual gaming is much less than the gaming I always hope to do.

Plan to compare the QN900C side by side with the Neo G9 57" if I can just find the time and space for it. The QN900C being glossy with no grain, and also flat, are big advantages as well, especially when wall mounted above/behind the desk. The Neo G9 is more of a piece of furniture. Of course, there are quite a few drawbacks as well with the QN900C as a desk PC monitor.

I kind of see it as a 5k3k monitor with additional space available for things you might not really work with. If only there was a way to disable that scaling (at least above 60 hz). Being a brightness nut, I am running it at max brightness in an almost dark room and just loving it, even though I am maybe 80 cm away from it :D
 
A few big questions still remain that would potentially be huge gains over the 900C, beyond the 900D already supposedly adding detail to 4k via AI that the 900c lacks, and potentially being 20 to 30% brighter (not crazy that it might be somewhat brighter than the 900c, though it might be less than that; will see what RTings says eventually), and perhaps having better motion quality in media, plus AI enhancement of fast moving objects like balls in sports. It also might perform better in game mode, will have to see. (Some samsung TVs in game mode use a wider FALD spread of zones and slower transitions.)

. . What exactly is the functionality and quality of the "240hz" 4k, and how well is it AI upscaled to 8k by comparison to the 900c (and with greater motion clarity vs sample-and-hold potentially to boot if refreshing more often), and compared to 4k native screens?

. . Will the nvidia 5000 series be able to output full hdmi 2.1 bandwidth on a single port, unlike the 3000 - 4000 series that can't even do 240hz 7680 x 2160 (even though a full bandwidth hdmi 2.1 port using DSC is technically able to)? If so, will the 900D allow 8k 10bit 444/rgb HDR at 115hz - 120hz via DSC from an nvidia 5000 series gpu?
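For what it's worth, the napkin math on that second question (approximate: active pixels only, ignoring blanking overhead, and assuming the usual ~42.6 Gbps effective payload on a 48 Gbps FRL link and ~3:1 visually lossless DSC):

```python
# Rough feasibility check: can 8K 10-bit RGB fit through one HDMI 2.1
# port with DSC? Numbers are approximations (active pixels only, no
# blanking overhead), so treat results as ballpark, not spec-exact.

ACTIVE_8K = 7680 * 4320            # ~33.2 million pixels per frame
HDMI21_EFFECTIVE_GBPS = 42.6       # 48 Gbps FRL minus 16b/18b coding overhead

def data_rate_gbps(pixels, hz, bits_per_pixel):
    return pixels * hz * bits_per_pixel / 1e9

uncompressed = data_rate_gbps(ACTIVE_8K, 120, 30)   # 10-bit RGB = 30 bpp
dsc = data_rate_gbps(ACTIVE_8K, 120, 10)            # DSC ~3:1 -> ~10 bpp

print(f"uncompressed: {uncompressed:.0f} Gbps")  # ~119 Gbps, far over the link
print(f"with DSC:     {dsc:.0f} Gbps")           # ~40 Gbps, just under ~42.6
```

Which is roughly why 8k 10bit 444 tops out somewhere around 115 - 120hz on a single hdmi 2.1 port even with DSC (once real blanking overhead eats into the margin), and why 240hz at full 8k is out of reach regardless of what the gpu can render.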
 
Until proven wrong, I expect the QN900D to be better than the QN900C, but more of an evolution than a revolution. The marketing is kind of the same as it has been every year, with mostly vague claims, newly invented expressions and Samsung pseudo science. Like this 240 hz stuff.

Things probably change in favor of the QN900D if you plan to use it as a TV rather than a monitor. When lag is factored into the equation, I am guessing that most of that fancy AI processing etc. might not be active anymore.

As I know I will be upgrading way too early regardless of what I get, it seems wise to play the price/performance game a bit, albeit in the higher tier :)

Seeing the pros and cons of the QN900C and the Neo G9 57", I just can't help but wonder what monitor Samsung could produce if they actually combined those into one monitor/TV.
 
I would expect the 5000 series to silently upgrade the HDMI capabilities to match the 7900 XTX, maybe with a full speed DP 2.1 port unless Nvidia decides to nickel and dime us again.

I expect the answer to the second question is no. Samsung is unlikely to have the incentive to support anything above 8K @ 60 Hz on these even if the panel is capable of higher refresh rates, when the GPUs to support it don't exist. I don't know what the max res for the 7900 XTX is, but I expect it is also limited to 8K @ 60 Hz even though in theory DP 2.1 could do 144 Hz @ 10-bit with DSC.

It is a TV first and foremost after all, so expecting "PC only" capabilities is too optimistic.
 
Which is kind of ironic as that is the only real usage I can see for 8K at the moment. But I guess that the number of people willing to pay for an 8K 65" to use at the desk is limited :)
 
Absolutely. 8K would be better off marketed and designed for e.g. programmers, stock traders, people managing a ton of different Excel sheets and so on, who need a lot of desktop space for work and would probably pay for the right product, say a 55-65" Samsung ARK 8K.

Instead they push it for "rich people who just want something big and expensive to put in their McMansion."
 
Must say that the quality of text and other "desktop stuff" is really remarkably good on the QN900C (and the B/D). That glossy coating and the wide viewing angle really do help a lot with removing the grain and also combating the poor viewing angles of the Neo G9, without it having to be curved. My main gripe from a work/productivity standpoint is that in order to get 8k@4:4:4, some dithering seems to be introduced that makes solid backgrounds appear with vertical stripes, which kind of reminds me of the scan lines of the Neo G9. It also has problems with text closer to the edges appearing "double" (as reported by Rtings), but I would say it only really becomes noticeable outside of the 5k3k area, which at least to me is outside of the area where I would want any windows that I am supposed to more than glance at (besides maybe playing some video etc, in which case this problem is not relevant anyway).

That lack of matte coating grain really does wonders, and I would rate the QN900C above the Acer X32, CM GP27 and also the Neo G9 57" based on memory, just because of that. It is a personal preference though and probably depends a lot on what your room looks like and where you stand on matte vs glossy in general. The viewing angles are really the best I have seen on an LCD and even rival OLEDs, at least in the usable work area as defined above.

Quite a few cons as well, as mentioned before; will do a wrap up of that in a few days time but they are what you can expect: undefeatable scaling, some dithering, chicken wire effect from the FALD and, depending on how you look at it, reflections/glossy. And also the fact that it consumes a hefty amount of power and therefore also produces quite a lot of heat.

Edit:

The coating is probably best described as semi glossy; I almost regret pulling off the super glossy plastic protection film (that Samsung apparently puts on some but not all of their TVs/monitors), but I know that in the long run those reflections will kill your eyes, and it also had the usual "pull here" sticker. That ultra wide viewing layer does wonders for viewing angles, but it does introduce a bit of moiré pattern near the edges when sitting as close as at least I need to in order not to enable scaling (same PPI as a 4K 32"). With 33+ million pixels you can of course always consider enabling scaling, but that just feels wrong, even though at least 10 million of those pixels only display the desktop as I run out of windows :)
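The "same PPI as a 4K 32"" bit checks out, by the way (quick sketch; the helper name is just mine):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(7680, 4320, 65), 1))  # ~135.6 ppi for a 65" 8K
print(round(ppi(3840, 2160, 32), 1))  # ~137.7 ppi for a 32" 4K
```

~135.6 vs ~137.7 ppi, close enough that the same sitting distance for 100% scaling applies to both.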
 
My take on the whole glossy vs matte thing is that I don't care, as long as the matte coating is not the awfully grainy type seen on some displays. I don't mind the one on my dual Samsung G70A 4K 28" IPS displays.

All the other stuff is going to be more of an issue, but going back to 60 Hz is a tough one. If I put one of my displays at 60 and leave the other at 120 or 144...just moving the cursor around feels so bad at 60 Hz. So I'd love to see 8K go to at least 100-120 Hz in the future. 120 -> 144+ Hz is less of an issue for me because it's not as noticeable a change compared to 60 -> 120 Hz.

The sheer size of these things is the other problem. If you stacked two of the Neo G9 57" models you'd end up at ~63" TV size, and with the 1000R curve that would probably be acceptable to use even if it is pretty huge. But for flat, I don't see myself using a screen above 55" as a desktop display; you need to push your table back quite a bit and I don't have the space to do that.
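That ~63" figure checks out if you do the geometry (sketch, assuming the 57" panel is exactly 32:9):

```python
import math

def wh_from_diag(diag_in, ar_w, ar_h):
    """Width and height of a flat panel from its diagonal and aspect ratio."""
    k = diag_in / math.hypot(ar_w, ar_h)
    return ar_w * k, ar_h * k

w, h = wh_from_diag(57, 32, 9)        # one Neo G9 57" (32:9, 7680x2160)
stacked_diag = math.hypot(w, 2 * h)   # two stacked -> 7680x4320, i.e. 16:9
print(round(stacked_diag, 1))         # ~63.0"
```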

I think I like the idea of a large 8K screen more than the reality, because I'm pretty fine with the amount of desktop space on a dual 4K display, which is why I'm still considering the Samsung 57"...just want to spend less than 2099 € on it if I can.
 
Agree 100% on avoiding abrasions ~ matte abraded AG outer layer. It might seem like a small facet to avoid a screen for, but it's a big deal to me and makes the idea of these screens much more appealing. Also agree about the ridiculous pricing of their 8ks and flagship monitors at release, and that an 8k 1000R ark would be a desirable form factor. Even the arks were ~ $3600 at release however. They dropped a lot over time, by $1000 within 3 months, down $1500 in 6 months, and I've seen them on sale for $1500 - $1800 (especially if you qualify for a samsung discount). The regular G9 dropped $1000 in a few months also. I agree that the 900C is in a good spot price wise now, if you want an 8k right now. I'm holding out on seeing what the 900D can deliver (prob have to see someone like RTings do a very detailed review of it, as well as following/awaiting the price history graph over months). That, and waiting on a 5000 series on the gpu end too, though those kinds of purchases wouldn't necessarily have to be in the same timeframe.

Will have to wait and see about the functionality on those two questions.

Since the g95nc has a 240hz capable hdmi port at 7680x2160 (4k+4k), it's not impossible for samsung to have hdmi 2.1 capability higher than 60hz at 8k on the 900D using DSC, but it might not, due to possible cost savings (profit vs customer base usage) as was suggested. No details on that anywhere.

The exact details of what the 240hz ~ "quasi" 240hz mode does, and how well and cleanly it does it (at what detail levels after upscaling, also taking into consideration the potential "240hz" level of sample-and-hold blur reduction as compared to lower fpsHz screens), and how that all compares to native 8k 60hz (just for comparison's sake, 4k upscaled vs 8k native detail wise; I have no interest in 60hz gaming) . . and how that all compares to other 4k 120hz and 240hz gaming screens. Those are still hugely important questions to me, and probably more important than the 8k gaming capability in the nearer term, that is, at least if the 4k "240" results are positive.

I'd also like to know at what hz it can do multi-input tiles, and in what layouts/sizes. I think the ark (gen 2) can do 120hz in multi input, which is a little less than its 165hz full screen at 4k.

. . . . . .

Otherwise sure, using more than one screen, keeping a different one just for gaming, is my default go-to. However this being 65" minimum (and price wise) makes that idea a lot less appealing as the "non-gaming" display in a multi-display setup. Increased view distance would help like I always say, so it's not impossible to get a decent viewing angle and viewing ergonomics. Probably around 45" screen surface to eyeballs, which is a 21" gap or less behind a 24" deep desk, depending where your head ends up. So it's doable. However, if I got an 8k screen primarily for desktop/apps, I'd essentially be swapping my 48" oled central screen's duty as a media and gaming screen "stage" for a larger, primarily desktop/app screen. To me, the whole point of spending that much money on an 8k in a pc command center scenario would be to replace a multi-monitor setup for the most part, like the ark was marketed as (though it being a quad of 1080p rez meant it didn't fit that bill). If the gaming functionality isn't there, if the 900D doesn't cover all of the bases at least well enough to my liking, then it isn't worth the cost, even if it dropped to 2500 - 3200 usd. If 8k in my setup was just added desktop/app real-estate with a different screen for gaming, I'd probably stick with multi-monitor setups with a central 4k gaming tv rather than an 8k screen, though I would hope that larger gaming tvs would start to do 240hz 4k at some point as a worthy upgrade. I'd rather not go back to smaller desktop monitors if I can avoid it, and even the 57" g95nc is too short to my perspective for my taste (plus has an abraded surface).

Essentially, in an upgrade to 8k over my central 48" 4k oled, I'd want to add a lot more desktop/app real-estate, more desktop/app use on the central screen (without concerning myself about oled best usage practices), good HDR media capability, and importantly -> 4k HDR gaming upscaled to 8k that is better quality/performance overall than 4k 120hz/144hz gaming on a 4k gaming tv (or 165hz on an ark I suppose). I'm not expecting it to be on par with a 32" native 4k 240hz display for gaming, but I would want it to be a gain over 4k 120/144hz gaming tvs while getting all of the desktop/app real-estate outside of gaming. Non-native gaming spaces at high hz would be great too, but that doesn't seem to be available. If capable of 115 - 120hz 8k on a 5000 series gpu, that would be a big pro in the balance as well. Glossy is a big facet too though, like I said. The ark and a lot of other FALD screens have abraded (scratched) outer layers, which makes them much less appealing to me overall.
. . .




Pricing history of the ark and the neo G9.
Since they always ask a bigger early adopter price, it's worth waiting it out, kind of like buying a car at the end of the model year, b/c by then they'll have dropped a lot. It's usually not even that long for samsung to drop the price. Plus some qualify for a samsung discount on top of that (though that removes the ability to add a best buy warranty by buying from them, even though you can ironically pick up the samsung purchase at best buy).

https://pangoly.com/en/browse/monitor/samsung

 
Never really seen 60 hz for productivity as a problem, probably because I am used to things way worse than high PPI 60 hz from my glory days. Would imagine that there are smoothing options in the TV if needed/wanted. That is however one reason I might consider a QN900D, but mainly that if it would accept 8K@120hz, we could probably do GPU scaling at 120 hz as well and get around that scaling problem.

Having used both the QN900C and the Neo G9 57", everything else equal, I would much rather have the flat 65". Especially considering the horrible viewing angles of the Neo G9, and the curve will do nothing vertically. But I don't see it as one 65" but rather as four 32"s. Comparing the actual products there are of course differences, but I would still never consider a dual setup with the Neo G9s, but maybe the OLED G9s, or one Neo G9 and a C2/3/4.
 

I am really against the dual monitor approach, mainly as I feel like I am on a constant holy quest to find the ideal monitor that can do it all. But having a PG32UCDM as well now, I am considering the idea of just keeping it to the side and lifting it up on the desk (or something like that) in front of the QN900C when I feel like gaming (and have time for it). That is one big advantage of having a really flat monitor wall mounted, unlike having a normal monitor on the desk (or even mounted on an arm, as in most cases both the arm and the monitor add depth). The alternative is to have dual setups, but even though I have the space for it, I just want to avoid it if I can.

For an avid gamer, the movable gaming monitor might not be ideal, but based on my actual gaming, it would probably do. Especially as the QN900 would be able to do stuff like sim racing etc better due to its size.

But about here is also where you would start considering the Neo G9 57" as a jack of all trades, kind of like the LCD version of the LG C-series. Not the best at anything but good at many things. If it hadn't been curved and matte, that would probably have been my solution, and it still might be despite that.

One big advantage of having something like a PG32UCDM around would also be that while 8K productivity might be fine for a desktop, it might not be the same success when trying to plug in an average business laptop from a client. It could have been, if only the QN900 had an option to disable scaling, but since it does not, 4K would probably mean 65" 4K when used with an average laptop. Not ideal. Of course, the PG32UCDM feels way too good a monitor to use as a backup and for occasional FPS gaming. YOLO I guess :)

Choices choices :D
 
Dual monitor has been the only way to get the better of two monitor tech tradeoffs for me since at least 2006. A 65" 8k relegated to more of a desktop/app monitor than a gaming monitor doesn't fit the tiled scenario very well for me though.

I disagree about flat. A 1000R(adius) curvature is a 1000mm radius, so if you have a screen with dimensions sympathetic to that, you could sit at that 1000mm ~ 40 inch viewing distance where the pixels would be on axis to you. If you are sitting closer than that, or closer than a 60 to 50 deg viewing angle on a flat screen, you are going to have more pixels off axis to you - especially on the sides. And if it's a large, for example 65" 16:9, screen sitting nearer on top of a desk, it would be looming above you, so the uniformity would probably be bad at the top portion of the screen and the top corners, like a gradient that gets more non-uniform (more degrees away from where it should be at the center of the screen) the farther away from you the pixels are. The most uniform would be if your head was aligned more or less at the center of the screen, with the pixels aimed as directly at you as possible.
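A rough way to see how far off-axis a flat 65" gets at desk-ish distances (simplified: head centered, horizontal axis only; the function name is mine):

```python
import math

# For a screen curved at radius R, a viewer sitting at the center of
# curvature sees every pixel head-on. For a flat screen, the angle of
# incidence grows toward the edges. Rough numbers for a flat 65" 16:9
# viewed from 40" (about the 1000R ~ 39.4" distance):

def edge_angle_flat(width_in, distance_in):
    """Off-axis angle at the screen's side edge for a flat panel,
    head centered (0 deg = pixel faces you head-on)."""
    return math.degrees(math.atan((width_in / 2) / distance_in))

w65 = 65 * 16 / math.hypot(16, 9)             # ~56.7" wide
print(round(edge_angle_flat(w65, 40), 1))     # ~35.3 deg off-axis at the edges
```

On a 1000R screen viewed from ~1000 mm, that edge angle would instead be ~0 deg, since every pixel points back at the center of curvature.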




The g9 is too short for my tastes. I still have a 32" 1600p screen on a lunch counter that I use for some things, so I have a good idea of how tall a 32" 4k would be. Sitting any farther than where you would at a 32" 4k, and the G95NC would be shrunken even more to your perspective. The AG is a turnoff to me among a few other things, but the ultrawides being a shorter screen just doesn't do it for me. You have to sit far inside of the ~ 40" center of curvature in order to get enough height, probably where you would sit at a 32" 4k on a desk like I said. It would be a lot better imo if it had a 700R to 800R curvature, and if it was somewhat taller in physical height, prob with a different resolution. At least for my tastes.

I actually thought about doing a remote control hydraulic pillar stand (the kind people use to raise the living room tv from a hutch it is otherwise hidden inside of, or from behind a couch, etc) - where I'd raise the gaming monitor in front of the large 8k monitor's face and then hit a different streamdeck key to map the 8k screen space as a border around the somewhat smaller/shorter gaming screen, but I decided against it, at least for now.

As for plugging in a laptop, the ark and the 900D are both supposed to be able to do tiled multi-input, it's just that the ark is only 4k so it only gets a quad of 1080p real-estate wise and that's not a real modern "multi-monitor" environment imo.
 