4K Gaming: Need your advice for a new graphics card

You'll be spending upwards of $1800-$2100 for a 4090 at this point because of external factors.
I still don't understand this. I would have thought with the China ban on the top-tier GPUs that prices would have dropped, unless Chinese buyers are circumventing the export ban and buying from external markets en masse.
 
I would not sacrifice and step down from a 4090 to a 4080 for 4K. No way. The 4080 is a 1440p card if you want to run games at max settings, not 4K.

Also, you don't have to run the 4090 in 600W mode. You can plug in 3 of the 4 cables and the vBIOS will default to a 450W-capped mode.

I don't even know who thinks the default 4090 power state is 600W. The default TDP of most cards, I think, is around 450W. My Suprim Liquid X allows 480W as the default power target, but it also caps at 520 or 530 at the absolute max (Jayz noted that they're the only manufacturer that did this, and it looks like they've had fewer cable-burning incidents). The highest I've seen is exactly that, and it takes running Cyberpunk with no upscaling and everything absolutely maxed to even hit it.

Then again, I'm at 3440x1440, but games like Hogwarts Legacy and Cyberpunk can easily show you that a 4080 won't do jack even at 3440x1440. Scale that up to 4K (around 67% more pixels) and it only gets worse. I see posts saying "it plays <old game>, <older game>, <maybe even older game> and that's all I care about at 4k 120Hz"... and okay, fine, but what's the point of talking about that? If you play the original Thief games at 4K (assuming you can get them to run at 4K), maybe even a modern iGPU will do it.
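
For anyone sanity-checking that pixel math, it's just the raw resolution counts (nothing assumed beyond the standard specs):

    # pixel counts: 3440x1440 ultrawide vs 4K
    uw = 3440 * 1440      # 4,953,600 pixels
    uhd = 3840 * 2160     # 8,294,400 pixels
    print(uhd / uw)       # ~1.67 -- 4K is roughly 67% more pixels to render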
 
I still don't understand this. I would have thought with the China ban on the top-tier GPUs that prices would have dropped, unless Chinese buyers are circumventing the export ban and buying from external markets en masse.
Just look at the cost of everything else. If people are lining up to still buy your product, why drop the price?
 
Dunno how long it's gonna be before price increases get bad. I posted an open-box Windforce in another topic for $1,437, and here's what I have near me now:
[screenshot: local 4090 listings]

Still plenty in stock in general as well.

Actually, wasn't the 4090 removed from that ban anyway? I thought that's what a link that someone posted said.
 
Maybe I'm just not playing new enough games, which could totally be it. There aren't really any recent games that interest me.

In Path of Exile I get over 100 fps. When I was playing World of Warcraft, it was 200 fps mid-raid with maxed graphics. I unlocked the FPS in Fallout 4 and got 200+. Doom 2016 I think is locked at 60. I played Warzone a few years back at 100+ fps.

I guess, yeah, in newer games like the latest Call of Duty or Battlefield I'd see the difference.

Running a 5800X3D by the way.
Responding to a thread where someone asks for a card to run "everything on ultra at 4k" with "my card is fine at 4k, people are telling you to spend too much" when most of your games are like 10 years old makes very little sense.
 
Responding to a thread where someone asks for a card to run "everything on ultra at 4k" with "my card is fine at 4k, people are telling you to spend too much" when most of your games are like 10 years old makes very little sense.
I was responding to someone stating the 4080 is a 1440p card.

That being said, I'm out.
 
I'm not trying to justify anything, and I would be giving the same advice no matter what hardware I was using. Having a 4090, though, I know firsthand that a 4080 is absolutely not capable of maintaining even 60 FPS in some of the latest games with all settings cranked.
Well, I’m playing Alan Wake 2 and the new Forza game at Ultra settings with my 4080 at 4K. Oh well, enjoy; I’m out.
 
Well, I’m playing Alan Wake 2 and the new Forza game at Ultra settings with my 4080 at 4K. Oh well, enjoy; I’m out.
So do I need to link benchmarks, or maybe take a screenshot myself? Because you sure as hell are not maintaining 60 FPS without reducing settings. In fact, maxed out on a 4090 with DLSS Quality, the game still drops into the low 50s in the forest. That will absolutely put a 4080 in the 40s in those same spots.
 
So do I need to link benchmarks, or maybe take a screenshot myself? Because you sure as hell are not maintaining 60 FPS without reducing settings. In fact, maxed out on a 4090 with DLSS Quality, the game still drops into the low 50s. That will absolutely put a 4080 in the 40s in those same spots.
Yeah, I barely manage decent frames with a 4080 and DLSS Quality at 1440p. He’s completely full of it unless he’s running DLSS Performance, which I don’t consider max settings or even decent settings.
 
I play Warzone at full-tilt settings with DLSS Quality and constantly get 116 fps (VRR cap). Nonstop. The 4080 is capable. Sorry if you don't believe that. I see the fps counter in Warzone, and unless that is full of shit, the 4080 can do 4K max settings in plenty of games. Fortnite with max settings but no ray tracing, I get 80 to 95 fps.
 
I play Warzone at full-tilt settings with DLSS and constantly get 116 fps (VRR cap). Nonstop. The 4080 is capable. Sorry if you don't believe that. I see the fps counter in Warzone, and unless that is full of shit, the 4080 can do 4K max settings in plenty of games. Fortnite with max settings but no ray tracing, I get 80 to 95 fps.
But DLSS, even at Quality, is far from 4K, not even close; it is less than half the pixel count (a 2.25x gap). And Warzone is made by one of the best teams engine-wise, with performance at the top of their priority list. Much the same goes for Fortnite; that's the type of game made to run on a vast range of hardware worldwide (PS4, Switch), a fast-paced action title with quite limited graphics (when Nanite & RT are off).
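
For anyone who wants to check that 2.25x figure, here's the quick arithmetic, assuming DLSS Quality's usual 2/3-per-axis internal render scale at a 4K output:

    # DLSS Quality at 4K output renders internally at 2560x1440 (2/3 scale per axis)
    native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame at native
    dlss_quality = 2560 * 1440     # 3,686,400 pixels shaded internally
    print(native_4k / dlss_quality)    # 2.25 -- native 4K is 2.25x the shading work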

I am not even sure what the conversation is about; no one is saying that a 4080 cannot play a long list of titles at 1440p ultra settings at nice FPS, or even 4K.

And does anyone say there is not a growing list of titles that every card, even a 4090 (let alone a 4080), will struggle to play at ultra settings (which, if not said otherwise, includes RT) at native 4K? I would imagine no, no one thinks that.

So what exactly are people arguing about?
 
If you are doing 4k or other high resolution displays for gaming, you should buy the absolute fastest card money can buy at the time you are building or upgrading the system.

4090 struggles at times in the latest AAA games. Anything below that is going to be objectively worse. There really isn't much else to say about it.

That is kind of how it has always been. 60 fps at 4K, even for top-of-the-line GPUs, tends to be normal. People like me don't like 60 fps. 4K has always been a struggle, but it is very high-end gaming, certainly not for the average PC gamer.

4090 or 7900 XTX is what you need, IMO.
 
that's why 1440p is the sweet spot for gaming (I use a 27" 1440p G-Sync display)...the difference in visual quality versus 4K is not that big but the performance impact is...with 4K gaming you're always going to have to buy the top-end card
 
I have a 1440p 32 inch that I run at 4K all the time. I've been waiting on my 32 inch 4K 240 Hz OLED since 2020.
 
DLDSR factors let you select a 4K desktop, which in turn allows 4K in all games that run fullscreen borderless. Exclusive fullscreen games already support 4K anyway.
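
For reference, DLDSR factors are per-area, which is why the 2.25x option on a 1440p panel lands exactly on a 4K desktop; a quick sketch of the arithmetic:

    import math

    # DLDSR 2.25x is an area factor, so each axis scales by sqrt(2.25) = 1.5
    axis_scale = math.sqrt(2.25)
    print(2560 * axis_scale, 1440 * axis_scale)   # 3840.0 2160.0 -- a 4K render target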

Not the best solution, but to me it looks and runs much better than native 1440p on my G7.
 
It is as the display size increases. You probably wouldn't really want a 43" 1920x1080 monitor.

That's kind of a TV at that point. No, it's literally a TV.

I too think 4K is kind of a stupid resolution, to be quite honest. I've always been satisfied at 1440p. I like 3440x1440 at about 35 inches; it looks great to me. At this ultrawide res my 4090 is still very much needed to get good FPS in many titles, and the nice thing is that it's actually at least possible to do so. It's about 34% more pixels than standard 1440p, but still only about 60% of 4K, so it's reasonable to drive even in, say, Cyberpunk without any upscaling at all.
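
The rough math on those ratios, using plain resolution counts:

    # 3440x1440 ultrawide vs standard 16:9 1440p and 4K
    uw  = 3440 * 1440     # 4,953,600 pixels
    qhd = 2560 * 1440     # 3,686,400 pixels
    uhd = 3840 * 2160     # 8,294,400 pixels
    print(uw / qhd)       # ~1.34 -- about 34% more pixels than standard 1440p
    print(uw / uhd)       # ~0.60 -- only ~60% of the 4K pixel load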

1080p can just go fly out the window, though. The move to 1440p impressed me quite a bit; 4K, much less so. Definitely need at least a TV on your desk to truly appreciate it imo, though I haven't done the pixel pitch vs sitting distance math.
 
That's kind of a TV at that point. No, it's literally a TV.
It literally isn't. There are actual 43" monitors out there. ASUS, ACER and GIGABYTE all have them. Keep in mind, at one time 27" was considered a TV size as well. The fact that there are TVs as small as 43" or smaller is irrelevant. Personally, I am not interested in anything under 38-40".
I too think 4K is kind of a stupid resolution, to be quite honest.
You are entitled to that opinion. Though I do not agree.
I've always been satisfied at 1440p.
So? What does that have to do with anyone else? This is why multiple options exist on the market. I think 500Hz, 27" monitors at 1920x1080 are retarded. However, I understand I'm not the target market for them. The fact that I wouldn't ever buy one is irrelevant.
I like 3440x1440 at about 35 inches; it looks great to me. At this ultrawide res my 4090 is still very much needed to get good FPS in many titles, and the nice thing is that it's actually at least possible to do so. It's about 34% more pixels than standard 1440p, but still only about 60% of 4K, so it's reasonable to drive even in, say, Cyberpunk without any upscaling at all.
After having a 34" Ultra-wide, I disagree. I believe 38" or so is better. One issue I have with 3440x1440 or similar displays is the lack of vertical screen space. For gaming, this isn't an issue. For productivity, I hate it.
1080p can just go fly out the window, though. The move to 1440p impressed me quite a bit; 4K, much less so.
There are a lot of factors that go into how "impressive" the difference is. Your display history and use cases have a lot to do with it. I had multiple 30" 2560x1600 displays for years. 2560x1440 was hardly impressive to me. Given the lack of vertical pixel height, I felt like it was a step backward for productivity though it was fine for gaming.
Definitely need at least a TV on your desk to truly appreciate it imo, though I haven't done the pixel pitch vs sitting distance math.
I don't agree with a lot of the distance calculators and "rules" that people seem to agree on for display size / distance for either home theater setups or computer monitors. The "math" as they put it is flawed in my opinion. I think a lot of that comes from old school trains of thought. A more immersive experience in my opinion would involve a display that fills your peripheral vision comfortably and nearly completely. It should fill your peripheral vision but in a way where you won't ever need to turn your head to see anything. The display's text should be easy to read, either with or without font scaling if you can stand the latter. I personally can't.
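
If you do want to put a number on "fills your peripheral vision without head-turning," the standard visual-angle formula is easy to run yourself. A minimal sketch, assuming a flat 16:9 panel; the helper name and the example sizes/distances are just illustrative values, not anyone's recommendation:

    import math

    def horizontal_fov_deg(diag_in, dist_in, ar_w=16, ar_h=9):
        # visual angle subtended by the screen's width at a given eye distance
        width = diag_in * ar_w / math.hypot(ar_w, ar_h)
        return math.degrees(2 * math.atan(width / (2 * dist_in)))

    print(horizontal_fov_deg(38, 30))   # ~57.8 deg at 30" away
    print(horizontal_fov_deg(55, 30))   # ~77.3 deg -- getting into head-turn territory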

My experience goes against some of the "common wisdom". For me, a display size of about 38" to 40" is probably perfect for all-around use. For pure gaming at a desk, I can go 49" easily, and even 55" or more if it's wall mounted. Of course, you can make arguments for super-wide displays. Depending on how you work, they might work for you, but frankly, given the way web pages are designed and the aspect ratio that movies are shot in, I don't think that's ideal. More vertical space often comes in handy for things like Photoshop or even browsing web pages. More vertical space means a bit less scrolling, assuming fonts and dot pitches are equal. Games don't support super-wide aspect ratios worth a shit and really never did. Even in the Eyefinity/NV Surround days, it didn't work well. So gaming on a 49" display that's only 1080 pixels high or whatever doesn't really work all that well in most games.

There are a lot of reasons why so many options exist on the market. Depending on the applications and use cases, what is ideal varies a lot. If you have a mixed use case like mine, you have to decide what works best. For me, a 4K display may not be the best thing for gaming, but for productivity, it's a far better option than a 34" Ultra-Wide. I fucking hated the 34 for that. After having 2160p of vertical space, I don't think I can go back to having less.
 
It literally isn't. There are actual 43" monitors out there. ASUS, ACER and GIGABYTE all have them. Keep in mind, at one time 27" was considered a TV size as well. The fact that there are TVs as small as 43" or smaller is irrelevant. Personally, I am not interested in anything under 38-40".

You are entitled to that opinion. Though I do not agree.

So? What does that have to do with anyone else? This is why multiple options exist on the market. I think 500Hz, 27" monitors at 1920x1080 are retarded. However, I understand I'm not the target market for them. The fact that I wouldn't ever buy one is irrelevant.

After having a 34" Ultra-wide, I disagree. I believe 38" or so is better. One issue I have with 3440x1440 or similar displays is the lack of vertical screen space. For gaming, this isn't an issue. For productivity, I hate it.

There are a lot of factors that go into how "impressive" the difference is. Your display history and use cases have a lot to do with it. I had multiple 30" 2560x1600 displays for years. 2560x1440 was hardly impressive to me. Given the lack of vertical pixel height, I felt like it was a step backward for productivity though it was fine for gaming.

I don't agree with a lot of the distance calculators and "rules" that people seem to agree on for display size / distance for either home theater setups or computer monitors. The "math" as they put it is flawed in my opinion. I think a lot of that comes from old school trains of thought. A more immersive experience in my opinion would involve a display that fills your peripheral vision comfortably and nearly completely. It should fill your peripheral vision but in a way where you won't ever need to turn your head to see anything. The display's text should be easy to read, either with or without font scaling if you can stand the latter. I personally can't.

My experience goes against some of the "common wisdom". For me, a display size of about 38" to 40" is probably perfect for all-around use. For pure gaming at a desk, I can go 49" easily, and even 55" or more if it's wall mounted. Of course, you can make arguments for super-wide displays. Depending on how you work, they might work for you, but frankly, given the way web pages are designed and the aspect ratio that movies are shot in, I don't think that's ideal. More vertical space often comes in handy for things like Photoshop or even browsing web pages. More vertical space means a bit less scrolling, assuming fonts and dot pitches are equal. Games don't support super-wide aspect ratios worth a shit and really never did. Even in the Eyefinity/NV Surround days, it didn't work well. So gaming on a 49" display that's only 1080 pixels high or whatever doesn't really work all that well in most games.

There are a lot of reasons why so many options exist on the market. Depending on the applications and use cases, what is ideal varies a lot. If you have a mixed use case like mine, you have to decide what works best. For me, a 4K display may not be the best thing for gaming, but for productivity, it's a far better option than a 34" Ultra-Wide. I fucking hated the 34 for that. After having 2160p of vertical space, I don't think I can go back to having less.
I think those distance rules should always be treated as ballpark figures just to get an idea. I know when I sit close to our 4k TV it becomes so obvious how shitty some of the apps/feeds are. Then if you sit on the couch farther away it's much harder to tell.
 
It literally isn't. There are actual 43" monitors out there. ASUS, ACER and GIGABYTE all have them. Keep in mind, at one time 27" was considered a TV size as well. The fact that there are TVs as small as 43" or smaller is irrelevant. Personally, I am not interested in anything under 38-40".

I didn't say there weren't 43 inch monitors.


I'm just saying that even today, 43 inches is still known by most people as TV size, not monitor size.


So? What does that have to do with anyone else? This is why multiple options exist on the market. I think 500Hz, 27" monitors at 1920x1080 are retarded. However, I understand I'm not the target market for them. The fact that I wouldn't ever buy one is irrelevant.

I didn't say it had anything to do with anyone else? I don't know what you're getting at here, I prefaced the entire thing with "I too think", implying it's just my opinion. As in, I just felt like giving my opinion. So what's your point?

After having a 34" Ultra-wide, I disagree. I believe 38" or so is better. One issue I have with 3440x1440 or similar displays is the lack of vertical screen space. For gaming, this isn't an issue. For productivity, I hate it.

It depends on how you work. If you need to multitask between a lot of programs, a set of ultrawides stacked on top of each other is nice. If you need a more holistic code view then you probably don't want a standard resolution to begin with. You want a monitor that's rotated 90 degrees. 4k at about 38 inches would probably be fine for that, too, but neither compares to a 1440x2560 display anyway.

The issue with a 38-43 inch display in terms of productivity, in my experience, is that it's only one monitor. Personally, when I work, I like having discrete monitors to begin with. With a cheap $40 monitor arm from Amazon holding up both of them, I can easily set up two stacked 3440x1440 displays. With a 43" 4K, I wouldn't be able to do that; I would have to rely on Windows snapping to set up the workspace. Discretization of my workspace between multiple monitors is important to me.

There are a lot of factors that go into how "impressive" the difference is. Your display history and use cases have a lot to do with it. I had multiple 30" 2560x1600 displays for years. 2560x1440 was hardly impressive to me. Given the lack of vertical pixel height, I felt like it was a step backward for productivity though it was fine for gaming.

I mean, my original monitor was 1920x1200. I had an HP like that for a long time. Then I moved up to 1440p, and it was much better. I did like the extra aspect ratio on the 1200 vs my older 1080 displays, but I wouldn't call it life changing. I would kind of put 1080p->1440p as the difference between like 40-60Hz and 120Hz. And then 4k is like 120Hz to 240Hz, in refresh rate terms. Again, I don't understand why we're just basically sitting here arguing about my opinion, though, which I clearly prefaced with it being my opinion.

I don't agree with a lot of the distance calculators and "rules" that people seem to agree on for display size / distance for either home theater setups or computer monitors. The "math" as they put it is flawed in my opinion. I think a lot of that comes from old school trains of thought. A more immersive experience in my opinion would involve a display that fills your peripheral vision comfortably and nearly completely. It should fill your peripheral vision but in a way where you won't ever need to turn your head to see anything. The display's text should be easy to read, either with or without font scaling if you can stand the latter. I personally can't.

Then you should try a 4K projector. It can be as large as you want. I've been wanting one for quite a long time, because at the sizes they can reach, 4K actually makes sense. But they're about $6,000-$10,000 USD. That's where I really want 4K.
 
It is as the display size increases. You probably wouldn't really want a 43" 1920x1080 monitor.

what do you consider too big as far as a Desktop display?...for me 27" is bordering on the biggest size I want as a Desktop display...maybe 32" max...otherwise it's pretty much a TV...I have a 65" and 55" 4K LG OLED TV and bigger definitely is better but I don't want a 55" Desktop monitor...it's too big
 
what do you consider too big as far as a Desktop display?...for me 27" is bordering on the biggest size I want as a Desktop display...maybe 32" max...otherwise it's pretty much a TV...I have a 65" and 55" 4K LG OLED TV and bigger definitely is better but I don't want a 55" Desktop monitor...
Usually, people who say things like this have never had larger displays or multiple displays at once. I had 3x 2560x1600 30" displays for a very long time. I can make use of a great deal of monitor / screen real estate. As for the TV comment, who cares? I don't see how the display size being similar to TV sizes has any relevance. As I said above, I think 48" and 49" was a bit too large for me. Though that was only an issue for things outside of gaming. For gaming it was amazing. For mixed use, I think 38-43" is perfect, with 43" being the maximum size I am personally comfortable with.

You would probably find a 40"+ display overwhelming at first, but trust me, you get used to it. Eventually, you won't want to go to a much smaller display. I had 30" displays all the way back in 2007, so 27's and even 32's seem small to me.
it's too big
And how did you arrive at this conclusion?
 
Usually, people who say things like this have never had larger displays or multiple displays at once. I had 3x 2560x1600 30" displays for a very long time. I can make use of a great deal of monitor / screen real estate. As for the TV comment, who cares? I don't see how the display size being similar to TV sizes has any relevance. As I said above, I think 48" and 49" was a bit too large for me. Though that was only an issue for things outside of gaming. For gaming it was amazing. For mixed use, I think 38-43" is perfect, with 43" being the maximum size I am personally comfortable with.

You would probably find a 40"+ display overwhelming at first, but trust me, you get used to it. Eventually, you won't want to go to a much smaller display. I had 30" displays all the way back in 2007, so 27's and even 32's seem small to me.

And how did you arrive at this conclusion?

it's too big for me...my needs...my opinion...I just said that I have a 65" TV as my main 'TV/movie/4K Blu-ray' display...I don't game on it (for PC)...with a computer display there is such a thing as too big because you're sitting right in front of it...it's not like a 'TV' where you are sitting further away...yes you can push your computer display further back but most people will have it closer than a traditional television

a 65" computer monitor is insane...you're going to be constantly swiveling your head...so yes there is such a thing as too big (for a computer monitor)
 
I have a 1440p 32 inch that I run at 4K all the time. I've been waiting on my 32 inch 4K 240 Hz OLED since 2020.
32" 1440p is same pixel density as 1080p at 24". Once you get 32" or above, really need 4k for the increase in resolution to be of any use.
 
This is almost like a bunch of people debating eye prescriptions or glasses frames lol. I think monitor size, distribution, and PPI are purely a preference thing. Kind of needs to get scooted over into the Displays forum at this rate, too.

32" 1440p is same pixel density as 1080p at 24". Once you get 32" or above, really need 4k for the increase in resolution to be of any use.

I mean, isn't that exactly it...? It's 1080p as if it were at 24 inches, scaled up to 32 inches. I liked the PPI just fine with 1080p at 24 inches, but I prefer my monitor to be 27-32 inches minimum. But 27 inches at 1080p sucks, while 27-32 at 1440p was perfectly fine. I don't really understand this line of reasoning tbh. That's why, from a resolution standpoint, 3440x1440 at 34-35" is totally fine to me.

I don't think anything has blown me away as much as having a projector image stretched to 176" diagonal, though. Just wish I could see it in 4K. But these days I just don't use my projector very much, and a 4K Sony (or whatever brand) projector capable of that would be horribly expensive.
 
This is almost like a bunch of people debating eye prescriptions or glasses frames lol. I think monitor size, distribution, and PPI are purely a preference thing.



I mean, isn't that exactly it...? It's 1080p as if it were at 24 inches, scaled up to 32 inches. I liked the PPI just fine with 1080p at 24 inches, but I prefer my monitor to be 27-32 inches minimum. But 27 inches at 1080p sucks, while 27-32 at 1440p was perfectly fine. I don't really understand this line of reasoning tbh. That's why, from a resolution standpoint, 3440x1440 at 34-35" is totally fine to me.

I don't think anything has blown me away as much as having a projector image stretched to 176" diagonal, though. Just wish I could see it in 4K. But these days I just don't use my projector very much.
I think it makes a difference in image quality in terms of pixel density, for sure, but as you say, it can be a preference thing.

Basically, you get the benefit of a larger screen, but the image quality itself is not actually any better.
 
Actually, doing a bit of calculation on a website, 3440x1440 at 34 or 35 inches is slightly higher PPI than plain 1440p at 32" would be anyhow, though (roughly 110 or 107 PPI vs ~92).

I think it makes a difference in image quality in terms of pixel density, for sure, but as you say, it can be a preference thing.

Basically, you get the benefit of a larger screen, but the image quality itself is not actually any better.

I think everyone has to balance how well they can actually see all the fine details on the monitor vs the resolution anyway. I've always found that at 4k, with almost any display that close, the text was too small anyway... so I'm using zoom features in Windows regardless for work. Even in the extreme example I have above, which was 1080p stretched out to 176" via my projector, my friends and I still loved the picture on it quite a bit anyway. At normal sitting distances away from it, it was great. We could play split screen and it was like everyone had their own TV. Everything was visible, and if I ever did browsing on it, it was still fine. I would just zoom out, if anything. But I'm sure at that size, 4k would be even better.

I'm sure game aliasing would greatly diminish at 4k, but if you're having to use upscaling to drive it... I don't really see the point, imo. But again, it's different strokes for different folks. Just like I'm willing to invest into budget audiophile gear, while most other people say "diminishing returns" and scoff at it. Shrug.
 
32" 1440p is same pixel density as 1080p at 24". Once you get 32" or above, really need 4k for the increase in resolution to be of any use.
I am aware. Hence why I said what I said.
A 32 inch with DLDSR 4K still looks better in games than native 1440p.
 
For 4K, it's 4090 or bust, especially if you like to bump up the eye candy. I told myself a long time ago not to nitpick over a few hundred dollars when the purchase will last you at least a year. I have had my 4090 for over a year now and the pain of the purchase is long gone. I haven't had the itch to upgrade because there is nothing to upgrade to. It's a win-win situation.
 
Just to add my 2 cents to the screen size/res discussion... keep in mind that I am 48 years old, diabetic, and my eyesight is not great (astigmatism in each eye).

I have an LG 55" 120Hz 4K HDR IPS TV with FreeSync as my main display and a Dell 60Hz 1080p 23" IPS Touchscreen monitor as secondary. My man cave is in the basement, and my desk is an 'L' shaped model with the right side (while sitting at it) against a wall, and the back of my chair also against a wall. The TV is suspended from the ceiling joists about 6' away from my head, and tilted a little down (3 deg, maybe?) - since I sit at my desk with my feet up on the front part of the desk and have my keyboard in my lap, and the mouse on the desk to my right, the angle puts the screen center in my line of vision. The Dell is immediately to my right on a monitor arm so I can adjust it as needed and it is fully and easily viewable with a slight turn of the head. It typically sits about 1.5' from my head. Additionally the touch functionality is useful since that screen usually displays monitoring software, Discord, and is used to browse PDFs or other like documents - just stuff I want to view casually, or keep an easy eye on.

For the TV, Windows is set to 4K 120Hz with HDR always on, and the TV itself is in "Game Mode" with all the motion-smoothing bullshit turned off. Zoom is set to 200% (down from the Windows-recommended 300%). With the web browser set to 125% page zoom on top of that, everything is very comfortable for me to look at and appears to be the correct size; I don't have to squint to see anything. I used the Windows HDR adjustment tool and some general calibrations for this model of TV from a review site, and the picture is fantastic: very sharp and clear with great color saturation. Black levels with HDR on are definitely not OLED-deep, but they do not look at all grey to me unless HDR gets turned off. Although it is also calibrated, I can really tell when HDR gets turned off because everything appears duller (this sometimes happens when the computer wakes from sleep... HDR gets stuck off and will not turn back on without a reboot). This is connected to rig 1 in my sig.

It's not a set up for everybody, but it's perfect for me and I love it.
 
if you care about high-Hz gaming and motion clarity, we just aren't there yet IMO; I'd stick to 1080p gaming till prices hopefully come down in the next few years. OLEDs cost too much, and even the 4090 struggles to get 100+ fps at 4K.

I splurged $1800 on a gaming setup and am pretty underwhelmed; the motion clarity sucks, with VA darks smearing everywhere. C1s still demand $600+ on eBay, and the recent 240Hz OLED monitors apparently suck too... and you really need the high Hz to take advantage of the OLED.

wasn't really worth the upgrade, but I have a blast with the high Hz/fps in less performance-demanding games like CS:GO and historically CPU-limited games like PlanetSide 2.

Wait for Black Friday; we might see a deal on the 7900 XT/XTX. Those are really your only options on cost/performance.
 
1080p gaming. Not sure if serious. I have been on 1440p 120 Hz or higher since 2009/10.
 
If your 4090 requires upscaling to get a playable frame rate at 4K then it isn’t really adequate for the job either. As new games continue to be released with higher and higher system requirements the 4090 will continue to be not enough. PC gaming performance is a moving target. I recommend waiting until quantum computing is mainstream.
 
My advice: keep what you've got, get a nice 1440p monitor, and enjoy not being graphics-card poor. 4K is likely to remain cost-prohibitive for years to come, unless there is a major breakthrough in graphics card power. Up to you though; it's your money.
 
RTX 4090 is THE 4K card of the moment. If 4K Ultra is your goal, the RTX 4090 is the closest you’re going to get.
 