$2,500 RTX 5090 (60% faster than 4090)

Eye tracking would be another one of those techs (especially if the trend of monitors getting bigger for gamers continues, and it's obviously already the case for VR). Only 1-2 degrees of our vision needs to be rendered at high resolution (plus some margin of error around it), especially if the system is fast enough, say 1000 fps, to react quickly to a rapid eye movement. We could have 300 dpi on a very small portion of the screen and everything else at some old-school 720p density and quality, I would imagine, since outside that spot our eyes mostly only see movement and bright changes of light.
Yea that too.
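Rough back-of-the-envelope on the idea quoted above, assuming a 32-inch 4K panel at normal desk distance and a 1-degree tracking margin (all numbers are made up for illustration, not taken from any real eye tracker or monitor):

```python
# Napkin math for the foveated-rendering idea above.
# Every number here is an assumption for illustration (viewing distance,
# panel size, tracker margin), not a spec of any real monitor or headset.
import math

viewing_distance_mm = 600       # assumed ~60 cm desktop viewing distance
foveal_half_angle_deg = 1.0     # roughly 1-2 degrees of sharp central vision
tracker_margin_deg = 1.0        # extra margin for eye-tracker error/latency

# Diameter of the high-resolution circle on the screen, in mm.
half_angle = math.radians(foveal_half_angle_deg + tracker_margin_deg)
foveal_diameter_mm = 2 * viewing_distance_mm * math.tan(half_angle)

# Assume a ~32" 4K panel: ~700 x 394 mm, 3840 x 2160 pixels (~140 ppi).
panel_width_mm = 700
panel_px = 3840 * 2160
px_per_mm = 3840 / panel_width_mm

foveal_diameter_px = foveal_diameter_mm * px_per_mm
foveal_area_px = math.pi * (foveal_diameter_px / 2) ** 2

print(f"high-res circle: ~{foveal_diameter_mm:.0f} mm / ~{foveal_diameter_px:.0f} px across")
print(f"that's ~{100 * foveal_area_px / panel_px:.1f}% of the panel's pixels")
# Only around half a percent of the screen would need full density;
# the rest could sit at something like 720p-era detail.
```

Even with a generous margin for tracker error and latency, the sharp region is a tiny fraction of the total pixel count, which is what makes the whole approach attractive.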

I believe the Varjo VR headset has a functional implementation of that already, with some caveats of course (I haven't tested it personally).
 
This has been floating around on reddit today.
[attached image: rumored RTX 5090 spec table]


If those specs are true, the 5090 will be ~60% faster: 50% more SMs/CUDA cores/RT cores/Tensor cores, a 15% clock increase, 77% more L2 cache, and 50% more memory bandwidth.
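If you just multiply the rumored ratios out, it's easy to see where a number like 60% could come from once you assume imperfect scaling. Quick napkin math below; the 80% scaling-efficiency figure is purely an assumption, not a measurement:

```python
# Where a "~60% faster" estimate could come from, given the rumored specs.
# Not a real performance model, just multiplying the ratios out.
sm_ratio        = 1.50   # 50% more SMs / CUDA / RT / Tensor cores
clock_ratio     = 1.15   # 15% higher clocks
bandwidth_ratio = 1.50   # 50% more bandwidth (not modeled here, but it
                         # helps keep the extra SMs fed, as does the L2)

theoretical_compute = sm_ratio * clock_ratio
print(f"theoretical compute uplift: {theoretical_compute:.2f}x")  # ~1.7x

# Games never scale perfectly with extra SMs (memory limits, occupancy,
# CPU overhead), so assume ~80% scaling on the added units.
scaling_efficiency = 0.8
effective = (1 + (sm_ratio - 1) * scaling_efficiency) * clock_ratio
print(f"rough effective uplift:     {effective:.2f}x")            # ~1.6x
```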
 
AI reconstruction and frame gen are the future, not pure raster improvements. The 4x stronger GPU will release when we also have higher refresh rates, higher resolutions, and more complex games, so suddenly you'll need 10x more power again or whatever (random number, but you get the idea).

Not saying we have hit the limits of raster, but when you see how much more power-hungry GPUs have become, you could actually argue that we have reached some limits already.

And AI reconstruction and frame gen make perfect sense when you think about it for a minute: there's a horrifying amount of GPU/CPU cycles wasted on things humans cannot see or perceive whatsoever (it's why those techs work so well, even if they're not perfect yet), so it's essentially just more software optimization, and it makes the dream of 1000 fps/1000 Hz plus ultra-high resolution actually imaginable in our lifetime.
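A quick sketch of why that makes a 1000 fps target thinkable. The factors below (4x frame gen, rendering a quarter of the output pixels) are illustrative assumptions, not the specs of DLSS or any particular implementation:

```python
# Illustrative sketch: how much work actually has to be brute-forced
# to hit a 1000 fps / 1000 Hz target once reconstruction and frame gen
# carry part of the load. The factors are assumptions, not DLSS specs.
target_display_fps = 1000

frame_gen_factor   = 4      # e.g. 3 generated frames per rendered frame
upscale_pixel_frac = 0.25   # render 1/4 of the output pixels, reconstruct the rest

rendered_fps_needed = target_display_fps / frame_gen_factor
print(f"frames actually rendered per second: {rendered_fps_needed:.0f}")   # 250

# Shading work relative to brute-forcing 1000 native-res frames
# (ignoring the reconstruction overhead itself, which is not free).
relative_work = (rendered_fps_needed * upscale_pixel_frac) / target_display_fps
print(f"raster/shading work vs. brute force: ~{relative_work:.0%}")        # ~6%
```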
Yes, as you point out, the hardware will be catching up for a long time as it's a moving target. Upscaling, frame gen, and eye tracking are each good for about a 30% effective performance improvement, if developers have the time and resources to implement them properly, before they start to noticeably degrade image quality. That's very good, but it still puts the hardware playing catch-up for a long time to come. It makes me reluctant to spend a lot on a GPU in this climate. The 3090 got clobbered by the 4090, which is looking to get wrecked by the 5090, and even a ~$2,500 5090 probably won't be able to max out my display hardware in some of the games I play. Even at the best of times, buying GPUs is like investing in a melting ice cube, but in this climate it's 100 degrees outside, so I'm not willing to pay too much for my ice.
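For what it's worth, stacking those three ~30% gains (assuming they multiply cleanly, which is optimistic) buys a couple of generations' worth of headroom at best, which is exactly why the hardware keeps playing catch-up once the display targets move again:

```python
# Stacking the "~30% each" estimate above. Purely illustrative; the gains
# rarely multiply this cleanly in a real game.
per_tech_gain = 1.30
techs = ["upscaling", "frame gen", "eye-tracked foveation"]

combined = per_tech_gain ** len(techs)
print(f"combined effective uplift: ~{combined:.1f}x")   # ~2.2x
# A bit more than the rumored 4090 -> 5090 jump (~1.6x), and it gets eaten
# as soon as the resolution and refresh-rate targets climb again.
```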
 
but in this climate it’s 100 degrees outside,
Not sure GPUs have ever had a longer lifetime, held their value better, or gone longer between refreshes than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only titles show up, around a decade after someone bought it. Try doing that with a 1996 GPU in 2004...
 