$2,500 RTX-5090 (60% faster than 4090)

Perhaps just a "poor person's" perspective, but the "class" of user that can afford $1500 is the same as one that can afford $2500 (to me).
 
I was looking at a Tacoma, but like you I saw it was almost $20k above other trucks that are better. They said they shut down the Mexico plant for 19 days because they can't find workers; I think the pay was reported at $2.73 an hour. Tacomas are not selling, they fucked up all around. So many bad reviews on YouTube. Then you have dealers that don't put all the trucks in one big line, because then you could just tell the shit is not selling, so they put two or so in the front, maybe another one on the other side of the lot, and then more in the back but not next to each other. Funny as hell, as it's all on video and being reported on video. Nissan is picking up sales because of the V6.

Now for the loan on the GPU... Even good credit falls off, but I have a personal loan that reports every 12 months and then repeats, and I pay next to nothing. But for others, not so much. To each their own.
 
I sure hope this rumored $2500 price for the upcoming RTX-5090 isn't correct.

Said to be 60% to 70% faster than the current RTX-4090:

https://www.pcgamesn.com/nvidia/geforce-rtx-5090-performance
I think it will be 70% faster, +/- 7% depending on the game.

Where it lands will really depend on how much of a boost that 50% memory bandwidth increase affords.

As far as the price, I don't see any reason it would be $2500. Just your typical Nvidia hater rumor-mill bs.

They will be making a LOT of chips for AI datacenters... and there will be plenty of chips not good enough for that use, which get put on video cards instead. More transistors means the likelihood of some defect on the chip increases, hence GPUs for us.
 
Seriously! I was thinking about a next-gen Toyota 4Runner, until I saw the Tacoma got a $16,000 price hike for the TRD Pro model going to the new generation. Oh well, I may just settle for a used one in a few years.

The new Tacomas and Tundras are all sitting on dealership lots with over 100 days of inventory. No one is buying them because their price is ridiculous and they have issues. They are also still gas guzzlers while being hilariously overpriced. Unless you NEED a truck like that for work or towing, it's a terrible idea. Soon enough they will bring in the Toyota Stout, which will have broader appeal to the majority of the market, and it will be much, MUCH cheaper, a hybrid, and save a ton of money on gas, likely 40+ mpg. The Maverick is doing very well, so Toyota and GMC need to compete. Let's hope they release their variants soon.

As for the 5090, I'm skipping it. I'll get one after its value depreciates to $600 in three years, like the 3090's price now. I can wait, I have a 4090.
 
The new Tacomas and Tundras are all sitting on dealership lots with over 100 days of inventory. No one is buying them because their price is ridiculous and they have issues. They are also still gas guzzlers while being hilariously overpriced. Unless you NEED a truck like that for work or towing, it's a terrible idea. Soon enough they will bring in the Toyota Stout, which will have broader appeal to the majority of the market, and it will be much, MUCH cheaper, a hybrid, and save a ton of money on gas, likely 40+ mpg. The Maverick is doing very well, so Toyota and GMC need to compete. Let's hope they release their variants soon.

As for the 5090, I'm skipping it. I'll get one after its value depreciates to $600 in three years, like the 3090's price now. I can wait, I have a 4090.
Kind of where I'm at. I'll just pretend three-year-old stuff is the latest and greatest; I don't have enough time to game anyway. Especially if prices are going to increase forever. It's just not sustainable unless the average working person's pay is also increasing by that much.
 
Perhaps just a "poor person's" perspective, but the "class" of user that can afford $1500 is the same as one that can afford $2500 (to me).
Not true. I saved up to buy a 3090 on launch day. Then I started saving $50 a month until the 5000 series came out, which should put me close to $2500 if that is the price. If you are smart with your money, you can afford this type of GPU.
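For what it's worth, the arithmetic on that plan checks out; $50 a month toward a $2500 card is about 50 months, roughly the gap between the 3090 launch and a 5000-series launch:

```python
# Sanity check on the savings plan above (price and rate taken from the post)
price, per_month = 2500, 50
months = price // per_month
print(months)  # → 50, i.e. just over four years of saving
```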
 
Not true. I saved up to buy a 3090 on launch day. Then I started saving $50 a month until the 5000 series came out, which should put me close to $2500 if that is the price. If you are smart with your money, you can afford this type of GPU.
How does that go against the notion that the type of people ready to spend $1500 on a GPU would consider a $2500 one if the performance gap is big enough? (In a "once you go for the best, you want the best and don't look at the price tag much" kind of logic.)
 
At least you'll be getting something you know will last multiple decades if you take care of it.

Wrong, there's a guy that's on his third Toyota engine within the first year. I believe he had a Tundra. Maybe within the first six months; I need to look that up again.
 
Not true. I saved up to buy a 3090 on launch day. Then I started saving $50 a month until the 5000 series came out, which should put me close to $2500 if that is the price. If you are smart with your money, you can afford this type of GPU.

I did the opposite. Instead of saving up $50 a month and then buying a 4090 in one go, I bought a 4090 on a monthly payment plan through Best Buy with 0% interest. It's really not hard to make the monthly payments at all; just skip out on Starbucks/fast food/other small expenses that you don't need.
 
I did the opposite. Instead of saving up $50 a month and then buying a 4090 in one go, I bought a 4090 on a monthly payment plan through Best Buy with 0% interest. It's really not hard to make the monthly payments at all; just skip out on Starbucks/fast food/other small expenses that you don't need.
If there is an admin fee, calling it 0% interest is a bit semantic.

But yes, taking a loan to buy something is what everyone does every time they use a regular credit card. It depends on the terms of the loan and on what you can do right now with the capital you keep in hand because you took the loan. There are no absolutes in that domain: a loan can be a good idea, buying cash can be a bad one, and so on...
 
The new Tacomas and Tundras are all sitting on dealership lots with over 100 days of inventory. No one is buying them because their price is ridiculous and they have issues. They are also still gas guzzlers while being hilariously overpriced. Unless you NEED a truck like that for work or towing, it's a terrible idea. Soon enough they will bring in the Toyota Stout, which will have broader appeal to the majority of the market, and it will be much, MUCH cheaper, a hybrid, and save a ton of money on gas, likely 40+ mpg. The Maverick is doing very well, so Toyota and GMC need to compete. Let's hope they release their variants soon.

As for the 5090, I'm skipping it. I'll get one after its value depreciates to $600 in three years, like the 3090's price now. I can wait, I have a 4090.
Always good to skip a gen if you can. I'd at least wait for a 5090 Ti variant or refresh if I was on a 4090.
 
It is a bit strange to me to plan an upgrade (or not) this far ahead, though I get that the hardware and the upgrade cycle are made for it.

It depends on what the next gen will be, and on what the games will be.

Say the 6000 series launches in 2027, and games harder to run than Avatar at Unobtanium settings (which runs at 22 fps using FSR ultra quality upscaling on a 7900 XTX) become the norm for regular max settings, and both the difference in graphics and the quality of the games are good. That's quite different from games made to run at 30 fps at 1200p on an Xbox Series X, which run perfectly fine on a 4090 with DLSS until then.

Hard to imagine that the actual performance jump, and how useful it is in games worth playing, would not enter the equation for people, at least a bit. We can imagine a lot of people did not think they would upgrade their 3090 to a 4090; they did not expect that level of performance jump without a price jump would occur, but it did, and they upgraded.

One current blocker is graphical fidelity in games that would make it worth it over a 4090 (at playable framerates), but maybe the game engine people will get there (with NVIDIA's help) and that changes the equation.
 
The 5090 is going to be a beast, I have little doubt about that. If the pricing is too high, however, then I can see people choosing to skip upgrading from a 4090. 3090 -> 4090 was a huge performance gain while MSRP only went up by $100, from $1499 to $1599, so let's see if 4090 -> 5090 repeats that same outcome.
 
I was starting to think the MSRP would be the same as last time; however, with the 25% tariffs resuming in June, I think it'll be $1699 + 25% on top, which brings you to 1699 + 425 = 2124. That being said, my final guess is $2199.
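That tariff arithmetic can be checked quickly, assuming (as the post does) that the full 25% is passed straight through on a $1,699 MSRP:

```python
# Tariff pass-through sketch: 25% on top of a $1,699 MSRP (assumption from the post)
msrp = 1699
tariff = 0.25
print(round(msrp * (1 + tariff)))  # → 2124
```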
 
I was starting to think the MSRP would be the same as last time; however, with the 25% tariffs resuming in June, I think it'll be $1699 + 25% on top, which brings you to 1699 + 425 = 2124. That being said, my final guess is $2199.
The tariffs won't apply since the cards won't be built in China anyway. They currently assemble the 4090 elsewhere due to the AI compute density regulations, and that will be true of the 5090 as well. :)
 
The tariffs won't apply since the cards won't be built in China anyway. They currently assemble the 4090 elsewhere due to the AI compute density regulations, and that will be true of the 5090 as well. :)
Ah, that makes sense. Is this true of other cards? How will they handle that disparity, I wonder?
 
[screenshot of the Kopite leak]

Such a leak from Kopite. I think everyone wants smaller and shorter cards and less heat output.
 
Such a leak from Kopite. I think everyone wants smaller and shorter cards and less heat output.
If the card does scale poorly across the power band like Lovelace (we can imagine that was discovered late), it could be a good way for Nvidia to save money.

Cooling gets better each generation, and would a 350-watt 4090 have been much weaker?

[chart: RTX 4090 relative performance vs. power limit]


With how much uptake can be driven by efficiency in mind, and going by early results, GDDR7 could be way more efficient than GDDR6, let alone the power-hungry 6X version.
 
If the card does scale poorly across the power band like Lovelace (we can imagine that was discovered late), it could be a good way for Nvidia to save money.

Cooling gets better each generation, and would a 350-watt 4090 have been much weaker?

[chart: RTX 4090 relative performance vs. power limit]

With how much uptake can be driven by efficiency in mind, and going by early results, GDDR7 could be way more efficient than GDDR6, let alone the power-hungry 6X version.
Good chart. Yeah the 4090 is not really starved for power and certainly would have been just fine as a 350W card.

Contrast that to Ampere, and my 3080 Ti FE would absolutely love to have more power past its cap.
 
Good chart. Yeah the 4090 is not really starved for power and certainly would have been just fine as a 350W card.

Contrast that to Ampere, and my 3080 Ti FE would absolutely love to have more power past its cap.

I still find it hilarious that prior to the 4090 and RDNA3 coming out, rumors were saying the 4090 might have to "juice up its power draw" to compete with top RDNA3, when in reality juicing up the power draw from 450W to 600W on a 4090 barely gets you anything, lmfao. If it couldn't compete at 450W, then giving it 33% more power would not have changed anything. I'm seeing my 4090 typically draw only around 360-380 watts in games, so yeah, a well-designed two-slot cooler probably would have been just fine for it.
 
From my understanding the 4090 is memory bandwidth starved; swapping to GDDR7 and a larger bus should ameliorate that problem. I think the two-slot thing might be true because there ought to be fewer memory modules at 2 GB per module, right? I think GDDR7 is also more efficient.
 
I still find it hilarious that prior to the 4090 and RDNA3 coming out, rumors were saying the 4090 might have to "juice up its power draw" to compete with top RDNA3, when in reality juicing up the power draw from 450W to 600W on a 4090 barely gets you anything, lmfao. If it couldn't compete at 450W, then giving it 33% more power would not have changed anything. I'm seeing my 4090 typically draw only around 360-380 watts in games, so yeah, a well-designed two-slot cooler probably would have been just fine for it.
Considering the power and cooling system, Nvidia themselves could have thought that and been wrong; the latest node processes are a bit of magic-alchemy that simulators can be wrong about.

From my understanding the 4090 is memory bandwidth starved; swapping to GDDR7 and a larger bus should ameliorate that problem. I think the two-slot thing might be true because there ought to be fewer memory modules at 2 GB per module, right? I think GDDR7 is also more efficient.
Rumors point to more memory modules (16 modules on a 512-bit bus instead of 12, for 32 GB) in quite a dense configuration; we will have to see about that, but yes, it should be significantly more efficient. GDDR6 was already 2 GB per module. In the future, GDDR7 could have 3/4/8 GB modules and more flexibility, at least that's the plan, but the first gen should stick to regular 2 GB (16 Gb):

[image: Micron 24Gb GDDR7 memory slide]
 
If the card does scale poorly across the power band like Lovelace (we can imagine that was discovered late), it could be a good way for Nvidia to save money.

Cooling gets better each generation, and would a 350-watt 4090 have been much weaker?

[chart: RTX 4090 relative performance vs. power limit]

With how much uptake can be driven by efficiency in mind, and going by early results, GDDR7 could be way more efficient than GDDR6, let alone the power-hungry 6X version.
The 4090 performs about 15% less efficiently at 350W compared to 600W in demanding games e.g., Metro Exodus (EE). I hope the 5090 can scale up to 600W for those who want that option. However, if the 5090 has a dual-slot design, it might max out at around 400W, which could still be as effective, depending on efficiency gains from node shrink and the switch to GDDR7. Efficiency improvements, especially in scaling, are likely a focus for NVIDIA, given the demands of the AI sector. I see firsthand how costly it is to operate data centers focused on AI, so improvements in efficiency will likely benefit gaming GPUs as well.
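A back-of-the-envelope way to see why that trade-off favors lower power limits, treating the ~15% figure above as a relative performance drop (illustrative numbers, not measurements):

```python
# Perf-per-watt sketch: ~15% performance loss going from a 600 W to a 350 W limit
perf_600w = 1.00           # relative performance at a 600 W limit
perf_350w = 0.85           # ~15% slower at a 350 W limit (figure from the post)
ppw_600 = perf_600w / 600  # performance per watt at each limit
ppw_350 = perf_350w / 350
print(f"{ppw_350 / ppw_600:.2f}x")  # → 1.46x better perf/W at the 350 W limit
```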
 
New rumor seems to indicate the full GB202 won't be used in the 5090. I would imagine they'd rather use the perfect chips for AI, and this will be a big enough upgrade over the current 4090 to justify either a similar price or slightly higher. Honestly, if it's the same MSRP with a two-slot cooler, I might be in for one. I really, unabashedly can't stand Jensen as a person, but that'll be a sick card.

https://videocardz.com/newz/nvidia-rtx-5090-new-rumored-specs-28gb-gddr7-and-448-bit-bus
 
New rumor seems to indicate the full GB202 won't be used in the 5090. I would imagine they'd rather use the perfect chips for AI, and this will be a big enough upgrade over the current 4090 to justify either a similar price or slightly higher. Honestly, if it's the same MSRP with a two-slot cooler, I might be in for one. I really, unabashedly can't stand Jensen as a person, but that'll be a sick card.

https://videocardz.com/newz/nvidia-rtx-5090-new-rumored-specs-28gb-gddr7-and-448-bit-bus

Isn't this currently the case with the 4090? I'm pretty sure the 4090 isn't actually the full AD102 die, as they reserve those for AI cards instead. As for knocking down the memory specs a bit, I'm pretty fine with that if it's going to help reduce the power draw like you said, because a 448-bit bus with 28GB of VRAM is honestly plenty for a new flagship card. The uplift in memory bandwidth over a 4090 is still a whopping 50%, and going for the full 512-bit bus and 32GB would just add more heat and cost that may not be worth it in the end.
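That bandwidth uplift falls out of bus width times per-pin data rate; the GDDR7 rate below is a rumor-based assumption, so treat the exact percentage loosely:

```python
# GB/s = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

rtx_4090 = bandwidth_gb_s(384, 21)  # GDDR6X @ 21 Gbps → 1008 GB/s
rtx_5090 = bandwidth_gb_s(448, 28)  # rumored GDDR7 @ 28 Gbps → 1568 GB/s
print(f"{rtx_5090 / rtx_4090 - 1:.0%}")  # → 56%, in the ballpark of the 50% above
```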
 
They usually hold off a little at the release of the top end. Maybe the Ti or Titan will have a full die, but NV never got any competition that forced them to pull a Ti or Titan out this round, and they made much more selling AI gear.
 
Isn't this currently the case with the 4090? I'm pretty sure the 4090 isn't actually the full AD102 die, as they reserve those for AI cards instead. As for knocking down the memory specs a bit, I'm pretty fine with that if it's going to help reduce the power draw like you said, because a 448-bit bus with 28GB of VRAM is honestly plenty for a new flagship card. The uplift in memory bandwidth over a 4090 is still a whopping 50%, and going for the full 512-bit bus and 32GB would just add more heat and cost that may not be worth it in the end.
Correct, the 4090 is a cut-down AD102.

Yeah, honestly this makes sense. The only reason to increase the memory bus is to increase the memory capacity. The larger L2 cache introduced with Lovelace, along with the move to GDDR7, will help the bandwidth. Had higher-density GDDR7 been available at the start, it would still be a 384-bit card.
 