NVIDIA rumored to be preparing GeForce RTX 4080/4070 SUPER cards

Blackwell could be costly because it would be on 3nm

It would be cheaper for Nvidia to rebadge the 4070 Ti/4070 as the 5060 Ti, etc.
Hearing rumors that RDNA4 might not have a halo/high end card in the stack (think similar to RDNA1). If that ends up being the case, Nvidia could very well just get by with a 5090, maybe a 5080, and then Lovelace continues to occupy the lower stack.

Who knows.
 
Blackwell could be costly because it would be on 3nm

It would be cheaper for Nvidia to rebadge the 4070 Ti/4070 as the 5060 Ti, etc.
Yes, but that is where die shrinks come in, and this is where N3 gets weird: depending on the type of pattern you are laying down, you could get a huge 1.6x density improvement, or none at all. Pure logic transistor density improves by something like 56%, but structures like SRAM might only shrink by ~5%. TSMC was working to counter that by increasing layer counts from 19 to 26, but each layer you add increases the chance of a defect in that chip and slows down production, which results in a cost increase.

So if your chip is on N3 and it is logic-heavy, you can see huge cost savings and power/performance improvements in moving from N5 (or N4) down to N3, but if you are SRAM-heavy you could very well see a ~40% cost increase instead, with only a 2-3% performance improvement.
This is why TSMC has so many variants of the N3 process: so they can match the process variant to the overall chip architecture.
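
To put rough numbers on the logic-vs-SRAM point, here's a quick back-of-the-envelope sketch (a minimal illustration only: the ~1.56x logic density gain, ~5% SRAM shrink, and 600 mm² die size are assumptions pulled from the figures above, not TSMC data):

```python
# Back-of-the-envelope die-area estimate for porting a design from N5 to N3.
# Scaling factors are the rough figures from the post above (~1.56x logic
# density, ~5% SRAM shrink) -- illustrative assumptions, not TSMC numbers.

def n3_area(n5_area_mm2: float, logic_fraction: float,
            logic_density_gain: float = 1.56,
            sram_shrink: float = 0.05) -> float:
    """Approximate area of an N5 die re-laid-out on N3."""
    logic = n5_area_mm2 * logic_fraction / logic_density_gain
    sram = n5_area_mm2 * (1.0 - logic_fraction) * (1.0 - sram_shrink)
    return logic + sram

# Logic-heavy chip (80% logic): a real shrink.
print(f"{n3_area(600, 0.8):.0f} mm^2")  # ~422 mm^2, about 30% smaller
# Cache-heavy chip (40% logic): much less impressive.
print(f"{n3_area(600, 0.4):.0f} mm^2")  # ~496 mm^2, only ~17% smaller
```

With N3 wafers costing well over N5 wafers, that second case is how a "shrink" can still end up as a net cost increase per die.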

Consumer Blackwell may not actually happen. Yes, Enterprise Blackwell is coming on 3nm, but for all we know Nvidia chooses this as a good time to shake things up and move enterprise and consumer silicon onto completely different product stacks.
This would solve their China embargo issues, as well as let them clean up their driver stack and give them the option to open things up. The leaked code from the Nvidia hack a few years back showed they put a hell of a lot of their secret sauce right there in the drivers, plain as day; opening that up in its current state would give away CUDA to everybody, and that would be a minor problem for their bottom line...
It could also let them do a better job of fitting the consumer products to consumer usage, without worrying about people buying consumer cards for enterprise workloads, which is why Nvidia currently does a lot of the things they do, like gimping memory and the firmware fuckery.

But why bother dealing with the PR problems of rebadging a 4000-series card as a 5000-series card when they can just slap the word SUPER on the end and call it a day? A rebadge by any other name is still a rebadge.
 
by that time Nvidia's Blackwell should also be close to release...
I think that's kind of the point. If saving money, or maybe better said "bang for the buck," is at all important to your decision, I'd just say skip a generation between purchases.

You'll get a much bigger leap and better value moving from the 3 series to the 5 series than if you push to buy now. That's just IMHO. If you planned to buy the 4 series, I would've done it a year ago rather than pulling the trigger now, when you could've had maximum benefit for the longest period of time.
 
I think that's kind of the point. If saving money, or maybe better said "bang for the buck," is at all important to your decision, I'd just say skip a generation between purchases.

You'll get a much bigger leap and better value moving from the 3 series to the 5 series than if you push to buy now. That's just IMHO. If you planned to buy the 4 series, I would've done it a year ago rather than pulling the trigger now, when you could've had maximum benefit for the longest period of time.

I typically do GPU upgrades every other generation...two things are tempting me to get the 40 Series Super...1) path tracing games like Alan Wake 2 and Cyberpunk 2077, which aren't playable at 60+ fps on a 3080 even with DLSS...2) the latest rumors have the 5000 series not being as big of a jump in performance as going from the 20 series to the 30 series
 
I typically do GPU upgrades every other generation...two things are tempting me to get the 40 Series Super...1) path tracing games like Alan Wake 2 and Cyberpunk 2077, which aren't playable at 60+ fps on a 3080 even with DLSS...2) the latest rumors have the 5000 series not being as big of a jump in performance as going from the 20 series to the 30 series
Sure, but let's put that all into perspective.

The 1080 Ti was a landmark card, no? And the 2080 Ti was only a moderate upgrade.
However, I would still argue that even in that case, if you were on a 980/Ti or whatever back then, it would've been better value to skip the 1080 Ti generation and get the 2080 Ti.
The 2080 Ti's longevity has proved itself out. It added a significant feature set even if pure raster didn't increase much, and it put in the base necessary for DLSS.

In the current situation, even if there isn't a significant performance difference in raster, the only way nVidia can release the next card and have it make sense is to make it at a competitive cost. This is also ignoring any improvements to their feature set, such as better RT performance or new software features that only 5 series cards will get. Another thing I think will happen with the 5 series is that nVidia absolutely must increase the VRAM across their entire product stack. The Super is already giving credence to that. They've played the low-RAM game for one generation too long, and every dev I've heard talk about this says nVidia is causing problems there. It's not just about textures; it's about more complex geometry, and the RAM affects top-end RT performance as well (I assume to hold all the info for all the calculations).

AMD is already known to be pushing for a card with 4080-level performance that costs midrange money for their next gen, and they are likely not going to bother trying to make an "ultra high end" card (and by midrange, we're hearing $600 as roughly the target). I bring this up because AMD has gained ground this generation because their value for the dollar has been so high (generally pricing about 30% lower than the nVidia card with equal raster and 20% better RT performance). Part of the reason the Super series exists, and why it's priced a lot lower, is that the truth is nVidia's gaming cards are not moving like they want them to. Otherwise why else would they bother? If everything is selling, an update that just costs more money is unnecessary. It's just basic logic/deduction.


The tl;dr of the above is: even if there aren't raster improvements, there will be other improvements that make it worth getting, such as more VRAM, lower power, higher efficiency, quieter operation, and very likely more hardware/software combination enhancements, like for DLSS, that previous generations will not have. Also, if nothing else, it will have to cost fewer dollars per fps because of competition with AMD. So by waiting you still get more benefit.

If cost is no object, though, then just buy a 4090. And then buy a 5090. Obviously everything I'm saying only makes sense in the context of trying to balance how worthwhile it is to buy every generation vs not. Like phones, each gen is still too close to the last. Skipping at least a gen there also just makes way more sense unless you have money to burn.
 
Also, if nothing else, it will have to cost fewer dollars per fps because of competition with AMD

do you really think that Nvidia is going to lower the price for their next-gen card lineup?...certainly not at the high end...if AMD and Intel can't deliver a worthy competitor then what incentive do they have to lower prices when it's been proven that gamers are willing to pay $1600 for the top of the line card
 
do you really think that Nvidia is going to lower the price for their next-gen card lineup?...certainly not at the high end...if AMD and Intel can't deliver a worthy competitor then what incentive do they have to lower prices when it's been proven that gamers are willing to pay $1600 for the top of the line card
Top end ignores the rest of the product stack.

It is without question that 4090s are selling. But on everything below that, they've been taking a body blow.

Otherwise, again: what is the purpose of these Super cards? Why do they perform better and cost less than the cards they're replacing? If all 4000 cards were selling, then nVidia wouldn't have needed to react at all. It's very telling that they're replacing the 4070, 4070 Ti, and 4080. It's a clear indication they were all priced too high. Something everyone has known this entire time.

4090 buyers are <1% of the market when talking about consumers (as opposed to the professional and AI markets). Competition is all in the mid-range and bottom end because that's where the buyers are.
 
Otherwise, again: what is the purpose of these Super cards? Why do they perform better and cost less than the cards they're replacing? If all 4000 cards were selling, then nVidia wouldn't have needed to react at all. It's very telling that they're replacing the 4070, 4070 Ti, and 4080. It's a clear indication they were all priced too high. Something everyone has known this entire time

this is not the first time Nvidia has released a mid-cycle product refresh...there have been Super cards in previous generations, Ti cards, Titan cards, etc.
 
this is not the first time Nvidia has released a mid-cycle product refresh...there have been Super cards in previous generations, Ti cards, Titan cards, etc.
And every time it has been a reaction to the rest of the market.

Jensen likes to win. Generally speaking, he wants to win in all areas, perhaps other than price.
But he certainly always wants the best performance, and he wants to sell the most cards and have the most market share.

So, if he doesn't have that, he releases a refresh so that he can. I figured this was obvious.

Back to your original query: I think he's learned, at least in part, that nVidia's pricing can't be that far out of line with AMD's, otherwise he'll have a product stack that won't sell. So yes, I would expect that whatever AMD releases next will at least anchor nVidia's midrange. nVidia pushed for the highest price and people didn't go for it.
 
And every time it has been a reaction to the rest of the market.

Jensen likes to win. Generally speaking, he wants to win in all areas, perhaps other than price.
But he certainly always wants the best performance, and he wants to sell the most cards and have the most market share.

So, if he doesn't have that, he releases a refresh so that he can. I figured this was obvious

when has Nvidia actually been threatened by AMD GPUs in the last decade?...the 40 Super cards might be the only time, and even then it's because of raster performance and because Nvidia got greedy with their pricing (can blame COVID etc. for raising costs)...Nvidia is still crushing them with RT and DLSS

Nvidia mid-cycle refreshes are the norm, not the exception...it has nothing to do with the rest of the market...Nvidia has been on top forever
 
when has Nvidia actually been threatened by AMD GPUs in the last decade?...the 40 Super cards might be the only time, and even then it's because of raster performance and because Nvidia got greedy with their pricing (can blame COVID etc. for raising costs)...Nvidia is still crushing them with RT and DLSS

Nvidia mid-cycle refreshes are the norm, not the exception...it has nothing to do with the rest of the market...Nvidia has been on top forever
The implication there being they are bad with money then?

What advantage or purpose would it serve to use engineering resources, manufacturing resources, and distribution and packaging resources if you're already winning? Just continuing to sell the same stuff and putting the money in the bank to earn interest would be a better use.

Why isn't there a Super card for every card? Why is there no 4090 Super then? Or 4060 Super? Why specifically those 3 cards and nothing else? Why lower prices? Why make less money?
 
What advantage or purpose would it serve to use engineering resources, manufacturing resources, and distribution and packaging resources if you're already winning? Just continuing to sell the same stuff and putting the money in the bank to earn interest would be a better use.

Why isn't there a Super card for every card? Why is there no 4090 Super then? Or 4060 Super? Why specifically those 3 cards and nothing else?

Nvidia always holds back and only puts out cards they know will beat AMD and Intel...no point releasing a 4090 Super when there's nothing else close to it from the competition...I'm sure they could release a 4090 Titan if they wanted to but there's no need

why doesn't Sony or LG release the best TV ever made that is much better than anything else?...because you want people to continue upgrading on a regular basis...the 4080 didn't sell because of the price, it was a bad value, not because it was a bad GPU...the 4090 pretty much made the other high end step down models obsolete
 
Nvidia always holds back and only puts out cards they know will beat AMD and Intel...no point releasing a 4090 Super when there's nothing else close to it from the competition...I'm sure they could release a 4090 Titan if they wanted to but there's no need
Oh, so it is about competition then. You're contradicting yourself.
why doesn't Sony or LG release the best TV ever made that is much better than anything else?...because you want people to continue upgrading on a regular basis...
Actually, both of those companies try to release the best TVs they can right now. The competition is incredibly fierce at the top end of the TV market. But it's totally different as well, because the biggest process improvement they can muster year to year is 5%, which is very different from silicon, which can see 30%+ swings in a generation.

Also, people do not buy TVs nearly as often as any part of a computer, or computers themselves. At least not specifically top-end $3000+ ones. People on CXs, if they're smart, are not itching to upgrade unless they simply have money to burn. There isn't really a technological reason to do so.

Believe me when I say: if they could release a TV with 20% greater performance OR 20% less cost vs a competitor, they would. Because then they would literally have the entire market to themselves. People buying top-end TVs are much less brand loyal and generally want the best product available.
the 4080 didn't sell because of the price, it was a bad value, not because it was a bad GPU...the 4090 pretty much made the other high end step down models obsolete
Again, contradicting yourself. They are having to release updated models because of bad value. And it's only bad value because of relative cost vs the competition. "Bad value" cannot exist in a void. As annoying as it is to say, Intel's dominance during AMD's pre-Ryzen foibles and Intel's own 14nm++++ era could be "argued" to be bad value. Except it wasn't, because there was no alternative.

If the 7900XTX didn't exist, then the 4080 couldn't be considered a bad value vs anything. Exactly like the 4090 isn't a bad value vs anything, because there isn't anything else in that competing space, as you note. That was my point about nVidia not bothering with a 4090 Super. This is all directly related to competition.

Anyway, I'm done playing with you, and we're far afield. Buy a 4000 series card or don't. None of this is relevant to that, other than whatever argument you have about nVidia lowering pricing. Which personally I think is a waste of time to discuss and is patently clear, what with the Supers literally costing less money for more performance, but apparently it isn't.

EDIT: Just context. I did not edit content.
 
Also, people do not buy TVs nearly as often as any part of a computer, or computers themselves. At least not specifically top-end $3000+ ones. People on CXs, if they're smart, are not itching to upgrade unless they simply have money to burn. There isn't really a technological reason to do so.
$3000 top end….
You're almost halfway there.
 
$3000 top end….
You're almost halfway there.
Depends on how high you want to push.

In terms of pro OLED displays used on set by the director/DP, it's not even 25% of the way there.

I mean, you can also buy theater-grade projectors from Sony for $80k, too.

I was simply referring to generally top-end consumer displays such as the G3, A95L, and/or S95C. Which are definitely in the $3000+ (you missed the "+") range.

But no matter how you slice it, my point remains that TVs are not something that gets replaced every release or every other release like a graphics card. Unless, as an organization/person, you have more money than sense.
 
Look, I have said this before and I'll say it again: I'm just talking MSRPs here. Yes, I am well aware of what happened to street pricing and the gouging by retailers.
Plus, didn't the crypto craze happen shortly after the Ampere release - especially around the time the upper-tier 30 series cards were released? That's how I remember it, anyway. The "normal pricing" period was pretty brief.
 
do you really think that Nvidia is going to lower the price for their next-gen card lineup?...certainly not at the high end...if AMD and Intel can't deliver a worthy competitor then what incentive do they have to lower prices when it's been proven that gamers are willing to pay $1600 for the top of the line card
When do they lower their prices at all? Yet there are supposedly "improvements" or "enhancements" to their current gen - 4070 Ti -> 4070 Ti Super, etc. - and these are supposedly getting "discounts" or "price reductions"?!?
 
instead of just getting a 4070 Ti Super I'm debating upgrading my entire rig (CPU/memory/motherboard)...or maybe I'll just keep my 3080 and upgrade the system around it...or the third option is to keep everything as is and get a new home theater display - the Sony A95L (QD-OLED)...decisions, decisions

Depends on the games, but generally a GPU upgrade is a better real-world performance increase. Going from a 5700X to a 7800X3D provided some decent improvements in some games. In CoD MW2, the benchmark tool results were maybe 1-2 fps more; in actual multiplayer matches it seemed to bump frame rates up by 20-30. I am at 2560x1440, so if you're using a higher resolution those gains might be smaller.

The 4070 Ti Super may or may not be that big of a leap over the 3080, though. Although you do get 6GB more VRAM (16GB vs the 3080's 10GB).
 
I'll be using my 10GB 3080 until the 5000 series releases. Then I'll likely get a 5080.


Does anyone think that more powerful GPUs are going to become obsolete in the nearish future, as technologies like Epic Games' Nanite become more prevalent, which could possibly make a visually unlimited amount of graphical detail run well on any high-end GPU of today? Could Nvidia be seeing this possibility on the horizon, and therefore be transitioning to AI, and also be trying to squeeze the most profit out of GPUs now before the need for ever-more-performant GPUs vanishes and their gaming sales drop?
 
I'll be using my 10GB 3080 until the 5000 series releases. Then I'll likely get a 5080.


Does anyone think that more powerful GPUs are going to become obsolete in the nearish future, as technologies like Epic Games' Nanite become more prevalent, which could possibly make a visually unlimited amount of graphical detail run well on any high-end GPU of today? Could Nvidia be seeing this possibility on the horizon, and therefore be transitioning to AI, and also be trying to squeeze the most profit out of GPUs now before the need for ever-more-performant GPUs vanishes and their gaming sales drop?
I think we will see GPUs that are powerful in different ways, as they focus on the render methods being used by developers.

Which could be interesting; we may see little to no gain, or possibly regressions, in performance in older titles - still more than playable. Like, if you go from 350 fps down to 280 fps, would 99% of players notice?

But we'd similarly see larger gains in newer titles that the GPU architectures are essentially optimized for.
 
When do they lower their prices at all? Yet there are supposedly "improvements" or "enhancements" to their current gen - 4070 Ti -> 4070 Ti Super, etc. - and these are supposedly getting "discounts" or "price reductions"?!?
Yes and no. It is technically bringing the AD103 silicon, which was previously only in the $1200 4080, down to the $800 price point, and spec-wise it's only a little worse than the 4080, but definitely an improvement on the AD104-based 4070 Ti.

That said, the naming/branding of the card is still messed up. I don't recall cards with "70" in the name ever costing $800. (MSRP specifically; forget the 30-series shortage, mining, COVID, etc.)
 
Yes and no. It is technically bringing the AD103 silicon, which was previously only in the $1200 4080, down to the $800 price point, and spec-wise it's only a little worse than the 4080, but definitely an improvement on the AD104-based 4070 Ti.

That said, the naming/branding of the card is still messed up. I don't recall cards with "70" in the name ever costing $800. (MSRP specifically; forget the 30-series shortage, mining, COVID, etc.)
Also, the 4080 Super's MSRP is $1000, down from the 4080's $1200.

A previous card, the 4070, is also directly dropping in price, though I would say "not enough". At least not officially (it's a $50 price drop, but you can't say nVidia isn't dropping prices when they're dropping prices). What I suspect will happen is that the official price is "only" $50 less, but unofficially they'll have to drop the card $100 or more.

I also suspect the original 4080 and 4070 Ti, which do not have official price drops, will also have to drop in price to fall in line with the Super series. That is, until they sell out, of course. After all, who would buy a $1200 4080 when a better $1000 4080 Super exists? Or the same with the 4070 Ti vs the 4070 Ti Super at $800? The answer is you wouldn't, unless you literally don't know anything about graphics cards or their pricing.

So there is a new, lower starting point for at least one incoming card, and a price shift on previous ones. All of which are price drops (two of them official, and several unofficial).
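
For what it's worth, the arithmetic on the official cuts is simple (a trivial sketch; the $599 launch MSRP for the 4070 is from memory, since the post only cites the $50 figure):

```python
# Percent-change check on the official MSRP moves discussed above.
# The 4070's $599 launch MSRP is from memory; the post only cites "$50".
cuts = {
    "4080 -> 4080 Super": (1200, 1000),
    "4070 official cut": (599, 549),
}
for name, (old, new) in cuts.items():
    print(f"{name}: ${old} -> ${new} ({(new - old) / old:+.1%})")
# 4080 -> 4080 Super: $1200 -> $1000 (-16.7%)
# 4070 official cut: $599 -> $549 (-8.3%)
```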
 
I think that's kind of the point. If saving money, or maybe better said "bang for the buck," is at all important to your decision, I'd just say skip a generation between purchases.

You'll get a much bigger leap and better value moving from the 3 series to the 5 series than if you push to buy now. That's just IMHO. If you planned to buy the 4 series, I would've done it a year ago rather than pulling the trigger now, when you could've had maximum benefit for the longest period of time.
I'm going to wait for 5 series to upgrade. And I might even upgrade to AMD, who knows?
 
when has Nvidia actually been threatened by AMD GPUs in the last decade?...the 40 Super cards might be the only time, and even then it's because of raster performance and because Nvidia got greedy with their pricing (can blame COVID etc. for raising costs)...Nvidia is still crushing them with RT and DLSS

I now hate NVidia. I'm going to look for any excuse to go AMD next time around.
 
How do you know that?
It's fairly well known in general; there has been a ton of reporting on it, both direct and indirect.

To be clear: the reality is much closer to nVidia's cards slowing down in sales significantly. When I say "they aren't selling" I don't mean literally "zero"; I mean they aren't moving at the rates that nVidia would like, and certainly not at the rates necessary to maintain market share.

Both GamersNexus and MLID have industry contacts that include retail sales channels.

GN, as an example, literally walked around Taiwan and asked proprietors of hardware stores if 4000 cards were selling, and they said no. And it's the same for locations like MC/BB, where they have gone from store to store and noted that none of these cards are selling in large numbers. Some of this is opaque, as both GN and MLID also have contacts who don't want to be named, who state that at scale these cards have been sitting on shelves.

This is also reflected in sales data. There was a 7% swing towards AMD in 2023. Although this still means 80% of discrete is nVidia, a loss of 7% is still tens if not hundreds of millions of dollars in difference. A difference which, obviously, nVidia has no interest in ceding. The Super series is designed to cut off this trend. Otherwise, in 2024, if the market stayed the same (as in, if the Super series didn't exist), they'd simply cede another 7%, again something they don't want to do. It's obviously not a short-term trend, and it required intervention (the sales data reflects this; the drop was continuous month after month).

Then there is the obvious evidence that the Super series exists to correct not only the performance gaps in their stack, but also the price-performance issues in their stack and, to a lesser degree, the VRAM issues in their stack. As I noted with polonyc2 above, if the cards were already selling the way nVidia wants them to, there would not be a need to make a Super series at all. It's basic logic/deduction. Otherwise it's a waste of development resources, manufacturing resources, and channel resources to make a new series of cards, and it would simply be more profitable to put that money in the bank and earn interest. The only reason to make a new series of cards is to effect change in the competitive landscape (especially considering that they increase performance at the same dollar value, or are also lowering prices - two things that are not necessary unless they need to compete somewhere). Which obviously the Super series does.
 
I now hate NVidia. I'm going to look for any excuse to go AMD next time around.
My synopsis basically is: nVidia still makes the best cards, but not by the margins nVidia fanboys say they do, and they also run the company like a bunch of anti-consumer (and anti-business too, considering how much vendor lock-in they do) jerks.

Full breakdown:
When considering price to performance, AMD is always ahead on raster, and on price-to-performance RT they're equal. So a card like the 4080 has better top-end RT performance, yes, but the 7900XTX costs 20% less. So if you factor "20% less RT performance" into the 7900XTX, it's equal dollar-for-dollar RT performance to its nVidia counterpart.

So, if you know that basic principle, then dollar-wise you're still coming out ahead on AMD, getting more VRAM and, generally speaking, better raster dollar for dollar.
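
To sketch that dollar-for-dollar argument with actual arithmetic (the MSRPs are launch prices, but the RT frame rates are invented placeholders that simply encode the "20% slower for 20% less money" claim, not benchmark results):

```python
# Dollar-per-RT-frame comparison. MSRPs are launch prices; the fps values
# are placeholders encoding the "20% less RT for 20% less money" claim,
# not measured benchmarks.

cards = {
    "RTX 4080": {"price": 1200, "rt_fps": 60.0},
    "RX 7900XTX": {"price": 1000, "rt_fps": 50.0},  # ~20% cheaper, ~20% slower RT
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['rt_fps']:.2f} per RT frame")
# Both land at $20.00 per RT frame -- equal dollar-for-dollar RT performance.
```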

Software-wise, I think it's also much closer than nVidia fanboys want to say it is. AMD drivers are just as stable; both nVidia and AMD have had their flukes there, even recently. In terms of features, like the entire DLSS vs FSR debate, the differences are at best marginal at this point. It mostly comes down to whether a given game is using the latest version of FSR and whether the developers bothered to implement it properly, not whether the image quality is so different between the technologies. (Also, people's memories are apparently really short: DLSS 1.0 also sucked from an IQ perspective.)

I also think the marketplace is reflecting all of this. The 7800 XT is a sales darling, and more people have been willing to "try AMD", what with the dollar-per-performance value being as obvious as it is, and with more balanced, less biased channels like Daniel Owen and GN acknowledging what I'm saying: that yes, nVidia is ahead performance- and software-wise, but it's a lot closer than people say it is.
Daniel Owen especially, when discussing cards that cost less than $800, basically recommends an AMD card at every price point. It isn't until the 4080 that it becomes murky (as it's dependent on sales and current price), and then the 4090, which obviously gets top recommendations.

We'll see what happens in the coming months with AMD, because in order for their sales trend to continue, they have to react to the Super series launch by ensuring their price differential against it stays the same. That might mean the 7900XTX needs to drop to $850 and the 7900XT to ~$700. If AMD does that, I foresee the sales trends continuing towards AMD, which AMD of course wants.


(EDIT: A big part of AMD's "problems" is basically perception, related to both their software and their drivers. Because they released their frame generation basically in a beta state and said it was "ready", it really damaged consumer confidence in their tech. They should've called it a beta, and then the perception of it would have been really different: something to "try" that's not "fully baked yet". And honestly, AMD keeps making bad decisions like these. They always end up fixing them in the end, BUT by the time they do, the perception that their software is "bad" still lingers. So they need to be more patient, call betas betas, and launch stuff when it's fully baked.)
 
Why lower prices? Why make less money?
Sometimes if you lower prices, you make more total revenue or profit. Economists call that elastic demand. See this page: https://www.investopedia.com/terms/e/elasticity.asp Note that electronics are considered elastic. You are assuming that demand is inelastic.
 
Sometimes if you lower prices, you make more total revenue or profit. Economists call that elastic demand. See this page: https://www.investopedia.com/terms/e/elasticity.asp Note that electronics are considered elastic.
That would suggest that you're agreeing with me. (You're telling me "no, but" and then posting an economic principle that agrees with my statements.)

nVidia has priced their cards to the "elastic breaking point". That's not how economists would state it; they would say nVidia has reached a point on the demand curve where the percentage drop in demand for a given percentage price increase is high. This principle does not work in a void: the competitive environment, or as economists would say, "the availability of substitute goods", directly affects price elasticity. In this case that's AMD, who have priced their cards lower than nVidia.

Though top-end cards like the 4090 are "less elastic", being a top luxury good (referring to the economic principle), midrange cards, for people for whom "value matters more", are "more elastic". Which shows why cards like the 4080 haven't moved nearly as much as nVidia would like. And certainly the same could be said about the 4070 Ti and 4070.

In short, you could summarize what I said in the post you're quoting as: nVidia is losing sales to AMD due to price elasticity of demand. Or, to continue: nVidia is losing sales to AMD because AMD is a lower-priced substitute good occupying price points in the market that have higher elasticity of demand.
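
A toy worked example of that principle (every number here is invented for illustration; the point is just that when |E| > 1, cutting the price raises total revenue):

```python
# Price elasticity of demand: E = (% change in quantity) / (% change in price).
# Demand is "elastic" when |E| > 1, so a price cut increases total revenue.
# All numbers below are invented purely for illustration.

price_old, price_new = 800.0, 700.0    # a 12.5% price cut
qty_old = 100_000                      # assumed units sold at the old price
elasticity = -2.0                      # assumed: elastic demand

pct_price = (price_new - price_old) / price_old   # -12.5%
qty_new = qty_old * (1 + elasticity * pct_price)  # +25% units

print(f"old revenue: ${price_old * qty_old:,.0f}")  # $80,000,000
print(f"new revenue: ${price_new * qty_new:,.0f}")  # $87,500,000
```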

Edit: For more on this, I'd recommend reading the "Determinants" section of the price elasticity of demand wiki article, which describes the reasons a good is more or less elastic.


You are assuming that demand is inelastic.
I've made no such assumption. Quite the opposite. You took a single sentence I wrote entirely out of context - they were rhetorical questions responding to polonyc2's statements, which make absolutely no economic sense.
The obvious answer to each of those rhetorical questions, even in just the two you quoted, is that competition "exists" and nVidia wants to sell more cards. A point polonyc2 wasn't willing to admit. Hence why you're making statements that agree with me.

https://hardforum.com/threads/nvidi...4080-4070-super-cards.2031127/post-1045810539
https://hardforum.com/threads/nvidi...4080-4070-super-cards.2031127/post-1045810546



Edits in general: Spelling/grammar. Extra context.
 
That would suggest that you're agreeing with me. (You're telling me "no, but" and then posting an economic principle that agrees with my statements.)

nVidia has priced their cards to the "elastic breaking point". That's not how economists would state it; they would say nVidia has reached a point on the demand curve where the percentage drop in demand for a given percentage price increase is high. This principle does not work in a void: the competitive environment, or as economists would say, "the availability of substitute goods", directly affects price elasticity. In this case that's AMD, who have priced their cards lower than nVidia.

Though top-end cards like the 4090 are "less elastic", being a top luxury good (referring to the economic principle), midrange cards, for people for whom "value matters more", are "more elastic". Which shows why cards like the 4080 haven't moved nearly as much as nVidia would like. And certainly the same could be said about the 4070 Ti and 4070.

In short, you could summarize what I said in the post you're quoting as: nVidia is losing sales to AMD due to price elasticity of demand. Or, to continue: nVidia is losing sales to AMD because AMD is a lower-priced substitute good occupying price points in the market that have higher elasticity of demand.

Edit: For more on this, I'd recommend reading the "Determinants" section of the price elasticity of demand wiki article, which describes the reasons a good is more or less elastic.



I've made no such assumption. Quite the opposite. You took a single sentence I wrote entirely out of context - they were rhetorical questions responding to polonyc2's statements, which make absolutely no economic sense.
The obvious answer to each of those rhetorical questions, even in just the two you quoted, is that competition "exists" and nVidia wants to sell more cards. A point polonyc2 wasn't willing to admit. Hence why you're making statements that agree with me.

https://hardforum.com/threads/nvidi...4080-4070-super-cards.2031127/post-1045810539
https://hardforum.com/threads/nvidi...4080-4070-super-cards.2031127/post-1045810546



Edits in general: Spelling/grammar. Extra context.
Sorry if I took what you wrote out of context. I don't like it when other people do that to me, so I have to apologize.
 
I would like to see AMD do to NVidia what they have done to Intel.

It could happen. NVidia are in the same place Intel were not all that long ago: basically a virtual monopoly, on top of the world. AMD came out with Ryzen, which was a kick in the balls for Intel to say the least; nothing says they can't do it in the GPU space.
 
It could happen. NVidia are in the same place Intel were not all that long ago: basically a virtual monopoly, on top of the world. AMD came out with Ryzen, which was a kick in the balls for Intel to say the least; nothing says they can't do it in the GPU space.
Intel had barely been advancing their mainstream CPUs, though. Nvidia is not in that kind of position ;).
 
Intel had barely been advancing their mainstream CPUs, though. Nvidia is not in that kind of position ;).

At the time, Intel had basically written AMD off as a competitor; the company was almost bankrupt during that period. NVidia are in a similar type of position in the sense that they think they're basically untouchable, considering the RT uplift they have over AMD, which is quite substantial. Time will tell.
 
Intel had barely been advancing their mainstream CPUs, though. Nvidia is not in that kind of position ;).
How much of that was because they could not get their 10nm node stable and had to scrap 3 or 4 generations of CPU cores as a result, and how much was "we're on top and nobody can touch us", is up for debate.
Intel for a long time designed CPUs for specific nodes, and they tend to have their CPUs in the design phase for 3 or 4 generations before they come out. So when 10nm didn't happen, it screwed up everything else down the line, and they were forced to improvise.
AMD, on the other hand, could bounce between GF, Samsung, and TSMC as they saw fit. I mean, as things stand now it's TSMC or nothing to remain competitive, because nobody can touch them on the high end, but AMD had the choice all that time, where Intel was stuck.
Intel can now use TSMC for small parts of their chips, but neither then nor now do TSMC and Samsung have the capacity to take on the bulk of Intel's fabbing.

It's not at all an overstatement to say that Intel's five-year 10nm delay set them back 8 to 10 years of CPU development because of how integrated their production and design chains were.

Nvidia, though, is highly focused, and could easily go with TSMC, Samsung, or Intel to get their chips fabricated and packaged. Having TSMC fabricate the logic, Samsung handle the memory and controllers, and Intel handle the advanced packaging is very much on the table for them; if that is what they need to do to remain competitive, you bet they will. Nvidia sells its products by offering feature and performance improvements over the previous generation, and if half the data center rumors about Blackwell are true, then even facilities that spent hundreds of millions on Lovelace GPUs could see a reasonable ROI on upgrading to Blackwell. Nvidia doesn't have the luxury of sitting around just because they are in the lead, and they are more than aware of that.

If anything, we are on the weird end of Nvidia's cycle, with features we don't understand the point of and don't want to pay extra for, until suddenly they are usable because developers are using them. Once a feature is out in the wild, suddenly everybody else is playing catch-up to add it too.
 
Intel had barely been advancing their mainstream CPUs, though. Nvidia is not in that kind of position ;).
I mean...the 4090 is hardly "mainstream" for consumer parts either. I'd say the 4060 Ti and 4060 haven't advanced much at all over the 3060 Ti and 3060.

While it's true that there isn't a lack of performance progression for Nvidia at the high end like there was with Intel, the "mainstream" is definitely reaching a point where the price of entry is higher than ever and the actual "mainstream" segment has stagnated.
 
I mean...the 4090 is hardly "mainstream" for consumer parts either. I'd say the 4060 Ti and 4060 haven't advanced much at all over the 3060 Ti and 3060.

While it's true that there isn't a lack of performance progression for Nvidia at the high end like there was with Intel, the "mainstream" is definitely reaching a point where the price of entry is higher than ever and the actual "mainstream" segment has stagnated.
The 4060 and 7600 exist at a strange price point. The problem is that what consumers want to spend, or even can spend, at that entry point hasn't really changed in 6+ years, but costs have increased all around them, so that is sadly one of those deflation points where you get much less for the money.
Likely what will start happening is that both Nvidia and AMD work on the middle and upper ends of the stack (Nvidia will obviously claim the high and ultra-high end) and let previous-gen overstock fill in the price gap, while they release new low-end parts but, because of pricing, keep them undesirable and very low volume.
 
6 years ago, in 2018, I paid $435 Canadian + tax ($500) for the cheapest 1060 6GB around.

Right now the cheapest 4060s are around $400, 6700 XTs are $440, and 7600s are $360 (again in Canadian dollars) - cheaper in absolute dollars despite six years of inflation.

The price of entry is not necessarily higher than during the last two crypto bubbles, especially not the last one.
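
A quick sanity check on that (treating ~20% as a rough guess at cumulative 2018-2024 Canadian inflation, not an official CPI figure):

```python
# Rough inflation adjustment of the 2018 purchase. The ~20% cumulative
# 2018-2024 CAD inflation is an assumption, not official CPI data.
price_2018_cad = 435
cumulative_inflation = 0.20  # assumed
print(f"${price_2018_cad * (1 + cumulative_inflation):.0f}")  # ~$522 today
# vs. ~$400 CAD for the cheapest 4060 now: cheaper even before inflation.
```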
 
It's insane how lame nV's lower-tier cards are this 4000-series go-around. My 4090 rocks, but I've got another comp with a 3060 Ti, and the 4000 equivalent is an absolute joke.

These Supers are what the 4000 series should’ve been at release.
 