Geforce GTX 200 launch on June 18th

Status
Not open for further replies.
Does every single internet thread have to end up in a flame fest?

*shrug* Seems like it; people are too sensitive and aggressive.

Anyway, not long to go... can't wait myself ;)... soon it'll all be irrelevant and we'll have posts of people with their hot new card off of newegg :p.
 
I can't wait, I have my $700 waiting for the card and I will be selling my 8800GTS OC soon as well....
 
Nice, wish I did :D. I'm working on it though... selling off my 360/etc. that I found myself rarely using, and the 2x 8800GT 512 setup I'm currently using, to help fund it.
 
My 360 actually blew up, so I couldn't sell it, and MS refused to do anything about it =( But the good part was that it caused me to reinvest in my PC and to grab a Wii, and both of those have been far, far more rewarding than that 360 ever was or ever could have been.
 
Fudzilla reports price cuts:
GTX 260: $399 (was $449)
GTX 280: $499 (was $649)
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7853&Itemid=65

Reading through all the recent reports, it seems the GTX 280 simply is not the uber-card many people hoped for. I guess the pre-release price cut is related to that fact.
The GTX 260 has been made cheaper because the Radeon 4870 is rumoured to catch up with it in some benchmarks, yet the new ATI card is only $349.

At the old $649 price, the GTX 280 would have to deliver impressive performance to justify the cost. That doesn't seem to be the case, so they simply cut the price by $150. Now there is less pressure on the price/performance ratio.
Just remember the 9800 GX2 when it was released: almost all reviewers said its performance didn't justify its price. A short time after launch, NVIDIA reduced the price to make it look more attractive.
 
http://www.theinquirer.net/gb/inquirer/news/2008/06/12/gt200-scores-revealed

THANKS TO NVIDIA'S shutting us out, we are not handcuffed about the GT200 numbers, so here they are. Prepare to be underwhelmed, Nvidia botched this one badly.

Since you probably care only about the numbers, let's start with them. All 3DMark scores are rounded to the nearest 250, frame rates to the nearest 0.25 FPS. The drivers are a bit old, but not that old, and the CPU is an Intel QX9650 @ 3.0GHz on the broken OS, 32-bit, SP1.

Numbers up

The short story: not enough, not nearly enough. It is barely faster than a GX2, yields are crap, and there will only be a syphilitic trickle of parts on launch, slowing to next to nothing after that.

Remember though, these parts are $449 for the 260, $649 for the 280, and they are barely faster than the ATI 770/4870. On price/performance, they lose badly, really badly, to the 770. On the high end, the R700 spanks them by wide margins, but those numbers will have to wait a bit.

If you are thinking that NV will put out a dual card, don't hold your breath, they are power limited, die size limited, cost limited and production limited.

They can't make it until the shrink in late fall, and even then it is questionable. Is the card quick? Yeah, it is decent. Is it good enough? Nope, not even close. This card is a dinosaur, too hot, too late. µ

[Image: nv_gt200_numbers.jpg]


This guy really knows how to act like an ass. :cool:
 
No, but people who still believe in 3x 8800 Ultra performance or other performance miracles should prepare themselves for a less impressive card. There are too many clues on forums and sites suggesting the GTX 280 isn't the most impressive gfx card ever.
 
Yet another stupid statement from the Inquirer. Lame. But we'll see how the GTX performs against the new ATI.

I must say, if the 280 GTX isn't the monster card we hoped for, that means NVIDIA got really lazy; I know they can do better. Like I said before, they are waiting for ATI to catch up to them.
 
Inquirer numbers...


I do not think those are true. I just checked some old benchmarks, and the 35 FPS at 1920x1200 is the same as the GX2 and 8800 SLI numbers... In fact, most reviews put the GX2 in the high 30s to low 40s on high settings at 1920x1200.

I do not think nvidia would release a card with those numbers....
 
They would not release a single-GPU card that is as powerful as 2 of its previous-generation GPUs? :confused:
 
Actually, it's at around the same price point, and to the less [H] user it would look like a newly released card costing more money (if the GX2 drops in price) for the same performance.


They are both "cards", and whether they have 4 GPUs on them or 20, many people don't know the difference.
They BOTH eat power and cost a lot.

If they have the same performance it would be a major fail for nvidia.

I think the Inquirer numbers are BS. If they are not, does anyone want to buy the Q9450 system I just built for the GTX 280? :D
 
One word to reviewers: make sure you check retail prices with partners before you quote price/performance numbers. NV has a dirty-tricks campaign lined up here; we told you they would have to drop the price when they saw the 770 numbers, and they did.

There are also a bunch of new things on the NV press FTP site, including 177.34 drivers, up from the 177.26 we tested with. We would be shocked if these were not special press-tweaked drivers, so beware of scores tested with these last-minute releases.

Standard-policy trashing of NVIDIA in every article, brought to you by the Inquirer.
 
No wonder NVIDIA doesn't invite them to anything. They don't even make an effort to behave and earn NVIDIA's trust...
 
Has anybody saved any past Inquirer articles on NVIDIA to show how wrong they are? If not, I hope somebody saves these new articles so they can rub it in their faces later (if NVIDIA pulls through, that is :D )
 
Because in games where SLI doesn't work well (Age of Conan at the moment, for example), a GTX 280 that matches a GX2 in games that scale well with SLI will be twice as fast as a GX2 in games that don't.

Oh, and because when you put two of them in high-res, high-AA environments, they won't drop to 0 FPS because of the 256-bit bus on the GX2.

Oh, and because you can put 3 of them in TRI SLI, something you can't do with the GX2.

Oh, and the Inquirer's numbers are always ATI-biased. They didn't even bother to use the latest drivers, and they even admit it in the article.
 
Will I see more of a benefit if I mostly run games at 2560x1600, because of the memory and bandwidth? (vs a GX2)
 
Man, I hope so. I just want to be able to play my Oblivion game with QTP3, FCOM, and a myriad of other mods @ 2560x1600 without any micro-stutters outdoors (which hasn't been possible in most areas on my current setup).

What I'm really looking forward to, since the stock clocks are pretty low, is seeing what kind of overclocking potential these cards have. :cool:
 
If you're talking 2 GTX 280s vs 2 GX2s, then most likely yes, in almost every situation. We won't know for sure until the cards come out, of course, but that is the expectation.
 
Then why have their patches improved framerates all-around?

That's a stupid thing to say. The patches just helped SLI. The game is coded excellently, since no other game can match its graphical fidelity. It's funny that because a game is too demanding for any PC, people call it poorly coded, despite the fact that it crushes the competition. :rolleyes:
 
No, but people who still believe in 3x 8800 Ultra performance or other performance miracles should prepare themselves for a less impressive card. There are too many clues on forums and sites suggesting the GTX 280 isn't the most impressive gfx card ever.
Only because they cite specs and performance that are impressive and then merely assert, via extraneous verbiage, that NV has "blown it."

A single-GPU card that is as fast as or faster than the two prior cards in SLI and/or its current competitor's top-end offering, which uses two GPUs?

That's fast. No matter how you slice it, or try to slant or spin things, that's fast. Leave it to the Inq and Fudzilla to try to obfuscate this fact behind a bunch of childish ranting and wrong-headed assertions.

Whether the price is worth it is a personal decision. But the chips are fast. And with the rumored price drop... well... what ridiculous overblown hyperbolic nonsense will the Inq and Fud use then?

Well, I suppose they can just continue doing the same thing, demonstrating very impressive speed and just bizarrely asserting: "Not good enough." :rolleyes:
 
I must say, if the 280 GTX isn't the monster card we hoped for, that means NVIDIA got really lazy; I know they can do better. Like I said before, they are waiting for ATI to catch up to them.
.... I really don't think that's the case at all.

NVIDIA and AMD both have multiple teams working on next-gen architectures. The GT200 was likely in development as soon as the G80 taped out for the first time, well before NVIDIA could have known that the R600 would flop so badly.

This whole idea that NVIDIA is twiddling its thumbs so that AMD can catch up is utterly ridiculous, and it blows me away how often I hear people say it. They didn't get lazy; they ran into poor yields and lower-than-expected clock speeds. Adding complexity without adding latency isn't exactly a walk in the park.

As for that Inq article, I think the author is really downplaying the performance of the GTX 280. Surpassing the SLI speeds of the last generation is not bad at all. Compared to what we got with the 9800GTX (i.e., nothing), it's a welcome improvement.
 
Just put 2 GTX 280s in a system side by side with 2 G92 GTSes or 2 8800GTXs, and see what happens. Seriously, why are people so hell-bent on comparing two last-gen chips with one next-gen chip? If you want to use two of the last gen, fine, do that. Just also use two of the next-gen chips. You'll see the double performance.
 
The Crysis FPS is what got my attention. (Hope they are wrong)

There are already other Crysis numbers floating around which, imo, I'd put a good deal more stock in than The Inquirer's. Those numbers indicated 36.7fps at 1920x1200 at Very High settings. Maybe that test was with the new drivers that The Inq overlooked ;)

Anyway, the price drop from $650 to $500 could reflect a number of things. It could reflect a sense that the $650 market is not as strong as it used to be, or that because of cards like the 9600GT and 8800GT people are finding it harder to justify. It may also be because the HD 4870 is a little closer to the GTX 260 than they'd like. Even if the latter is true, it is no comment at all on the GTX 280's performance but rather a comment on the HD 4870's. And do note that from ATi's own slides it looks like the HD 4870 1GB will be $430, and the 512MB version looks like it will be $330, so keep that in mind as you compare the GTX 260 and HD 4870 (also keep in mind that the GTX 260/280 are expected to hard launch June 17th, whereas the HD 4870 is only supposed to soft launch a week later).

And hell, nVidia bringing down the price of the GTX 260 and GTX 280 could also be because of their mid-range offerings, or the lack thereof atm. If they feel ATi's HD 4850 and HD 4870 really have the potential to take down the 9800GTX, nVidia may be planning to drop its price below that of the HD 4850 and thus has also moved down the prices of the GTX 260 and GTX 280 to leave a smoother price scale and to encourage more people to opt for one of the new GTX 260s or GTX 280s over the 9800GTX, HD 4850, or HD 4870.

Simply put, there are a lot of factors that affect nVidia's pricing strategies, and immediately pulling out 3DMark numbers and Crysis framerates that blatantly conflict with others already released is not going to get us to the root of nVidia's decision. Or, perhaps more to the point: how many times have we seen people here, [H] members, posting in these threads and saying that they're not looking to spend over $400 on a card? Is it not possible that nVidia recognizes this?
 
There's no way those Crysis numbers from the Inquirer are right for 1920x1200. Only 36 fps for the GTX 280, on High, with no AA? If those numbers were for Very High, they would carry a bit more weight, but that's just ridiculous. It sounds like a number the 8800GTX would get.
 
The Crysis numbers are just as suspect as the 3DMark scores. We have no idea how the FPS measurement was taken, and you know it probably wasn't done in the fair and objective manner you're used to with [H]. Come on, this is Charlie @ the INQ we're talking about here.
 