Nvidia Geforce 9800GTX Specifications released!

I don't feel sorry for you when you anger the invisible flying spaghetti monster and he comes for your soul with his noodly appendage.

You superstitious fool! Don't you know that all spaghetti evolved independently from primitive single-celled fettucini?
 
I don't believe it will be double the speed of an 8800 GTX. It would need a huge number of stream processors, which they haven't stated, plus high clock speeds and shader speeds.
 
High clock speeds are expected, though NVIDIA has shown us in the past that they don't really count on high clock frequencies to get performance, but rather on the efficiency of their architecture. G80 is definitely a good example of that. That said, even though it should be at 65 nm, I don't expect G90/G92 to be clocked at anything over 750 MHz.
As for the number of stream processors and their clock speeds, I'm betting on no fewer than 192 of them, clocked at something close to 2 GHz.
The 1 TFLOP of computing power is not a rumor, since it was confirmed by NVIDIA sources, so twice the computing power of an 8800 GTX, i.e. peak FLOPS capacity, is a given. The real question is whether that computing power is applied where it matters: games. That, of course, remains to be seen, since paper specs do not translate into actual performance.
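For what it's worth, here's the back-of-the-envelope math I'm using (just a sketch; the 192 SPs, ~2 GHz shader clock and 3 FLOPs per SP per clock are my own guesses from above, not confirmed specs):

def peak_gflops(stream_processors, shader_clock_ghz, flops_per_clock=3):
    # 3 FLOPs per SP per clock = MADD (2) + MUL (1), the way G80's peak is usually counted
    return stream_processors * shader_clock_ghz * flops_per_clock

print(peak_gflops(128, 1.35))  # 8800 GTX (G80): ~518 GFLOPS
print(peak_gflops(192, 2.0))   # guessed G92: ~1152 GFLOPS, i.e. roughly 1 TFLOP

So something like 192 SPs at around 2 GHz is roughly the minimum that gets you to the 1 TFLOP figure with G80-style counting.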
 
As long as the card isn't any longer than an 8800 GTX, I'll be happy. Mine barely fits in my Lian Li PC-60.
 
A rumor is a rumor and this is no different. It will be considered and discussed as a rumor until we know for a fact whether it's true or not.
And actually, most of the R600's rumored specs turned out to be true... Let's just wait and see what happens.

So I guess a comment on an article is a rumor now... wow. I should post some "rumored" G90 specs with no backup (just like this one) in some other forum and it would be considered one too? Even if I made it up and no one really knew!? :rolleyes: It's just funny to see the clamoring in this thread over some old rumor; it isn't even a widespread one at that, lol.
 
Of course you can. Rumors don't need anything to back them up. You can just say the usual "sources confirmed that..." or "a friend told me..." for it to be a rumor. That's the nature of a rumor actually.
 
We obviously know that teraflops can flop too.
Look at the CrossFire 2900 XTs that hit 1 TFLOP :p

Which is exactly why playing the waiting game is a bad move. You should buy what you can now and enjoy it. Otherwise, if you wait for the next big thing and it turns out it's not worth the hassle, your disappointment will be even bigger. R600 is definitely a good example of that.
 
Built-in audio chip.
Riiiiiiiiiiiiiight again. Even on the HDMI ATI cards there's no sound chip; it just uses a pass-through from the sound card to combine the audio with the video signal for output over HDMI.
I don't think that's correct. Even AMD's website says the 2900 features an "integrated HD audio controller". From what I can tell, even if you took out your sound card and disabled audio, you'd still get audio.
 
I just hope all this hype doesn't turn out to be a big flop. Too many expectations and you will surely be disappointed.
 
When talking about the number of FLOPS in the latest ATI cards, it wasn't a technical summary of hardware specifications, more a general term for how well the card would impress us.

One teraflop, that's right, it's one HELL of a flop, you heard it right here :)
 
You superstitious fool! Don't you know that all spaghetti evolved independently from primitive single-celled fettucini?

No, that was the FSM simply altering the universe with his noodly appendage to appear that way, all hail his noodlyness!
 
eDRAM = pure crap. PCs don't output at a fixed resolution, so the amount you'd need (and the cost) would be huge. If it really is going to be 512-bit (and fast GDDR4), bandwidth to RAM would be huge anyway (most likely over half that of Xenos's eDRAM).
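Some rough numbers behind that (my own sketch; I'm assuming 32-bit colour plus 32-bit Z per sample, and a 2.2 GHz effective GDDR4 speed for the bandwidth line, neither of which comes from the "leak"):

# Framebuffer size a PC card would have to hold in eDRAM, versus Xenos's fixed 10 MB.
# 8 bytes per sample = 32-bit colour + 32-bit depth/stencil.
def framebuffer_mb(width, height, msaa=1, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

print(framebuffer_mb(1600, 1200, 4))  # ~59 MB
print(framebuffer_mb(1920, 1200, 4))  # ~70 MB
print(framebuffer_mb(2560, 1600, 4))  # ~125 MB on a 30" panel with 4xAA

# Meanwhile a 512-bit bus with 2.2 GHz effective GDDR4 already gives:
print(512 / 8 * 2.2)  # ~141 GB/s, a bit over half of Xenos's 256 GB/s eDRAM figure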

This seems like the same shite that's happened with every bloody gen. Some geezer jumps around forums grabbing all the expected/desired buzzwords, wraps them up into a single post and calls it a "leak". It's a bit annoying NV hasn't let anything out about G92 yet* (still, no competition = no need), but this kind of crap just gets tiresome.

eDRAM.......pfffffft

Edit: * Nothing major, anyway. 1 TFLOP is touted, somewhere else I read 1 billion transistors, and DP (or is it dvda lol) for the non-gaming parts, methinks.
 
Glad I'm not the only one here (seeing through non-rose-colored glasses ftw).
 
Oh god, I keep seeing this card mentioned everywhere on PC forums. "Don't get an 8800, wait for the 9800 GTX, it's twice as good and it's coming out in November." Come on, people, don't believe everything you read...
 
I call BS on the one line saying it's going to have eDRAM, mainly because ALL the rumored specs have listed it as a feature since the release of the specs for Xenos (BTW it's R500, not R600, R520 or R580, and it has nothing to do with any of those cores).

It also claims free AA, but the eDRAM needs special instructions to use and only works at certain resolutions; the current implementation on the 360 has problems putting out 2xAA at 720p, so I definitely can't see this being used at all on a high-end card. Another reason is the physical constraints of the package: the core is going to be big and will require a lot of efficient placement, as well as a new way to apply a heatsink without many gaps, and eDRAM will only complicate that.
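
Here's the arithmetic I'm going by on that (a rough sketch, assuming the usual 8 bytes per sample for colour plus Z):

# Xenos has 10 MB of eDRAM; a 720p render target with 2xAA already overflows it,
# which is why the 360 has to split the frame into tiles instead of getting AA for free.
EDRAM_MB = 10

def framebuffer_mb(width, height, msaa, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

for aa in (1, 2, 4):
    size = framebuffer_mb(1280, 720, aa)
    print("%dxAA: %.1f MB -> %s" % (aa, size, "fits" if size <= EDRAM_MB else "needs tiling"))
# 1xAA: 7.0 MB fits; 2xAA: 14.1 MB needs tiling; 4xAA: 28.1 MB needs tiling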

So why would NVIDIA use something that's expensive and gives them no advantage, except at a resolution (which the developer has to specifically prepare for) that no one will use with these cards?

PS: the R600 series does have built-in audio capabilities; it requires no audio stream from external components, it's self-reliant and produces audio on its own.
 
Buying the CompUSA replacement plan was the best 20 bucks I ever spent. I am so turning my 8800 GTS in when the 9800 GTSes come out...
 
The audio coming out of the R600 is a Realtek codec chip; it's not cutting edge by any means. So having a separate audio card that pipes in the sound is still, in my mind, a better solution; NVIDIA should just work on a better interface with audio cards.

It pisses me off, people telling other people not to buy an 8800 GTX/Ultra because they think the 9800 is going to be out in a few months. There's still no hard evidence, which is baffling.
 
Never mind the evidence; these chips have been out for 8 months and counting, and we have only seen a price drop of about 25% or less. Just seems crazy.
 
I have to say, only Creative actually offloads audio processing from the CPU. Realtek sux ass; that's like a 1-dollar chip.
 