G80 Specs!

Brent_Justice said:
haha, I will DEFINITELY test it! thanks for reminding me
You bet! One by one, the "can't crank it up yet" titles have fallen--BF2, F.E.A.R., etc. The current champ seems to be Oblivion, but I don't remember EQ2 ever being conquered, and one good thing is that with it being an MMORPG, any improvements will still be relevant rather than just a historical curiosity--"Oh, everybody's already played that through to the end, who cares now?"--sort of thing.
 
This is interesting:

* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)

Not only are there two separate memory modules, but it would seem as though they are completely isolated interfaces. Wonder what nV has in store....:D Physics (although highly unlikely) is the first thing that jumps to mind, along with the dedicated AA processing you guys suggested.
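For scale, here's some quick back-of-the-envelope math on what that split bus would deliver. A sketch only; the 1800 MT/s effective GDDR3 rate is my assumption, not anything from the leak:

Code:
#include <stdio.h>

/* Rumored split bus: 256-bit bank + 128-bit bank.
 * The 1800 MT/s effective GDDR3 rate is assumed, not confirmed. */
int main(void) {
    double rate  = 1800e6;                 /* transfers per second */
    double big   = (256.0 / 8.0) * rate;   /* 32 bytes per transfer */
    double small = (128.0 / 8.0) * rate;   /* 16 bytes per transfer */
    printf("256-bit bank: %.1f GB/s\n", big / 1e9);          /* 57.6 */
    printf("128-bit bank: %.1f GB/s\n", small / 1e9);        /* 28.8 */
    printf("combined:     %.1f GB/s\n", (big + small) / 1e9); /* 86.4 */
    return 0;
}

Even the "small" bank alone would have roughly the bandwidth of a decent mid-range card today.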
 
Slot in this thread reserved for my new 8800gtx when it is released and I order it :D
 
J-M-E said:
Slot in this thread reserved for my new 8800gtx when it is released and I order it :D

I'll be waiting for the 9950GT when it goes down in price a bit...
 
InorganicMatter said:
This is interesting:

* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)

Not only are there two separate memory modules, but it would seem as though they are completely isolated interfaces. Wonder what nV has in store....:D Physics (although highly unlikely) is the first thing that jumps to mind, along with the dedicated AA processing you guys suggested.

AA was my first thought, but I have to wonder. IIRC, the X360's 10MB of super-fast eDRAM is for AA. So why 256MB? And it certainly isn't all that fast, given that it's on a 128-bit memory interface. (That's assuming the + denotes a separation, which I think is a pretty good assumption.)

Then I thought: AA doesn't need RAM that fast, and it doesn't need that much RAM, so what is going on? I think the secondary RAM is for all the extra post-processing (AA, AF, and maybe even DoF, etc.). That is my purely logical assumption based on my limited knowledge of video card architectures. nVidia may have come up with a way to implement a near-zero-performance-loss AA+AF system.
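To put a rough number on that: even a fully multisampled framebuffer at a high resolution is nowhere near 256MB. A quick sketch; the resolution and buffer formats are just my illustrative picks:

Code:
#include <stdio.h>

/* Approximate footprint of a 4x multisampled back buffer:
 * 32-bit color plus 24/8 depth-stencil per sample. */
int main(void) {
    long w = 1920, h = 1200;        /* assumed resolution */
    int samples = 4;                /* 4x MSAA */
    int bytes_per_sample = 4 + 4;   /* color + depth-stencil */
    double mb = (double)w * h * samples * bytes_per_sample
                / (1024.0 * 1024.0);
    printf("4x MSAA framebuffer: ~%.0f MB\n", mb);  /* ~70 MB */
    return 0;
}

Call it ~70MB at 4x, so a whole 256MB bank just for AA really would be overkill, which fits the post-processing theory.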

As long as the 8800GTX is faster than my 7900GT SLI at 1920x1080, I will be upgrading. 7900GTs just can't keep pace for me :(
 
Why don't people consider ATI? You guys have seen the benchmarks of X1900s owning Quad SLI at max settings, and the X1950 has higher memory and core clocks. Well, I love both ATI and Nvidia, but just saying.
 
johnjohn said:
Why don't people consider ATI?

Long story:
Though I have cards from both (AIW 9800 Pro, X800 Pro, X800XL, 7800GTX, 7800GTX KO, 7950GX2), my first real card was the GeForce 2 GTS. It served me very well, until I upgraded to the Ti 4600 and all its competitors were slain. I've done presentations on the company, and they were very accommodating to a nub student such as myself. They've sponsored a great deal of LANs that I've been to, and they throw one hell of a LAN party.


Short story:
Their marketing department is better. (sup Phil)

(It also doesn't hurt that I've won a 7800GTX, a 7950GX2, a BFG 650W PSU, and a BFG mobo.)
 
Fawkes said:
None of this makes sense.

For one, why would nVidia allow a complete U-turn on power consumption and heat?

The 7950GX2 draws less power and makes less noise than a stock X1900.

As the Chewbacca defense goes:

It does not make sense.

You get a smiley face for watching South Park :D
 
johnjohn said:
Why don't people consider ATI? You guys have seen the benchmarks of X1900s owning Quad SLI at max settings, and the X1950 has higher memory and core clocks. Well, I love both ATI and Nvidia, but just saying.

You're in the "NVIDIA" subforum; I bet you'd find plenty of ATI next-gen discussion in the "ATI" subforum.
 
kuyaglen said:
Sure it does. Nvidia has seen the performance of their GPUs against ATI's, probably knows the performance of the R600, and has built up their offerings to be competitive, hopefully with the same jump in performance as there was with the 6 series vs. the 5s and the 7s vs. the 6s.

Hope the new card is twice the performance of the 7900GTX.

Maybe it's also a sandwich card like the 7950GX2.

The 8800GTX won't be a "sandwich card", because then they would call it the 8800GX2.
 
Sorry to break it to ya, folks: the G80 will NOT be of the unified shader architecture. This is straight outta the PR reps at Nvidia themselves.

According to Nvidia, it's just not economical yet; while it sounds good on paper, currently it's just too expensive to research, and for now you're getting your traditional pipes.

"Our G80 will be more of a hybrid." --Nv PR.
"We will of course adopt the unified shader architecture when it becomes the best performing [or something to that effect]." --Nv PR. Sounds to me like they'll let ATI take the first step on this one.

lol! @johnjohn, way to bust out the flamethrower.
But I gotta respond to this...

"[we've] seen charts of Crossfire X1950s owning Quad SLI!"
...which were all proven false when, in fact, the X1950 gained only 0-10% more performance over traditional GDDR3...

I'm sorry: 2x R580 @ 675MHz core < 4x G71 @ 500MHz core.

And just so you don't think I'm an NV fanboy, I'm buying an X1900XT 256MB in 2 weeks.
 
MrWizard6600 said:
Sorry to break it to ya, folks: the G80 will NOT be of the unified shader architecture. This is straight outta the PR reps at Nvidia themselves.

"Our G80 will be more of a hybrid." --Nv PR.
"We will of course adopt the unified shader architecture when it becomes the best performing [or something to that effect]." --Nv PR. Sounds to me like they'll let ATI take the first step on this one.
Nvidia said that well over a year or two ago. I have read in more than one place that Nvidia changed those plans and went with a unified architecture.
 
mashie said:
I like the theory over at the Beyond3D forums that the G80 is a dual-die solution, where one die is for pixel processing and the other for geometry processing.

Pixel processor with a 256-bit bus to 512MB RAM.
Geometry processor with a 64/128-bit bus to 256MB RAM.

It will be interesting to see if they are correct.

I'm watching the conversation closely over there...
they're turning up many interesting possibilities.
 
This puts a new spin on things, although it's quite unrealistic. I wish there were something about new and improved AF instead of VCAA.
 
* Core clock up to 1.5GHz?
* 700M transistors?


Remember, guys, 65nm is not mature at TSMC yet, so... 700M transistors at 80nm? Going up to 1.5GHz?

"Cough**bullshit**Cough."


If you guys believe this, then you are out of your minds...
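For what it's worth, a crude die-area sanity check on the transistor figure. A sketch only; it assumes G71's published figures (roughly 278M transistors on a ~196mm² die at 90nm) and an ideal linear half-node shrink to 80nm:

Code:
#include <stdio.h>

/* Crude sanity check: scale G71's die to 700M transistors at 80nm.
 * G71 figures (~278M transistors, ~196 mm^2 at 90nm) assumed from
 * public specs; shrink factor assumes ideal linear scaling. */
int main(void) {
    double g71_xtors = 278e6, g71_mm2 = 196.0;
    double shrink = (80.0 / 90.0) * (80.0 / 90.0);  /* ~0.79 area */
    double est = (700e6 / g71_xtors) * g71_mm2 * shrink;
    printf("estimated die: ~%.0f mm^2\n", est);     /* ~390 mm^2 */
    return 0;
}

Even under ideal scaling that's pushing 400mm², which would be an enormous chip, and that says nothing about clocking it to 1.5GHz.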
 
Sorry to break it to ya, folks: the G80 will NOT be of the unified shader architecture. This is straight outta the PR reps at Nvidia themselves.

Well, in the XtremeSystems forums discussion they argued the same thing and made the distinction that this statement was made years ago, and that a unified shader architecture is completely possible. The point was also made that the G80 has been in development longer - it's supposed to be the basis for future video cards for the next year or two, and at least to outlive the G70/71 designs.
 
trek554 said:
Nvidia said that well over a year or two ago. I have read in more than one place that Nvidia changed those plans and went with a unified architecture.

K, well, I must have been reading recycled info, because that was in some article only a few weeks ago.

You got a link?
 
I think the extra stuff is for physics. It's their way of making the 3-card ATi thing look stupid. Nvidia has been focusing on 3 objectives lately:

  1. Beating ATi to the punch (releasing new-gen stuff first)
  2. Launching alongside a killer title (Crysis)
  3. Profiting on their engineering

They have been launching before ATi ever since the FX series. They always seem to launch their new stuff alongside one of the most graphically demanding games, in a way forcing you to upgrade: you can't really play Crysis with a 7900GTX or an X1950XTX because you can only play it at medium, while the new 8800GTX OC edition can run it at ultra high, thus giving your cyberpenis an erection. Third, ATi seems to be either losing money or profiting very poorly on their GPUs. Nvidia always seems to come out with a stopgap card that forces ATi to lower the price on one of their mid- or high-end cards to compete, thus making them either lose money or profit half of what they would have liked. It's not always the case, but it's what I have noticed lately.

The core clock and transistor count sound bogus. I'd believe a 950MHz core, overclockable to over 1GHz, and a transistor count of around 480 million. The memory will be GDDR3 or GDDR4; I think GDDR3 will be for the low-end parts. But who knows for sure? ;) Just 'cause I see it in my crystal ball doesn't mean it's true.
 
http://www.beyond3d.com/forum/showthread.php?t=33576&page=13, post 307 on....


B3D has some extreme doubts about the authenticity of this; however, there is a tendency to think of the extra bus width and memory as being due to D3D10's constant buffers: i.e., two separate RAM banks, one for conventional texture/buffer space and one for constant buffers, stream-out buffers, and the post-GS cache.
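To make the constant-buffer idea concrete: D3D10 groups shader constants into small blocks that get updated and rebound as single units, as opposed to the big texture allocations. A minimal C-style sketch; the field names here are hypothetical, just mirroring what such a block might contain:

Code:
#include <stdio.h>

/* Hypothetical per-frame constants, the kind of data a D3D10
 * constant buffer holds as one updatable block. */
struct per_frame_constants {
    float view_proj[16];    /* 4x4 view-projection matrix */
    float light_dir[4];
    float time;
    float padding[3];       /* keep 16-byte alignment */
};

int main(void) {
    printf("one constant block: %zu bytes\n",
           sizeof(struct per_frame_constants));   /* 96 bytes */
    /* Even thousands of these per frame total only a few MB,
     * so a small dedicated bank could plausibly cover them. */
    return 0;
}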


Seems like a few interesting ideas mixed in with some FUD. Who knows which way this thing is going to go.
 
Hmmm...I wonder why it disappeared!?!?!?!!?!

NDA finally caught up to them?

Oh the drama....
 
Any chance of NVIDIA whipping out something DX10 for ~$300? At all? I want to wait for DX10 before going the upgrade route, but I don't want to spend $400+. Yes, I know this is a crazy rumor, and apparently it was taken down. But my question still stands:

When should we expect a DX10 card for ~$300?
 
Soodey said:
Any chance of NVIDIA whipping out something DX10 for ~$300? At all? I want to wait for DX10 before going the upgrade route, but I don't want to spend $400+. Yes, I know this is a crazy rumor, and apparently it was taken down. But my question still stands:

When should we expect a DX10 card for ~$300?
Who knows? Maybe around April/May for the 8600 series.
 
How big is this thing going to be physically? My 7800GTX is pretty tight in my case (lengthwise) and the X1900XT will be extremely close. Weight will also be an issue for some.

I take it by water cooling they mean heat pipes? Sort of like how Apple said the G5 Power Macs were water cooled. Maybe these specs are for a factory OC'd card.
 
MH Knights said:
How big is this thing going to be physically? My 7800GTX is pretty tight in my case (lengthwise) and the X1900XT will be extremely close. Weight will also be an issue for some.

I take it by water cooling they mean heat pipes? Sort of like how Apple said the G5 Power Macs were water cooled. Maybe these specs are for a factory OC'd card.
They have to adhere to the industry-standard card dimensions or they will suffer greatly from lost sales. Not many people will buy GPUs that can't fit into existing cases.

Either that, or give away free proprietary-size conforming cases with each GPU purchase. :p :D
 
As soon as I can find a deal on the GTX for <$500, I'm jumpin' on that mother.
 
Sovereign said:
I bought mine for $489.99 each back in May....:)

Pretty sure he was talking about the 8800GTX... lol.

And unless they pull off a decent card for ~$300, I'm not gonna wait for the DX10 release if I can't afford it. I'll just jump on the 7950GT; it should serve me well enough until DX10 cards are actually affordable.
 
That $650 price would most likely be $700+ when the first e-tailers get them... so $1400 for these in SLI. If I work some overtime I can clear just over $1k. Include the $300-400 for Vista Ultimate, and definitely an X2 4400+ to upgrade my 3200+ Venice, and I should start saving.
 
I'll just buy ONE when they come out, OR one of the $999 Spyder Wicked Laser things...so hard to choose....:(
 
kuyaglen said:
That $650 price would most likely be $700+ when the first e-tailers get them... so $1400 for these in SLI. If I work some overtime I can clear just over $1k. Include the $300-400 for Vista Ultimate, and definitely an X2 4400+ to upgrade my 3200+ Venice, and I should start saving.

Too bad they stopped making the 4400...
 
Curious: I keep hearing that ATI's next one is going to be a beast, and that is based on how it looks on paper. So that ought to mean someone has seen the paper.

I know it is all mostly speculation at this point, but does anyone have any supposed specs for ATI's next one? Even if it is rumor, it would still be interesting to compare to these G80 specs.

Personally, I am waiting on this G80 and hope it hits soon. I've been noticing of late that GX2 cards seem to be creeping down in price a bit and/or have some nice rebates. Makes me wonder if we are pretty close.
 
Sneak said:
Curious: I keep hearing that ATI's next one is going to be a beast, and that is based on how it looks on paper. So that ought to mean someone has seen the paper.

I know it is all mostly speculation at this point, but does anyone have any supposed specs for ATI's next one? Even if it is rumor, it would still be interesting to compare to these G80 specs.

Personally, I am waiting on this G80 and hope it hits soon. I've been noticing of late that GX2 cards seem to be creeping down in price a bit and/or have some nice rebates. Makes me wonder if we are pretty close.
I don't think so; I keep hearing end of November, but you never know.
 
I just finished the November issue of PC Gamer. At the end they usually have a preview of next month's issue. This time they decided they needed to "make you guess", then gave the following hint: "Think exclusive review."

Now what this means to me is that since I'm reading November's issue in September, I'll get December's issue in October, and it'll have a review of Nvidia's next part. Therefore I conclude that Nvidia will release the G80 by mid-October.

Hardware review websites can push the info out immediately when the NDA expires. Paper mags cannot, unless there is a lot of planning up front. I think this is what I'm seeing here.
 
I doubt those specs are accurate...


If they are, then buying 2 of them for SLI would be such a waste right now; no games can even challenge them... and I've yet to see anything on the horizon that could, either.



IMO, if the initial releases aren't GDDR4, I'd wait for the refresh... imagine 4GHz (effective DDR) memory.
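For the curious, here's what 4GHz effective GDDR4 would mean on the rumored 384-bit bus. Purely speculative numbers on my part:

Code:
#include <stdio.h>

/* Speculative: 384-bit bus with 4 GT/s effective GDDR4. */
int main(void) {
    double bytes_per_transfer = 384.0 / 8.0;   /* 48 bytes */
    double bw = bytes_per_transfer * 4e9;      /* bytes per second */
    printf("384-bit @ 4 GT/s: %.0f GB/s\n", bw / 1e9);  /* 192 */
    /* vs. the 7900GTX: 256-bit @ 1.6 GT/s = 51.2 GB/s */
    return 0;
}

Nearly four times the 7900GTX's 51.2 GB/s.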
 
osirus35 said:
Mid-November release... hmm, if I save up now :)

Mon-ay already saved! :D I've got loot reserved for the top-of-the-line G80 card, whatever it may be, and a 30-inch LCD from BenQ, Dell, Samsung, or HP.

I'm gonna run Crysis at 2560x1600 with acceptable frame rates and eye candy, and that's all there is to it! :p
 
Verge said:
I doubt those specs are accurate...


If they are, then buying 2 of them for SLI would be such a waste right now; no games can even challenge them... and I've yet to see anything on the horizon that could, either.



IMO, if the initial releases aren't GDDR4, I'd wait for the refresh... imagine 4GHz (effective DDR) memory.

Oh yeah? What about Crysis running with all the DX10 eye candy at 2560x1600, huh?
I'm just hoping that the top-of-the-line G80 card can even HANDLE it adequately.
 