Brent_Justice said:haha, I will DEFINITELY test it! thanks for reminding me

You bet! One by one, the "can't crank it up yet" titles have fallen--BF2, F.E.A.R., etc. The current champ seems to be Oblivion, but I don't remember EQ2 ever being conquered. One good thing is that, with it being an MMORPG, any improvements will still be relevant rather than just a historical curiosity--not an "oh, everybody's already played that through to the end, who cares now?" sort of thing.
J-M-E said:Slot in this thread reserved for my new 8800gtx when it is released and I order it
InorganicMatter said:This is interesting:
* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)
Not only are there two separate memory modules, but it would seem as though they are completely isolated interfaces. Wonder what nV has in store.... Physics (although highly unlikely) is the first thing that jumps to mind, along with the dedicated AA processing you guys suggested.
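For rough context on what that split would mean, here is a quick back-of-the-envelope bandwidth sketch in Python. The 384-bit (256-bit + 128-bit) figure comes from the rumored specs above; the ~1.8GHz effective memory clock is purely an assumed number for illustration, not part of the rumor.

# Peak memory bandwidth from bus width and effective (DDR) clock.
# ASSUMPTION: ~1.8 GHz effective clock -- illustrative only, not from the rumor.
EFFECTIVE_CLOCK_HZ = 1.8e9

def bandwidth_gb_s(bus_width_bits, clock_hz=EFFECTIVE_CLOCK_HZ):
    # bytes per transfer * transfers per second, converted to GB/s
    return bus_width_bits / 8 * clock_hz / 1e9

print(f"256-bit partition: {bandwidth_gb_s(256):.1f} GB/s")  # ~57.6 GB/s
print(f"128-bit partition: {bandwidth_gb_s(128):.1f} GB/s")  # ~28.8 GB/s
print(f"Combined 384-bit:  {bandwidth_gb_s(384):.1f} GB/s")  # ~86.4 GB/s

If the two interfaces really are isolated, the smaller partition would have noticeably less bandwidth to work with, which is part of why the dedicated physics/AA guesses are interesting.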
johnjohn said:Why dont people consider ATI?
Fawkes said:None of this makes sense.
For one, why would nVidia allow a complete U-turn on power consumption and heat? The 7950GX2 draws less power and makes less noise than a stock X1900.
As the Chewbacca defence goes: it does not make sense.
johnjohn said:Why don't people consider ATI? You guys have seen benchmarks of X1900s owning Quad SLI at max settings, and the X1950 has higher memory and core clocks. Well, I love both ATI and Nvidia, but just saying.
kuyaglen said:Sure it does. Nvidia has seen the performance of their GPUs against ATI's, probably knows the performance of the R600, and has built up their offering to be competitive--hopefully the same jump in performance as was the case with the 6 series vs the 5 series, and the 7 series vs the 6 series.
Hope the new card is twice the performance of the 7900GTX.
Maybe it's also a sandwich card like the 7950 GX2.
MrWizard6600 said:sorry to break it to yeh folks, G80 will NOT be of the unified shader architecture. this is straight outta the PR reps of Nvidia themselves.
"our G80 will be more of a hybrid" --Nv PR.
"we will of course adopt the unified shader architecture when it becomes the best performing [or something to that effect]" --Nv PR. Sounds to me like they'll let ATI take the first step on this one.

Nvidia said that well over a year or two ago. I have read in more than one place that Nvidia has since changed those plans and gone with a unified architecture.
mashie said:I like the theory over at the Beyond3D forums that the G80 is a dual-die solution, where one die handles pixel processing and the other geometry processing.
Pixel processor with a 256-bit bus to 512MB of RAM.
Geometry processor with a 64/128-bit bus to 256MB of RAM.
It will be interesting to see if they are correct.
trek554 said:Nvidia said that well over a year or two ago. I have read in more than one place that Nvidia changed those plans and went with a unified architecture.

sorry to break it to yeh folks, G80 will NOT be of the unified shader architecture. this is straight outta the PR reps of Nvidia themselves.
Soodey said:Any chance of NVIDIA whipping out something DX10 for ~$300? At all? I want to wait for DX10 before going the upgrade route, but I don't want to spend $400+. Yes, I know this is a crazy rumor, and apparently it was taken down, but my question still stands:
When should we expect a DX10 card for ~$300?

Who knows? Maybe around April/May for the 8600 series.
MH Knights said:How big is this thing going to be physically? My 7800GTX is pretty tight in my case (lengthwise) and the X1900XT will be extremely close. Weight will also be an issue for some.
I take it by water cooling they mean heat pipes? Sort of like how Apple said the G5 Power Macs were water cooled. Maybe these specs are for a factory OC'd card.

They have to adhere to industry-standard case dimensions or they will suffer greatly from lost sales. Not many people will be buying GPUs that can't fit into existing cases.
Sovereign said:I bought mine for $489.99 each back in May....
kuyaglen said:That $650 price would most likely be $700+ when the first e-tailers get them...so $1400 for these in SLI. If I work some overtime I can clear just over 1k. Add in the $300-400 for Vista Ultimate, and definitely an X2 4400+ to upgrade my 3200+ Venice, and I should start saving.
Sneak said:Curious, I keep hearing that ATI's next one is going to be a beast, and that is based on how it looks on paper. So that ought to mean someone has seen the paper.
I know it is all mostly speculation at this point, but does anyone have any supposed specs on ATI's next one? Even if it is rumor, it would still be interesting to compare against these G80 specs.
Personally I am waiting on this G80 and hope it hits soon. I've been noticing of late that GX2 cards seem to be creeping down in price a bit and/or have some nice rebates. Makes me wonder if we are pretty close.

I don't think so; I keep hearing end of November, but you never know.
osirus35 said:Mid-November release... hmm, if I save up now...
Verge said:I doubt those specs are accurate...
If they are, then buying two of them for SLI would be such a waste right now; no games can even challenge them, and I've yet to see anything on the horizon that could either.
IMO, if the initial releases aren't GDDR4, I'd wait for the refresh... imagine 4GHz (DDR effective) memory.
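Just to put a number on that daydream (both the 384-bit bus from the rumored specs and the 4GHz effective GDDR4 clock are thread speculation, not confirmed), the same back-of-the-envelope math gives:

# Hypothetical refresh: 4 GHz effective (DDR) GDDR4 on the rumored 384-bit bus.
# Both numbers are speculation from this thread, not confirmed specs.
bus_width_bits = 384
effective_clock_hz = 4e9
peak_gb_s = bus_width_bits / 8 * effective_clock_hz / 1e9
print(f"Peak bandwidth: {peak_gb_s:.0f} GB/s")  # 192 GB/s

That would be more than double the figure from the earlier GDDR3-era sketch on the same bus width.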