GTX 280 Crysis Benchmarks

first to reply.

That's awesome, looking forward to getting two of these.
 
Yeah it is. Plus SLI is a waste at that res; I don't think SLI even makes a single fps difference at that resolution.
 
What gets me is that we've said the GPU is the bottleneck in gaming until now. Are the tides turning?
 
With today's image editing software, yeah, a screenshot is really good proof :rolleyes:

How could someone who is relatively unknown (or totally unknown, like in this case) get to test one of those cards? Eh?

:rolleyes:
 
With today's image editing software, yeah, a screenshot is really good proof :rolleyes:

How could someone who is relatively unknown (or totally unknown, like in this case) get to test one of those cards? Eh?

:rolleyes:

Easy. Especially this close to launch. Any Newegg, Fry's, etc. warehouse worker could get their hands on it.
 
And these are the sorts of people who would show benchmarks, as official places like here have to follow the NDA.
 
Some serious funny business is going on here, especially since that test was SLI. I want to see a single GTX 280 run 1920x1200, DX10, very high, 16xAF. Actually, I take that back: I want to see a single GTX 260 run those settings. Only time will tell.
 
Wasn't DX10 supposed to get rid of the CPU overhead issues...? I wonder if this review was using DX9 very high or DX10.
 
Hey all,
I was planning on buying a single GTX 280 for a brand new rig. I want a CPU that won't hold it back and, since it appears the CPU can bottleneck, would it be smarter to get a dual core that can overclock to higher GHz? Or should I go with the easily overclockable Q6600? (No way I'm shelling out $1000 for the Extreme.)
 
Nope, wait till Nehalem if this guy is telling the truth; every Core 2 CPU will bottleneck this card. That QX9770 is no slouch, mate. Actually, isn't that the fastest CPU out right now?
 
Hey all,
I was planning on buying a single GTX 280 for a brand new rig. I want a CPU that won't hold it back and, since it appears the CPU can bottleneck, would it be smarter to get a dual core that can overclock to higher GHz? Or should I go with the easily overclockable Q6600? (No way I'm shelling out $1000 for the Extreme.)

Wait until a real and proper review of the card, mmkay?
 
Nope, wait till Nehalem if this guy is telling the truth; every Core 2 CPU will bottleneck this card. That QX9770 is no slouch, mate. Actually, isn't that the fastest CPU out right now?

The QX9770 and the QX9775 are tied for fastest. The only real difference between the two is that the latter is a Socket LGA 771 part and is SMP-capable; the former is not. They have identical clock speeds, bus speeds, L2 cache, and core counts.
 
So, dan D, if this guy is telling the truth, then no CPU today is capable of running these cards in their full glory? Much like plonking an 8800 card in with a P4 CPU?

Time for Nehalem? Are we now at a point where GPUs are outperforming CPUs, and we're in dire need of a faster CPU?
 
It's not the first time it's happened, mendoza. When the 6800 Ultra and X800 XT PE came out, many people who didn't have top-of-the-line processors had CPU bottleneck issues. Of course, that's a different story now that there's no processor out that will solve the bottleneck. I'm sure back then if you SLI'd 6800 Ultras you would have had the same problem, though.
 
This totally goes against what NVIDIA has been pushing, though. They keep touting that it's better to have a top-end video card and an "OK" processor rather than the other way around. Something tells me they aren't going to let the CPU bottleneck their flagship cards. Anyway, at that low res, of course the CPU is going to bottleneck, because it's not pushing the cards hard enough. I'm basically ignoring all info until the NDA is gone and someone who doesn't have an axe to grind with NVIDIA does the testing at 1920x1200 and higher.
 
Just to run a quick comparison: at DX10 very high settings, I just ran my 8800 GTS 640MB through the GPU test in Crysis at 1440x900 and averaged 25 FPS, so this would mean roughly a 40-50% boost from what I have now.
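For what it's worth, that boost estimate can be checked with quick arithmetic, plugging in the ~38 fps average quoted elsewhere in this thread for the leaked SLI run (the helper function is just for illustration, not from any benchmark tool):

```python
def speedup_pct(old_fps: float, new_fps: float) -> float:
    """Percent improvement going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# 25 fps on the 8800 GTS 640MB vs. the ~38 fps claimed for the leaked run:
print(f"{speedup_pct(25, 38):.0f}% boost")  # ~52%, a bit above the 40-50% guess
```

So if the leaked number is real, the jump would actually be slightly bigger than the 40-50% estimated above.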
 
Just to run a quick comparison: at DX10 very high settings, I just ran my 8800 GTS 640MB through the GPU test in Crysis at 1440x900 and averaged 25 FPS, so this would mean roughly a 40-50% boost from what I have now.

With 4x AA?
 
Ehh, I'm not so sure I believe these benchies. It's early, but my guess is that two GTX 280 cards running at 1440x900 would get more than 38 fps on average...
 
The low resolution in this test jeopardizes its validity. It's well known that differences in CPU speed are most pronounced at low resolutions, while differences in GPU speed show up at higher resolutions. Make this a test at a realistic resolution; then it might say something.
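That resolution point can be sketched with a toy model (all numbers here are made up for illustration, not measurements): treat frame time as the max of a fixed per-frame CPU cost and a GPU cost that scales with pixel count.

```python
# Toy bottleneck model, illustrative numbers only:
# frame time = max(CPU time per frame, GPU time per frame),
# where GPU time grows with the number of pixels rendered.
CPU_MS = 12.0              # hypothetical CPU cost per frame (resolution-independent)
GPU_MS_PER_MPIXEL = 8.0    # hypothetical GPU cost per megapixel

def fps(width: int, height: int) -> float:
    mpixels = width * height / 1e6
    frame_ms = max(CPU_MS, GPU_MS_PER_MPIXEL * mpixels)
    return 1000.0 / frame_ms

for w, h in [(1280, 800), (1440, 900), (1920, 1200)]:
    print(f"{w}x{h}: {fps(w, h):.0f} fps")
```

With these made-up costs, the two lower resolutions are CPU-bound and post identical fps, so they tell you nothing about the GPU; only 1920x1200 is actually limited by the card.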
 
Can't help but hear "Queen - I Want It All" in my head. Can't believe they had to change the power pins AGAIN. Really: standard Molex -> 6-pin -> dual 6-pin -> 6 AND 8-pin. Meh, I'll wait for the 55nm refresh of the big dawg.
 
Any test where the guy running it says his Core 2 Extreme is bottlenecking the system is probably messed up, especially since those benchies are lower than what you can get with a 9800 GX2 at that res.
 
What does a 9800 GX2 get for min/avg fps at 1440x900 with 4x AA? Maybe some of you have run Crysis at that res with those settings already, because I don't seem to recall [H] or any other site running fps tests with the GX2 in Crysis at that res.
 