Techpowerup review of GTX 280!!

Strange benches — it gets beaten by the GX2 and the X2 in S.T.A.L.K.E.R., and it isn't that much faster than the GX2 in most of the other tests.
 
I just hope so. Keep in mind what games we have with DX10 and what games are coming. DX10 puts more stress on the system, and I just hope the 280 can deliver better performance compared to the 1st-generation DX10 cards.
 
Yea, CoH DX10 definitely works a card much harder than DX9. Then again, it's an RTS, so it's a bit more CPU-bound.
 
Yea... my 8800GT definitely doesn't get 34fps with 4xAA and 16xAF at 1600x1200. Something seems fishy with that.
 
Gasp!!

Not much of an upgrade for us GX2 SSC boys.

For nVidia to put out a new card to take the 'Single Fastest Card' crown away from the GX2, I was hoping it would be by more than this....

If the 280 is faster than a GX2, it isn't by much?

Sad... I hope these are wrong.
 
Not bad. The GX2 beats the GTX 280 in some games, though that's not surprising since the GX2 is a dual-chip card. I hope future drivers will improve its performance.
 
It sure seems like it doesn't perform much better than the GX2 at high resolutions. Considering the GX2 is gonna be around $100-150 cheaper at release, I was expecting a little more. Maybe the drivers aren't final yet?
 
Remember though, nVidia had no problem selling us the GX2's, and stressed how it was the fastest single card.

http://kotaku.com/369475/new-nvidia-card-fastest-on-planet
They're calling it "bar none the fastest Graphics Card on the Planet"

Now I hope that if the 280 can't beat a GX2, they don't wuss out and change their tune to 'we can't really put it up against a GX2. It's not fair...'

It was fair when the GX2 took the crown away from a single Ultra.

The 280 needs to actually beat a GX2 in my book.
 
There is something odd about those numbers. One, Crysis at widescreen with 4xAA/16xAF getting over 60 FPS!? Not just the GTX but the Radeon cards as well. The other is their S.T.A.L.K.E.R. settings — I know from playing the game personally that you can't use the nVidia control panel to force AA/AF; it just doesn't work.
 
I'm more concerned with 1920x1200 or 2560x1600 and the average fps, not a one-time instance of the max.
 
The Crysis benchmarks were done through time demos, not in-game benchmarks :(

These results have to be fake. They must have done their benchmark inside of a room with 4 walls and a few items on the ground with an HDR light. This is quite upsetting.
 
The most interesting number is the power consumption. I guess Charlie from theinq was wrong. Big shocker there.

I'll wait for a mix of reviews and real-life [H]-style testing to evaluate performance.
 
Man, looking at their review, it seems like the 3870 X2 is really inefficient. Most of the time it gets like 5 frames more than the single card. What's wrong with that? Sucky drivers still?
 
That must be some super-duper special system; all the Crysis benches seem WAY inflated. I had a 3870, and I have the 8800 GTS 512 now, and neither card came close to the numbers they have in Crysis with 4xAA and 16xAF. The rest of my system specs are about the same as theirs.
 
Thread has been deleted

Sorry guys, we have to respect NDAs, I requested removal of the thread.

Please do not repost the info, it's not the final data anyway.

If you want to discuss our review methodology, how GTX280 reviews should be done or anything similar without using our NDA info feel free to continue.
 
IIRC their Crysis numbers are weird because they use their own custom timedemo for it, so you can only really compare them to their own numbers, since it's a different testing methodology.

Anyways, they're considered old #'s so we'll see some new ones soon I hope
 
I am pretty sure that the crysis numbers are wrong because for some reason, the guy's stuff was messed up and it was actually done on 1280x1024 resolution.
 
Why would they only record the max FPS value they are getting? We want average, dammit.
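For what it's worth, the difference is easy to see from a frametime log: average FPS is total frames over total time, while max FPS just reflects the single fastest frame. A rough sketch (the frame times below are made up, purely to illustrate):

```python
# Made-up per-frame render times in milliseconds, like you'd get
# from a FRAPS-style frametime dump. Not real benchmark data.
frame_times_ms = [16.7, 16.9, 40.0, 15.0, 33.3, 16.6, 50.0, 16.8]

# Instantaneous FPS for each frame: 1000 ms / frame time.
fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average FPS = total frames / total elapsed seconds.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
max_fps = max(fps_per_frame)
min_fps = min(fps_per_frame)

print(f"avg: {avg_fps:.1f}  max: {max_fps:.1f}  min: {min_fps:.1f}")
```

The max comes from one lucky frame, so it can look great even when half the run is a stutter-fest; the average (and the min) is what actually tells you how the card plays.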
 
Even with Dynamic lighting off?

In my experience, yes, even with it off. I remember playing around with all the settings to try to get the NV control panel to work with that game; it would just crash, glitch out, or run at like 16 FPS.
 

It was also fair when the 8800GTX took the crown away from the driver-plagued 7950GX2.
 