id Software's Official DOOM3 Benchmarks

Not like this wasn't predicted with D3 being an OpenGL game; Nvidia has always been faster in GL, nothing new there.
Guess the slideshow begins for those with last generation hw.
I fully expect to run better numbers than those... always do.
I mean, come on, who runs their card at stock clocks anyhow?
 
And what about the 10,000 games that will be based on the Doom 3 engine? Are they not worth the $ either?
 
Godmachine said:
One game gets a little better fps... OH NO MY X800XT IS CRAP!!! :p
Please, one game isn't worth 400 dollars...

ouch... somebody is butt hurt.
 
^eMpTy^
I know the forum caters to extreme hardware and performance. That's why I spend hours every day reading threads here. That is precisely my point, though: everyone is making such a big stink out of ONE benchmark test from ONE source, on ONE game. Hell, I remember a few replies back some guy was already looking to trade in his XT. To me that is ridiculous. The XT is still definitely a [H] piece of equipment. Everyone makes such a huge stink about FPS. In most games players wouldn't even notice a 10-15 FPS difference without the little numbers flashing in the upper-left corner. As far as real-world visual noticeability goes, it's very minute, if not nonexistent. As long as you're getting 20-30 FPS, most games run pretty smoothly. I've never in my life witnessed a noticeable change in the playability of a game going from, say, 40 fps to 50.
 
fallguy said:
Being shown on a certain card doesn't mean it was built for it. It was far and away the best card at the time. I remember it's been said many times that it was built around the GF3. No, I don't have a link, and if I'm wrong, please provide links.

I think the fact that it was demoed on a 9700 is sufficient proof that id is hardware agnostic. At any rate, the accusation that id is playing favorites is completely misdirected. Nvidia owns OpenGL, and they have known since its announcement that Doom 3 was going to be a big title for them, so they have simply built a better card to run Doom 3 on.

Basically, the only place to point the finger here is squarely at ATI. The XTPE has more than enough fillrate to beat the GT, but it doesn't, because ATI's OpenGL drivers suck. And that's ATI's fault. Nobody is playing favorites, nobody designed the game to run better on Nvidia's hardware. ATI just fucked up, plain and simple.
 
10,000 other games that will use the Doom 3 engine??? Yeah right, maybe a total of 10 before its lifetime is over. With the Far Cry engine already out and cheaper to buy and use, who do you think developers will go for? Half-Life 2 is almost out and that will be the most popular engine for sure. Don't get me wrong, Doom 3 is pretty, but just because it doesn't run fantastic on ATI cards doesn't mean it's worth selling one to get an Nvidia card. I have an X800 Pro and an eVGA 6800 GT, and between the two I prefer the GT... no fanboy here :cool:
 
Well, I'm not talking about either NVidia or ATi here.

I'm really impressed with the FPS numbers in general at 1600x1200. I was somehow sort of expecting to see both cards just barely scraping 30 fps.

Scares the bejeebus out of me that it was benchmarked with 4GB of DDR2 main RAM though. Did anyone else catch that? I bet the game is gonna hard-drive-swap stutter like hell on 512MB systems. People might be looking to upgrade to at least 2GB now more than getting a fast video card.
 
I just want to take a second amidst all the banter and thank Kyle and the [H] for getting us this info. I was one of the people hesitant to pre-order D3 because I wanted to see what video card I'd need and what to expect with my current setup. This gives me time to order everything I need and to do so with confidence. I'm really looking forward to the Doom 3 hardware guide they have coming out as well. Good job [H]! If you're going to hitch your wagon somewhere, hitching it to id seems like a hell of a smart move. I wonder how long they've been playing Doom 3 now in the bunker!!!
 
Operaghost said:
^eMpTy^
I know the forum caters to extreme hardware and performance. That's why I spend hours every day reading threads here. That is precisely my point, though: everyone is making such a big stink out of ONE benchmark test from ONE source, on ONE game. Hell, I remember a few replies back some guy was already looking to trade in his XT. To me that is ridiculous. The XT is still definitely a [H] piece of equipment. Everyone makes such a huge stink about FPS. In most games players wouldn't even notice a 10-15 FPS difference without the little numbers flashing in the upper-left corner. As far as real-world visual noticeability goes, it's very minute, if not nonexistent. As long as you're getting 20-30 FPS, most games run pretty smoothly. I've never in my life witnessed a noticeable change in the playability of a game going from, say, 40 fps to 50.

Well, you have to remember you're talking to a bunch of people who are overclocking to get every last fps out of their systems, trying to get to higher and higher resolutions and levels of image quality. So for a lot of people this is huge. And yeah, I agree it's somewhat silly, but it's to be expected; it just comes with the territory.

Also, a lot of people have been looking forward to Doom 3 for a long time. This isn't just any other title; this is almost guaranteed to be one of the best FPSes ever made, and it's the reason people pay $500 for a video card.

These benchmarks may be on ONE game from ONE source. But look at it like this: for the vast majority of people on this forum, Doom 3 is THE game and [H] is THE site. This is the benchmark of the game people want to know about, from the source they trust. The importance of this review isn't being overstated here. This is a serious blow to ATI.
 
Operaghost said:
^eMpTy^
I know the forum caters to extreme hardware and performance. That's why I spend hours every day reading threads here. That is precisely my point, though: everyone is making such a big stink out of ONE benchmark test from ONE source, on ONE game. Hell, I remember a few replies back some guy was already looking to trade in his XT. To me that is ridiculous. The XT is still definitely a [H] piece of equipment. Everyone makes such a huge stink about FPS. In most games players wouldn't even notice a 10-15 FPS difference without the little numbers flashing in the upper-left corner. As far as real-world visual noticeability goes, it's very minute, if not nonexistent. As long as you're getting 20-30 FPS, most games run pretty smoothly. I've never in my life witnessed a noticeable change in the playability of a game going from, say, 40 fps to 50.

Everyone's perception is different... 20fps for me is already unplayable, and 30 is the bare minimum. My best FPS range for gaming is 40-50, and I can certainly tell when it drops below 40... For you maybe there is no difference, but I do notice it. Of course, once fps exceeds 50 it's hard to tell, but anything below that is discernible.
 
gigantorebirdie said:
I just want to take a second amidst all the banter and thank Kyle and the [H] for getting us this info. I was one of the people hesitant to pre-order D3 because I wanted to see what video card I'd need and what to expect with my current setup. This gives me time to order everything I need and to do so with confidence. I'm really looking forward to the Doom 3 hardware guide they have coming out as well. Good job [H]! If you're going to hitch your wagon somewhere, hitching it to id seems like a hell of a smart move. I wonder how long they've been playing Doom 3 now in the bunker!!!

I second that emotion. You [H] boys fucking rock.
 
burningrave101 said:
Woohoo!!!! Doom 3 benches finally!!!

I tried to tell you guys the 6800GT would beat the X800XT PE in Doom 3. But NOOOOO, nobody wanted to listen to me :p.

God, I hate this guy. How did I know you would be one of the first to toot your and NVidia's horn. lol

Man, that is some big news though, I can't believe the difference between the cards. I am glad Fry's told me my X800 Pro was never coming and freed me up to order an eVGA 6800GT. :D I have had good luck with both ATI and NVidia and was having trouble sticking with the X800 Pro after the new review on HardOCP. If Fry's hadn't screwed me I would have switched anyway after seeing the DOOM 3 benches.
 
Hey guys, I work at Fry's Electronics in Burbank, CA.
We got about 2-3 boxes as of today, which translates into about 10 cards.
I just finished setting up a deal with aphex; he's building his new 'puter as we speak and wanted the

BFG 6800GT OC.

We have it for $399.99 + tax.
Worked it out with aphex and we're talking about $450 shipped.
If anyone's interested, we do have a few more on hand.
 
Godmachine said:
10,000 other games that will use the Doom 3 engine??? Yeah right, maybe a total of 10 before its lifetime is over. With the Far Cry engine already out and cheaper to buy and use, who do you think developers will go for? Half-Life 2 is almost out and that will be the most popular engine for sure. Don't get me wrong, Doom 3 is pretty, but just because it doesn't run fantastic on ATI cards doesn't mean it's worth selling one to get an Nvidia card. I have an X800 Pro and an eVGA 6800 GT, and between the two I prefer the GT... no fanboy here :cool:

There may only be 10 games that use the D3 engine, but the sequel to RTCW and Quake 4 are going to be built on it.

I'm sure there will be a zillion games based on the Far Cry engine, but how many tier-1 games coming out are gonna use it? None that I know of.

You're assuming that a lot of games will be based on the Source engine, and that's just that... an assumption.

You can try to downplay the fact that ATI sucks at Doom 3 all you want. Bottom line is, this is one more reason NOT to buy ATI. I'm not saying the ATI cards suck, but ATI's performance reputation is being chipped away rather rapidly, and Doom 3 just took a big chunk.
 
Does nobody here play with a 9800 Pro and 512MB of RAM? Does id Software honestly think that a common configuration is 2 gigs of RAM and a $500 video card? I work in retail and bring home scratch. A 9800XT is STILL $300, which is now considered a low-end budget card from the way the article sounds. With my setup I might as well not even bother paying $60 for this game when it comes out if I have to turn off all the eye candy to be able to play it.

Would have been nice to see benchmarks made on a real machine and not the dream machines that were used.
 
BTW, I'm beginning to think the R300 was a fluke. People forget that Nvidia had been on top until the R300.
 
4GB of RAM for the first set of benches? 2GB of RAM for the second? Custom timedemos? Come on...

This whole Doom 3 "benchmarking" seems to go against everything HardOCP has been trying to establish with their 'playable settings' approach.
 
Bah, NVidia only really had the GF line. 3dfx kicked both ATi's and NVidia's butts before that.
 
BoogerBomb said:
Does nobody here play with a 9800 Pro and 512MB of RAM? Does id Software honestly think that a common configuration is 2 gigs of RAM and a $500 video card? I work in retail and bring home scratch. A 9800XT is STILL $300, which is now considered a low-end budget card from the way the article sounds. With my setup I might as well not even bother paying $60 for this game when it comes out if I have to turn off all the eye candy to be able to play it.

Would have been nice to see benchmarks made on a real machine and not the dream machines that were used.

Man, I think you totally missed the point here. Your 9800 is gonna play D3 just fine. Even the alpha I saw way back in the day looked awesome at 800x600 on a GeForce 3.

Don't freak out over the amount of RAM the system had; I doubt that will make a huge difference. 512MB should do very nicely for you.

If anything, this article has reassured me that my current system will give me a decent experience with doom3, and I'm sure you'll have a good one too.
 
Well, I don't think it was their choice of systems:

"Today we are sharing with you framerate data that was collected at the id Software offices in Mesquite, Texas."

So in other words, the test system specs were probably controlled by id.
 
Wow, it's funny seeing all the ATI people try to make light of the benchmarks and say that they are tainted. ;)
 
512MB is even low for UT2004 IMO, unless you like waiting 2 minutes for a level to load (and then entering the game after all the key objectives have been fought over). And that's a DX7 game.

Nope, I can see people running to upgrade to at least 1 to 2GB pretty quickly.
 
TehQuick ---> Reread my post, buddy; you completely reiterated what I was saying: after approximately 40-50 FPS, it doesn't matter.

^eMpTy^ ---> "This is the benchmark of the game people want to know about from the source they trust"

I learned very early on that it isn't wise to take one source's story to heart. Instead, finding similar results from multiple sources is a more solid way of getting to the "truth".

"This is a serious blow to ATI"

I remember a while back, when the cards first came out, reading benchmarks on a variety of games in which the ATI cards shined above the Nvidia cards. Again, I didn't take it to heart because it was one source's findings on drivers that weren't even "official." So I don't see it being a huge blow. My bet is that within a month or two ATI will have new drivers that bring their OpenGL support up to speed.

I almost purchased a GT today but I don't feel comfortable with making my decision just yet. I'll be patient and not worry about regretting my purchase down the road.
 
Hopefully [H] does their own benchmarks on a wider range of systems to give people a REAL idea of what D3 will play like on their current machines, with at least a few screenshots from each system to give an idea of the eye candy. Above 60fps I can't tell a difference, so right now eye candy is the most important thing to me.
 
ZenOps said:
Bah, NVidia only really had the GF line. 3dfx kicked both ATi's and NVidia's butts before that.

Time for a history lesson.

Voodoo 1 beat the Riva 128

Two Voodoo 2s in SLI were faster than a TNT 1, but without 32-bit color, not to mention they cost a hell of a lot more. So you can almost score the TNT as a win for NV.

Voodoo3s sucked, Nvidia wins with the TNT 2 line.

3dfx goes bye-bye, ATI does some nonsense with the MAXX line (which sucks), and Nvidia wins again with the GeForce 1 DDR.

Nvidia goes on to make the GeForce 2 GTS and Ultra, then the GeForce 3, and later the GeForce 3 Ti 500. ATI isn't even competitive until the original Radeon and later the 8500, which some people score as a win for ATI, but then Nvidia released new drivers and won again; still, ATI was definitely in the running.

Nvidia wins again with the GeForce 4 Ti 4600.

Enter the 9700 Pro, and later the 9800 Pro and XT (aka R300), which kick the crap out of Nvidia thanks to their native 24-bit shaders vs Nvidia's 16/32-bit, which defaults to 32-bit when running native 24-bit DX9 code and performs like shit until they release a driver that recompiles the shaders on the fly and brings their performance back up to a respectable level, but they still lose.

And now we have the X800 vs the GeForce 6800, which is still open for debate...
 
CrimandEvil said:
Wow, it's funny seeing all the ATI people try to make light of the benchmarks and say that they are tainted. ;)

I know, right? It's not the end of the world, but I don't see how you can't just come out and admit that ATI just took one in the buttocks.
 
Godmachine said:
One game gets a little better fps... OH NO MY X800XT IS CRAP!!! :p
Please, one game isn't worth 400 dollars...

ONE game?

The performance for the X800XT PE is like that in nearly EVERY OpenGL game vs the 6800u.

Did you not see the benchmarks for Call of Duty and Neverwinter Nights where I showed the 6800NU outperforming the PE at 1600x1200 w/ 4xAA + 8xAF? lol

OpenGL is better than DirectX anyway. The only reason it's not used is that lazy programmers don't want to provide enhanced instruction sets by doing their own coding; they would rather use the pre-written instruction sets made by DirectX.

Quake 4, the Call of Duty expansion, KOTOR 2, and many other upcoming OpenGL games are going to be dominated by the 6800's :).

The 6800's and X800's are tied for performance in DX9. If nVidia can get some better AF optimizations through heavy brilinear filtering tricks like ATI does, then nVidia will hold both crowns instead of one crown and a tie.

At just 1600x1200 w/ no AF the 6800u beats the X800XT PE in nearly every DX9 game. That shows that the 6800's are currently the faster hardware, while the X800's make better use of AF.

Both cards are incredibly awesome compared to last year's, but nVidia is definitely moving in the right direction to hold the lead in the GFX department this year.
 
I predict HL2 benchies soon, courtesy of ATi pressuring the hell out of Valve (no pun intended).

As for the X800 owners: for adults capable of spending hundreds of dollars on hardware, you sure act childish after reading that the game runs fine on a GeForce MX.
 
Merauder|FX said:
I'll be picking up my BFG 6800 Ultra OC tomorrow; that's when the tech dude at Best Buy said they will be in. $529.98 after taxes.

Damn... I just went to Fry's and picked mine up for $399. After I sell my 9800 Pro to my friend for $100, it's only a $299 cost to me... Fry's had a whole bunch of the BFG 6800 GT OCs in stock.

Dougieha
 
I guess Kyle and the guys are failing in their attempts to show gamers that there is more to a game and hardware than just how fast a card can render it. Framerate and quality are two different things. Just because Nvidia has a higher framerate doesn't mean it is a better card. I would actually like to see game screenshots from each card and be able to look at the image quality differences between the two. What good is 60fps if it's rendering 60fps of crap?
 
Oh come on, let's not do this debate, but IMO:

ATi owned 2D after weeding out Cirrus Logic, Tseng, and Number Nine; NVidia hadn't even been born at that time. 3dfx made the early Rage and TNT chips look like they were standing still, but it didn't matter because 3dfx was 3D-only at the time (you needed both boards anyway). Everything up to the GF3 was owned by 3dfx. The GF3 and GF4 (but not the MX variants, which are the worst naming deception to date) definitively took the crown from 3dfx just as 3dfx was starting to get into the 2D side. ATi caught up with the Radeon 8500, but only after about six months of straightening out the drivers. ATi definitively beat NVidia with the 9700 Pro.

Now it's just about even again, with ATi doing better in D3D at high res/AA/AF and NVidia doing better in OGL at low AA/AF.
 
burningrave101 said:
ONE game? lmao

The performance for the X800XT PE is like that in nearly EVERY OpenGL game vs the 6800u lol.

OpenGL is better than DirectX anyway. The only reason it's not used is that lazy programmers don't want to provide enhanced instruction sets by doing their own coding; they would rather use the pre-written instruction sets made by DirectX.

Quake 4, the Call of Duty expansion, KOTOR 2, and many other upcoming OpenGL games are going to be dominated by the 6800's :).

The 6800's and X800's are tied for performance in DX9. If nVidia can get some better AF optimizations through heavy brilinear filtering tricks like ATI does then nVidia will hold both crowns instead of one crown and a tie.

At just 1600x1200 w/ no AF the 6800u beats the X800XT PE in nearly every DX9 game. That shows that the 6800's are currently the faster hardware, while the X800's make better use of AF.

Both cards are incredibly awesome compared to last year's, but nVidia is definitely moving in the right direction to hold the lead in the GFX department this year.

I'm just going to start cutting and pasting your posts to save myself some typing. I've been saying for a while that all the X800 series has going for it is AA/AF, which they do incredibly well. But now that even their optimizations aren't putting them decisively ahead, all the little things they suck at are starting to drag them down. And today, it's their OpenGL performance that is making them look bad.
 
BoogerBomb said:
I guess Kyle and the guys are failing in their attempts to show gamers that there is more to a game and hardware than just how fast a card can render it. Framerate and quality are two different things. Just because Nvidia has a higher framerate doesn't mean it is a better card. I would actually like to see game screenshots from each card and be able to look at the image quality differences between the two. What good is 60fps if it's rendering 60fps of crap?
Keep in mind these are benches from id, not the [H].
 
OriginalReaper said:
I predict HL2 benchies soon, courtesy of ATi pressuring the hell out of Valve (no pun intended).

As for the X800 owners: for adults capable of spending hundreds of dollars on hardware, you sure act childish after reading that the game runs fine on a GeForce MX.

Well, I'm sure at least some of them saved up for their X800 and are now feeling a little buyer's remorse. And besides, adults are just big kids anyway.
 
Congrats to NVidia - this benchmark just made me change my mind a week before I was going to order an X800 Pro.
 
AcneBrain said:
Keep in mind these are benches from id, not the [H].

I know, but the [H] readers are already rating the cards based solely on the fps numbers that id has provided. I would think that a few more would want to see IQ screenshot comparisons. Would have been nice if Kyle could have talked them into at least two (one for each card).
 
BoogerBomb said:
I guess Kyle and the guys are failing in their attempts to show gamers that there is more to a game and hardware than just how fast a card can render it. Framerate and quality are two different things. Just because Nvidia has a higher framerate doesn't mean it is a better card. I would actually like to see game screenshots from each card and be able to look at the image quality differences between the two. What good is 60fps if it's rendering 60fps of crap?

Now don't you think that if there was a dramatic difference in image quality, someone would have said something by now? Oh wait, I guess it's that huge Nvidia + id Software + HardOCP conspiracy covering up the "crap" Nvidia is rendering. :rolleyes:
 
I can't believe people are changing their minds because of a benchmark that was unreasonable (2 gigs of RAM?), and when the winner was "designed for playing Doom 3".
And the game is in OpenGL, which is only used in a few games. And I find it hard to believe that ATI users would be disappointed with the results, as Nvidia cards usually have better OpenGL performance than ATI's cards.
 