DOOM 3 Config Tweak

 
caesardog said:
Hey we have the same card exactly (MSI 5900 XT). Note that I have mine running at 425/763 (using MSI's Dynamic Overclocking -- so it doesn't overclock when I'm not gaming). I use MSI's 61.21 drivers. What drivers are you using?

I think you should be able to get to 763 on the memory clock (that's only a 9% increase from the default of 700).

I assume you are using coolbits?

I'm using Nvidia's 61.77 I think. When I try to oc past what I have it at now, coolbits tells me it has failed to run at those settings. Guess it's just bad luck :rolleyes:
 
Mine did that until I turned off ATITool. I'm not sure if it's that utility or the overclock.
I got memory artifacts in D3 at a lower speed than I can normally run as well (lower than in other games and per the 'find max speed' function in ATITool).


cornelious0_0 said:
This is sort of "in response" to the post above about the timedemo.......

Has anyone with a 9800XT noticed anything odd when running timedemo demo1? When I run it, it scores abnormally low for my setup, because sporadically throughout the run it will "get stuck" and pause, essentially bringing my minimum fps in the test to around 3. I'm gonna try it again withOUT the cachemegs changes to see if for some reason they had some effect on it, but right now I'm not sure what it could be.
 
Unraring the .pk4 files has been confirmed to do nothing good by id. They said all it will do is increase load times, as D3 preloads everything it needs from the .pk4s during load time - not during gameplay. I wish we could get some official word on the image_cache settings.
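For anyone wondering what those settings look like: the image_cache tweak floating around is a few seta lines in DoomConfig.cfg (or an autoexec.cfg). The cvar names are real Doom 3 cvars, but the values below are just the commonly posted ones, not an official recommendation -- tune them to your own RAM and video card:

```
// illustrative values only, not id-sanctioned
seta image_useCache "1"
seta image_cacheMinK "20480"
seta image_cacheMegs "128"
seta com_videoRam "128"     // set to your card's memory in MB
```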
 
Quote:
Originally Posted by kronchev
how do you do that?
WinRAR. Also, make a new folder and put the originals in it, just in case you ever need them again.
As I posted. :)
FiZ said:
Unraring the .pk4 files has been confirmed to do nothing good by id. They said all it will do is increase load times, as D3 preloads everything it needs from the .pk4s during load time - not during gameplay. I wish we could get some official word on the image_cache settings.
Weird thing is, you do actually get an increase in FPS.
 
Well, I used every tweak I found so far + unrarred the paks. I am running it real smooth on an AMD 1600+ XP overclocked to match an 1800+ XP at 800x600 and high settings! I was at medium and decided to try high...same framerates! It's great now! I bet it looks awesome at 1024, but I don't think I can do that :p

My system is real old too. AMD 1600+ XP, 512 RAM, GeForce4 Ti4600 128 RAM. It's sweeeeet!

It's spooky too! I like it, 2 thumbs up! Although AvP was scarier because of the little blips on the radar :D
 
Simon_Howes said:
As I posted. :)

Weird thing is, you do actually get an increase in FPS.


I concur, I actually did see a huge gain in FPS, whether id confirms it or not. I'm happy.
 
ehZn said:
I'm using Nvidia's 61.77 I think. When I try to oc past what I have it at now, coolbits tells me it has failed to run at those settings. Guess it's just bad luck :rolleyes:


Try using the 56.72 drivers. I tried both 61.xx drivers and I couldn't overclock my video card worth a crap. I went back to 56.72 and everything was fine with my 5950U.
For the 6800 cards use the 61.xx drivers to enable PS3.0.
 
The pk4 files are only referenced during level load time. Everything is precached into memory during that time and not continually throughout the level.

Generally speaking, accessing the files via the .zip system is a lot faster, as you may see level load times double or triple with everything unpacked.

Now, if there are some hard numbers that show there is an actual FPS performance difference then I'll sure have a look to see why, but it should have zero effect on FPS and a negative effect on load times.

robert...

Straight from the horse's mouth (Robert Duffy is the lead programmer @ id). Not to say you people don't get an increase in FPS; I figured I had better post it since I didn't before. I don't get any benefit, it only increases load times for me.
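Since the .pk4s are just renamed zip archives (which is why WinRAR opens them), you can verify the packing yourself. A minimal Python sketch; it builds a stand-in pak in a temp directory rather than touching a real Doom 3 install, and the pak name and member file inside are made up:

```python
import os
import tempfile
import zipfile

def unpack_pk4(pk4_path: str, dest_dir: str) -> list:
    """Extract a .pk4 (really a zip) and return the member names."""
    with zipfile.ZipFile(pk4_path) as pak:
        pak.extractall(dest_dir)
        return pak.namelist()

# Build a stand-in pak so the example is self-contained.
work = tempfile.mkdtemp()
pak_path = os.path.join(work, "pak000.pk4")
with zipfile.ZipFile(pak_path, "w") as pak:
    pak.writestr("textures/wall.tga", b"fake texture bytes")

members = unpack_pk4(pak_path, work)
print(zipfile.is_zipfile(pak_path))  # True -- a .pk4 is a plain zip
print(members)                       # ['textures/wall.tga']
```

This is the whole "trick" the thread describes: open the pak as a zip and extract it next to the archive (keeping the originals somewhere safe, as posted above).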
 
Mine did that until I turned off ATITool. I'm not sure if it's that utility, or the overclock.

I'm not using ATITool at all, and my 9800XT isn't overclocked, so I'm not sure what it could be. As of tomorrow afternoon I will not have a computer for a week or less because I'm selling my 9800XT locally, and my Leadtek A400GT is shipping to me on Monday.....yay!!! :D

ATi Cat 4.9beta is fast too.

Yes, I did see SLIGHT increases in fps in Doom, but literally ALL of my benchmark scores went up significantly (Aquamark3 up by 200, 3DMark2001 up by 150, 3DMark03 up by 50) which is nice, because I had yet to see a driver release that has raised ALL my scores, and by that much.....too bad I'm ditching ATI (for now) tomorrow. :p

I got memory artifacts in D3 at a lower speed than I can normally run as well (lower than in other games and per the 'find max speed' function in ATITool).

I don't have an actual link, but Carmack was quoted quite recently (slightly before the launch) saying that overclocking your video card may not prove AS successful with Doom3, as it has certain ways of....."making artifacts happen" for lack of better words. I'd be perfectly happy having a separate overclock just for Doom3 though, more fun testing for me. :)





Stupid question, but exactly WHAT do I do if I wanna try the .pk4 "trick" with my game? I can open the files with WinRAR no problem, but what do I do from there? I wanna try this out but it's not 100% clear to me what I should be doing.
 
System in sig, I only get 37.8 FPS in the timedemo...1280 at Medium detail, with the tweaks. what gives?
 
kronchev said:
System in sig, I only get 37.8 FPS in the timedemo...1280 at Medium detail, with the tweaks. what gives?

The same thing is happening to me, but my system is that much beefier to boot. I play at 1280x1024 High Detail and the timedemo (even at 1024 res) has horrible pauses and skips throughout it, which butchers my score and causes an average fps of 38.7.....when I KNOW that the game doesn't skip like that, something's messed up here.....how can I record my own timedemos to try that out?
 
caesardog said:
Run it 2 or 3 times -- discard the first run, as it is caching the textures then.

Your 2nd and 3rd runs should be more valid.

Well, with the system in the sig I'm getting an average of 24FPS in the timedemo. This is with High detail at 1280x1024. I get only marginal increases as I drop down the res/detail, to a highest of 30FPS at 800x600 Medium detail. With a P4 EE, 2GB of ram and a 9800XT that doesn't make much sense. I've done the tweaks and that's what I get. I've tried both Cat 4.7 and the beta D3 Cats. I've enabled triple buffering in OpenGL. Crazy huh?

Though in the game I have very few problems and my FPS stays much, much higher. I don't think the timedemo is an accurate judge of a system's performance at ALL. I stay between 30-60 at 1280 high detail with very few drops below 30...most aren't even noticeable when playing. I think the timedemo be hosed!
 
cornelious0_0 said:
The same thing is happening to me, but my system is that much beefier to boot. I play at 1280x1024 High Detail and the timedemo (even at 1024 res) has horrible pauses and skips throughout it, which butchers my score and causes an average fps of 38.7.....when I KNOW that the game doesn't skip like that, something's messed up here.....how can I record my own timedemos to try that out?

Run timedemo two or three times in a row without restarting Doom; the first run caches the textures. Timedemo goes from 36 on the first run to 50 on the third because it doesn't have to cache the textures.
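In console terms, the procedure above is just the following (the demo name and cvar are as shipped with the retail game; the FPS counter line is optional):

```
// open the console (Ctrl+Alt+~ by default), then:
com_showFPS 1
timedemo demo1    // run 1 caches textures -- discard this score
timedemo demo1    // run 2: representative
timedemo demo1    // run 3: should roughly match run 2
```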
 
Dr. Who said:
Run timedemo two or three times in a row without restarting Doom; the first run caches the textures. Timedemo goes from 36 on the first run to 50 on the third because it doesn't have to cache the textures.

I do remember running it twice in a row but nothing changed, 38fps both times.....it actually went down from 38.7 to 38 on the second run. I dunno, I'll go give it another shot real quick.

I think the timedemo be hosed!

That's what I'm starting to think too.....it just doesn't make any sense.
 
poopman said:
Well, with the system in the sig I'm getting an average of 24FPS in the timedemo. This is with High detail at 1280x1024. I get only marginal increases as I drop down the res/detail, to a highest of 30FPS at 800x600 Medium detail. With a P4 EE, 2GB of ram and a 9800XT that doesn't make much sense. I've done the tweaks and that's what I get. I've tried both Cat 4.7 and the beta D3 Cats. I've enabled triple buffering in OpenGL. Crazy huh?

Though in the game I have very few problems and my FPS stays much, much higher. I don't think the timedemo is an accurate judge of a system's performance at ALL. I stay between 30-60 at 1280 high detail with very few drops below 30...most aren't even noticeable when playing. I think the timedemo be hosed!

That is a little slow for your rig. I got a timedemo of 32 fps on the following rig at 1024x768, high quality:

PIV 2.8ghz
Gigabyte 8KNXP i875
1GB Kingston HyperX (forget speed, but it takes advantage of i875 bus speed)
ATI 9700 Pro AIW w/4.9 beta catalysts
2xWDigital 80gb drives Raid-0 stripe
Doom3 tweaks mentioned in thread (all three lines) engaged.

The only other thing I do (and I don't know if you do) is I boost the processing priority of Doom3 to high from normal. If you don't, ya might try that.

But you are probably right that the timedemo isn't accurate. My gameplay is fairly smooth on my rig at the above settings.
Mr.Dearthian
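If you want that priority boost without reaching for Task Manager every launch, a one-line Windows batch file (or a shortcut pointing at it) can do the same thing. The install path here is hypothetical -- adjust it to wherever your copy lives:

```
rem Launch Doom 3 at High priority (path is an example)
start "" /high "C:\Games\Doom3\Doom3.exe"
```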
 
mr.dearthian said:
That is a little slow for your rig. I got a timedemo of 32 fps on the following rig at 1024x768, high quality:

PIV 2.8ghz
Gigabyte 8KNXP i875
1GB Kingston HyperX (forget speed, but it takes advantage of i875 bus speed)
ATI 9700 Pro AIW w/4.9 beta catalysts
2xWDigital 80gb drives Raid-0 stripe
Doom3 tweaks mentioned in thread (all three lines) engaged.

The only other thing I do (and I don't know if you do) is I boost the processing priority of Doom3 to high from normal. If you don't, ya might try that.

Mr.Dearthian

I just did what Dr.Who did (and ran the timedemo 3 times) and my timedemo fps went from 32 to 37 (much smoother).

Mr.Dearthian
 
I tried the winrar/pak thing. It does significantly lengthen the load times. I put the regular paks back.
I did not notice any changes during the game itself.

This is during single player...not that f'n timedemo that means so little.
 
Maybe I did something wrong the first time, because THIS time when I re-run the timedemo once or twice without restarting the game my fps DOES take a nice jump. I went and ran timedemo demo1 at every resolution and every detail level.....with my 2.8C at stock, and then at 3.43GHz, to see the whole range of results. I'll be posting up my findings fairly soon in another thread.
 
cornelious0_0 said:
Maybe I did something wrong the first time, because THIS time when I re-run the timedemo once or twice without restarting the game my fps DOES take a nice jump. I went and ran timedemo demo1 at every resolution and every detail level.....with my 2.8C at stock, and then at 3.43GHz, to see the whole range of results. I'll be posting up my findings fairly soon in another thread.

Cool, can't wait.

Mr.Dearthian
 
haha...does it matter? ;)

Great review though. And it makes me glad to see my 5900xt manages to hang with your 9800xt fairly well.
 
cornelious0_0 said:
Here ya go you guys.....be sure to tell your friends. :p

Interesting findings. Thing is, I'm getting about the same results as you in the timedemo...but my actual in-game performance is much different (and yes, I've run the timedemo multiple times). I'm running 1280x1024 at high detail and I have yet to see a hitch or stutter in actual gameplay. The FPS counter stays at a steady 30 or 60 (I have triple buffering enabled), which is why I think the timedemo is not accurate.

Your results are still very valid insofar as they show that the 9800XT is probably the limiting factor at higher resolutions (and a huge CPU doesn't make much of a difference), but it could also be a factor of RAM...though given that I have 2GB of RAM and am getting approximately the same results as you, it must be the video card.

I intend to get a new video card (will be putting my 9800XT up for sale/trade along with my LCD) and a 20" LCD to support the monster that is D3. D3 has finally pushed me over the edge and I can no longer stand 25ms response time.

Now if only I could decide between the Dell or Viewsonic 16ms 20" LCDs...

-poops
 
ehZn said:
haha...does it matter? ;)

Great review though. And it makes me glad to see my 5900xt manages to hang with your 9800xt fairly well.

Are you sure about that? On 1024 x 768 (high quality), the timedemo fps is 25.9 for me (5900xt).

His 9800xt is about 20 fps higher.

What do you see on your 5900xt and high quality/1024 x 768?
 
poopman, I realise that there are obvious differences between the timedemo and real gameplay.....as at 1280x1024 my experience with the game has so far been much more pleasant than what I saw rolling by in the tests. ;) I was merely trying to find out and expose what the limiting factor(s) in my setup are, as there are a lot of people right about now who are going to be picking up cards in the same performance bracket as my 9800XT, with prices dropping and stock of new cards still fairly flaky. I consider the testing to be a success personally.

Not to downplay what Kyle and the others do with THEIR reviews, but I still don't believe in using FRAPS runs for my testing.....because I wanted to produce results based on tests that ANYONE can run, just to give people a better idea of where I'm coming from. I understand that Kyle and the staff are coming from a different angle here, and I respect what they do, I just wanted to put things in black and white for you guys is all. :cool:

Viewsonic all the way.

What a strange coincidence.....I've had my eyes set on the P95F+B for quite some time now, I've just gotta get the $400 together to drop on it. Can't complain about 1600x1200 @ 87Hz and 2048x1536 @ ~60Hz on a 19" CRT. :D
 
caesardog said:
Are you sure about that? On 1024 x 768 (high quality), the timedemo fps is 25.9 for me (5900xt).

His 9800xt is about 20 fps higher.

What do you see on your 5900xt and high quality/1024 x 768?

Did you make sure to run the timedemo 2-3 times in succession? That's what I had to do with my tests to get more accurate results, as the game doesn't cache/load stuff as it's playing -- which is essentially what the timedemo is doing the first time you run it.....when you run the demo the second time it's basically playing through the same way the game does, and it more closely resembles actual gameplay performance.
 
It helped me go from high to ultra quality...before, in UQ I had hiccups every so often; none now. Framerates did not seem to change much, as far as I could tell.

Forgot to mention:
1600x1200, 16x AA, 30-60fps depending on what's going on, feels smooth
 
cornelious0_0 said:
Did you make sure to run the timedemo 2-3 times in succession? That's what I had to do with my tests to get more accurate results, as the game doesn't cache/load stuff as it's playing -- which is essentially what the timedemo is doing the first time you run it.....when you run the demo the second time it's basically playing through the same way the game does, and it more closely resembles actual gameplay performance.

Yes - 25.9 is on the 2nd and 3rd runs. And I think that's to be expected on a P4 2.6 GHz (1 gig ram) with a 5900 XT.

A 5900XT is a 128 MB video card. Its stock speeds are 390/700.

I play the game at 800x600 (High quality) -- no issues. FPS on the timedemo for that is 36. I could play singleplayer at 1024x768 high, but I don't want to change resolution for multiplayer.
 
caesardog said:
Yes - 25.9 is on the 2nd and 3rd runs. And I think that's to be expected on a P4 2.6 GHz (1 gig ram) with a 5900 XT.

A 5900XT is a 128 MB video card. Its stock speeds are 390/700.

I play the game at 800x600 (High quality) -- no issues. FPS on the timedemo for that is 36. I could play singleplayer at 1024x768 high, but I don't want to change resolution for multiplayer.

I ignore timedemos because I don't think they reflect gameplay much, but I get a very playable 50 fps average or more on 1024x768 HQ noAA 8xAF. Remember though I'm running an oc'ed rig.
 
ehZn said:
I ignore timedemos because I don't think they reflect gameplay much, but I get a very playable 50 fps average or more on 1024x768 HQ noAA 8xAF. Remember though I'm running an oc'ed rig.

I agree with you on that, I don't go strictly by timedemos to judge my game performance, but they ARE a nice reference from time to time.....especially in those "mine's bigger than yours" computer conversations. :p I actually see timedemos and synthetic benchmarks as a chance to expose certain parts of our computers as bottlenecks, or identify a problem or issue we might be trying to get around. Of course I'm a big fan of the "whose is bigger" school of thought with benchmarking; as long as you guys know where I'm coming from on my tests, we're cool. :cool:
 
Yea...and I'm bigger. But your P4 C pwnz my B on the oc :(




Kinda like the bench press for weight lifting...gotta have some kind of comparable baseline test ;)
 
I know this may sound like shens, but on my system (in sig) I have turned the settings to ultra quality and 1024x768 resolution, and am currently getting about 30-40 fps. It does skip every once in a while, especially at first, but it runs fine after a few minutes of actually running the game. :D
 
cornelious0_0 said:
crap about how I'm right and stuff...

Hmm, maybe that came off wrong. Like I said, your tests were very insightful and helpful, but we're trying to make 2 different points. I'm saying the timedemo is hosed and not an accurate portrayal of in-game performance by any means (though I'm only in Alpha Labs 1, maybe by the time I reach Hell it'll be a different story). You're trying to point out what performance people should expect, and how at higher resolutions the 9800XT is video-card limited. We're both right, I think :)

And about the Viewsonic...when I first saw that you were getting together 400 bucks I was like WHAT! But then I realized you were after the CRT, whereas I'm after the LCD. I currently have a Sony SDM S91 19", 25ms, 600:1, 170 degree LCD. Great for stuff other than gaming...but D3 has gotten to me. I've read mostly good reviews about the Dell 2001FP (20" 16ms response) but also some bad stuff.

As to the Viewsonic VP201B, I've read a really good review in the display forums here but haven't heard so much as a peep anywhere else about it. Do a lot of people just go with the Dell and overlook the Viewsonic, or what? Anyway, this is probably a bit off topic so I'll shut up.
 
How do I enable/disable triple buffering? Is it in the ATI control panel or the game itself?
 