Venice 3000+ and 7800GTX, bottleneck?


nösferatu

I need help with my gaming rig. I run Win XP.

Venice 3000+ @ 2.6 GHz (288x9) on an ABIT Fatal1ty AN8-SLI; my 1 GB of RAM runs below 200 MHz, not sure of the exact speed. These are two-year-old parts.

Last year I bought an MSI 7900GT, but it has been on RMA for 3 months(!), so I'm stuck with an MSI 7800GTX borrowed from the local MSI office. Recently I've been playing Titan Quest and Marvel: Ultimate Alliance.

Using FRAPS, I noticed low framerates: 19-29 fps in MUA with all settings maxed @ 1024x768.
The framerate in Titan Quest is only slightly better @ 1152x768, all maxed.
So I began to wonder: is there some sort of bottleneck in my rig, or is it simply that the
7800GTX doesn't have enough power to play my games at those low resolutions with the graphics settings maxed?
 
With your processor at 2.6 GHz there's hardly a bottleneck. I'm not sure what's going on, but you should be getting higher framerates than that. When you say all settings maxed, are you talking about AA and AF as well?
 
Sorry, I forgot to mention it.

Both games use in-game AA (I think it's 4x) and 16x AF forced through ForceWare.
In Titan Quest I've tried lowering the AA and AF levels, even turning them off, but there was no considerable performance gain; that's why I started thinking about some sort of bottleneck.
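
In other words, the rough rule I'm going by (just a sketch; the FPS numbers and the 10% threshold below are made up for illustration):

# If average FPS barely moves when GPU-side work (AA/AF, resolution) is
# reduced, the limit is probably the CPU or the game engine, not the card.
def likely_bottleneck(fps_maxed, fps_reduced, tolerance=0.10):
    gain = (fps_reduced - fps_maxed) / fps_maxed
    return "CPU/engine-bound" if gain < tolerance else "GPU-bound"

print(likely_bottleneck(24, 26))   # ~8% gain  -> CPU/engine-bound
print(likely_bottleneck(24, 45))   # ~88% gain -> GPU-bound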

Thanks for your kind reply.
 
Wow... only one reply so far?
Is there something wrong with my post, or does simply nobody care about my problem?

Sorry, I'm a noob anyway.
 
I don't think there would be much of a bottleneck with the CPU OC'd to 2.6 GHz.
 
I just checked yesterday: according to CPU-Z, my memory runs @ 185 MHz. Also note that my 7800GTX runs @ 510/1350. Is asking for 4x AA / 16x AF @ 1024x768 too much for my PC?
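
For reference, here's roughly how those clocks work out; the DDR266 memory divider below is my assumption based on the 185 MHz reading:

import math

htt = 288                    # HTT reference clock from CPU-Z (MHz)
mult = 9                     # CPU multiplier
cpu = htt * mult             # 2592 MHz, i.e. the ~2.6 GHz overclock

# On socket 939 the memory clock comes off the CPU clock via an integer
# divider: divider = ceil(multiplier * 200 / BIOS memory setting).
mem_setting = 133.33                            # DDR266 setting (assumed)
divider = math.ceil(mult * 200 / mem_setting)   # ceil(13.5) = 14
mem_clock = cpu / divider                       # ~185 MHz, matching CPU-Z

print(cpu, divider, round(mem_clock, 1))        # 2592 14 185.1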
 
I had issues with MUA on my 7800GTX too. I could not get a fully playable framerate in that game maxed out on my last system (Opty 170 @ 2.8, 2 GB RAM, 7800GTX), even at 1024x768 with no AA and no AF. I tried finding tweaks to fix it, but there was next to no info anywhere for the PC version. I think it was a bad port, but then again it could just be too much for the 7800GTX. I played it through on the PS3, and I thought it looked far superior on the PC; it looked amazing.

I didn't have an issue with Titan Quest; I played it maxed out @ 1680x1050. I did of course notice some slowdown under heavy activity, but it was rare.

Try finding 3DMark scores from similar systems and see if yours are somewhat close. If yours are substantially lower, then there is definitely a problem; it could be drivers or even something like the motherboard. Also, what does the CPU-Z link width tab say?
 
Thanks a lot, I'll check it out as soon as I get home.
 
My system scores 18233 in 3DMark03 and 8886 in 3DMark05, CPU-Z reports the link width as 16x, and PCMark04 is only 5061 with the system running 288x9.

Last week I installed a few newer titles such as Supreme Commander, Maelstrom, and Battlefield 2142; I'm afraid I won't be able to enjoy the eye candy in these games on my system.

I was about to buy an 8800GTS 320 MB, but then I began to suspect the bottleneck is outside my 7800GTX, because lowering the eye-candy settings a step or two didn't help much. Please help.
 
nösferatu;1031107047 said:
Wow... only one reply so far?
Is there something wrong with my post, or does simply nobody care about my problem?

Sorry, I'm a noob anyway.

ROFL, that was funny.

Anyway, your system is fine. The rig in my signature gets ~8600 in 3DMark05 and ~17000 in 3DMark03 (excuse the rounding, I don't remember the exact scores) with my CPU overclocked to 2600 MHz. It's dual core, but only one core is used by 3DMark03/05 anyway.

Don't worry about it. If I remember correctly, setting one of the options in Titan Quest to medium improves performance drastically (I think it was shadows, not sure).
 
Hm... I think it was "texture detail". Set to medium it does improve things drastically, but I personally feel that the look of the rocks, grass, etc. also changes drastically.

So... we're assuming it's the 7800GTX, right?
It feels kind of lame to expect more from the 7800GTX now.
 
I doubt there is a bottleneck. I recently threw my friend's Venice 3000+ clocked at 2.6 into my 7900GT SLI setup running 730/850. I don't remember the exact benchmark differences, but they weren't significant enough for me to notice a difference between his chip and my Opty 170 in my system; I probably would have remembered a significant difference. The number 8600 for 3DMark06 sticks in my mind for some reason, though that may be a false memory, I can't recall exactly. I do know that most games played nearly the same with his chip in as with my Opty. I only played three games: NFS: Most Wanted, Quake 4, and CoD 2. I really couldn't see a large difference between the Opty at 3.0 and his Venice at 2.6. Resolutions were set to 1680x1050 in the games that allowed it.

His chip scored 5800 in 3DMark06 (CPU score 975) when one of my 7900GTs was put in his motherboard (TForce 6100), with the CPU at 2.5 GHz and the GPU at 590/900.
 
No bottlenecks with my setup either (see sig). I am running at 2.65 GHz and the 7900GT gives up before the CPU does. The only cases with longer loading times are due to the 256 MB of RAM on the graphics card.
 
Isn't texture detail somewhat related to system RAM in some games? I remember FEAR's highest texture settings required something like 7xx MB of system RAM. That's the game that finally made me swap my TCCD for 2 GB of UCCC.

I'm not certain whether Titan Quest's high textures take up a lot of space; that would definitely be something to look up if that's the option that cripples you. I was a bit disappointed going from a volt-modded 7900GT KO to a 7800GTX, as it performed much worse, but it was cheap and I got my money's worth out of it.
 
You might want to try going into your video card settings and disabling vertical sync.

Even if you turn vertical sync off in your games, the GeForce settings menu overrides it, so always check that. I had kind of the same problem you did: I was playing Oblivion and it was syncing the frames at 30 even though I had turned vertical sync off in the game, on my 3 GHz system with a 7800. Now that I have it turned off in the display settings menu, it fluctuates from 40 to 125 with no visible tearing.

Just my two cents, man. Hope that helps with your issue.
 
I was running SLI 7800GTX KOs (factory OC'd, 1.3 GHz I believe). They were decent cards for their time, but when you turn up the eye candy they fall on their face, especially at 1680x1050. I upgraded to an 8800GTS 640 MB and now it's a night-and-day difference. Eye candy in newer games is no problem; it runs most games very smoothly.

P.S. Your processor is definitely not bottlenecking a single 7800GTX.
 
Okay, try this out. :D

First set Texture Filtering Quality to High Performance, then click Apply; if you don't, it will change other settings when you change it later.

Every setting from here on out, you must click Apply after you change it.
Next, change these three settings; they matter the most:

Turn VSync off. Set Negative LOD Bias to Clamp. Turn Triple Buffering off.

Now change these settings:

Antialiasing Mode: Override any application setting
Antialiasing Gamma Correction: On
Antialiasing Transparency: Multisampling
Force Mipmaps: Trilinear

Finally, set these two last:

Anisotropic Filtering: 2x up to 16x. Antialiasing: 2x up to 16xQ.

I usually raise the last two as high as they go and adjust my in-game settings for best results. These settings are for 1024x768, and you should use the highest refresh rate (Hz) possible. If you like high resolutions like 1650x1024, even that card will struggle. If you use 1024x768 and the highest Hz, you'll game way better than at a lower Hz; Hz is speed! You will even be able to max antialiasing and anisotropic filtering in most games and use medium in the more demanding titles. Hope this helps.
Have fun. :cool:
 
First, I would like to thank you all for your kindness.

OK, last question: what if I upgrade to an 8800GTS 320? Do I need to upgrade my processor first? My max resolution is 1280x1024. Thanks.
 
OK, I'm running a 2.0 @ 2.4 with DDR2-6400 @ 960 and an 8800GTS 320, and it screams... look in my sig for the video clock! With your Venice it will run for now; with the 8800 you will notice a big difference even with a limited CPU. Just crank the card settings all the way up and you'll be happy you did. :D

P.S. Those menu tweaks in my post above will work for an 8800GTS; just set antialiasing to 16xQ and anisotropic filtering to 16x and you're rocking.
 
I would go with a processor upgrade first. I noticed a significant increase in framerates and general performance when upgrading from my X2 3800+ (939) to my E6420. Both were running a 7950GT.

Just IMHO, of course. YMMV. ;)
 
OK, I'm running a 2.0 @ 2.4 with DDR2-6400 @ 960 and an 8800GTS 320, and it screams... look in my sig for the video clock! With your Venice it will run for now; with the 8800 you will notice a big difference even with a limited CPU. Just crank the card settings all the way up and you'll be happy you did. :D

P.S. Those menu tweaks in my post above will work for an 8800GTS; just set antialiasing to 16xQ and anisotropic filtering to 16x and you're rocking.


I know; I've seen that nothing runs at 16x on both AA and AF with my 7800GTX (or even the 7900GT). I also already knew about the texture quality setting; yes, it improves things a bit, as the 3DMark score shows. Normally I don't trust 3DMark scores, but when you want to OC your card, it should be your first choice for an indication of how well the card OC turned out, right?

@practal
Yes, I do know that. Even with the 7800GTX @ 513/1365, gameplay as well as the 3DMark scores are so terrible that I miss my 7900GT very much. :(
 
Yup, I know that feeling! I'll be putting in a 5000+ X2 (2.6 Windsor) or a 4800+ X2 (2.5 Brisbane) once I find out which is more clockable with the DD stepping (which is G1), and then I'll be rocking. My friend has the chips; he said I could have the faster of the two for building his PC, plus he lost a bet with me. Funny thing is I don't like betting, but he insisted, LOL. :D
 
I would go with a processor upgrade first. I noticed a significant increase in framerates and general performance when upgrading from my X2 3800+ (939) to my E6420. Both were running a 7950GT.

Just IMHO, of course. YMMV. ;)

Now I have two choices, which is very frustrating, and I'm sure you know why: for maybe up to a year and a half to come, I'll only have the budget to upgrade to either an 8800GTS 320 OR a C2D E4300 + DS3 + 2 GB of DDR2-800. It's a big OR. Of course the first option is much cheaper, hm...

And to the post above me: if I buy 2 GB of RAM, I'll end up with 2 GB of DDR2 and a C2D. AM2 will not be on my list until AMD comes up with something better; for now, C2D simply offers massive performance over AM2, which is only slightly better than what I have (not clock for clock, probably only single vs. dual core).
 
Well, I like the 8800, and you are on a budget. I would spend the least I could, and if that's what you need to do, just do it; your computer won't ever pay your money back. :D But you only have DDR400 (or even 333), not DDR2-800, and that makes a huge difference whether it's AMD or Intel. Your limiting factor is your old memory and motherboard, seeing as your CPU at 2.6 GHz is more than enough to run an 8800, let alone your 7800GTX.

Getting the C2D will definitely speed you up and will probably double or triple your fps in some or most games on the 7800GTX. Then down the road you can always find an E or Q 2.4 on sale and grab a kick-butt video card before or around Christmas time, when the really good stuff will be here video-card-wise.

So in other words, if you can afford it, upgrade your whole system with the C2D and the 2 GB of memory you picked, and use your GTX for now. Just get the best motherboard you can afford for the C2D so you can upgrade to a faster CPU down the road, or overclock the E4400 to say 2.4 GHz or something; I hear they overclock well! ;)
 
Okay, thanks for the opinions.
My RAM runs at 185 MHz, but from the SiSoft memory bandwidth bench I highly doubt my system is suffering from low memory bandwidth (AFAIR, a memory score of a little below 6K, say 57xx). AFAIR again, on AM2 the performance delta between DDR and DDR2 was only slight, CMIIW.
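
Rough peak-bandwidth math for that 185 MHz reading (dual channel is my assumption), which lines up with the ~5.7 GB/s SiSoft figure:

mem_clock_mhz = 185          # DDR clock reported by CPU-Z
bytes_per_channel = 8        # 64-bit channel
channels = 2                 # dual channel (assumed)
transfers_per_clock = 2      # DDR

peak_gb_s = mem_clock_mhz * 1e6 * bytes_per_channel * channels * transfers_per_clock / 1e9
print(round(peak_gb_s, 2))   # ~5.92 GB/s theoretical vs ~5.7 GB/s measured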
 