AGP > 6800GT to X1950 Pro, Expected too Much?

jedihobbit

Gawd
Joined
Nov 3, 2005
Messages
963
Trying to stretch a friend’s system (NF7-S w/ Sempron 3000+ @ 2.4) by upgrading his aging PNY 6800GT with an AGP version of the X1950 Pro. Originally I thought the extra ½” in length wouldn’t be an issue, and the card’s length itself wasn’t, but read on...

Here is what we got:

[Images: x1950ProBoxSM.jpg, x1950Prow-BoxSM.jpg, x1950SM.jpg]


As usual for me, the “simple” in “simple upgrade” never seems to be there! I guess it is the extra circuitry for AGP, but this card has a whole lot of BS at the tail end that made the fit a real pain.

First off, the original IDE cable for the optical drives got knocked out of its socket because it ended up “too short” with the new card in place.

[Images: IDE-GPUConflict1SM.jpg, IDE-GPUConflict2SM.jpg]


Also, something still didn’t “feel right,” so I looked at the topside: a couple of small “thingies” on the card had unlatched the last memory stick. You have to do a little twist and rotate when seating the card to avoid snagging the latch.

[Images: GPUChip-MemLatchConflict1SM.jpg, GPUChip-MemLatchConflict2SM.jpg]


Luckily I had a longer IDE cable in my Celtic Spirit parts that I could raid.

[Images: IDE-GPUCableConflict1SM.jpg, NewIDECablearoundGPU.jpg]


It’s in the system and running, on Catalyst 7.2 I believe, and I’m wondering if maybe I expected too much? It could be that what I’m seeing is the CPU’s limit. Here is my only reference: PNY 6800GT @ 404/1132 vs. the stock VisionTek X1950 Pro...

3DMark03: 12,221 vs. 13,087
3DMark05: 5,695 vs. 7,545
3DMark06: 3,251 vs. 4,332
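
Working those numbers out as percentage gains (a quick scratch calculation; a minimal Python sketch using nothing but the scores above):

Code:
# Percentage gains from the 3DMark scores above (6800GT vs. X1950 Pro).
scores = {
    "3DMark03": (12221, 13087),
    "3DMark05": (5695, 7545),
    "3DMark06": (3251, 4332),
}

for bench, (old, new) in scores.items():
    print(f"{bench}: {(new - old) / old * 100:+.1f}%")

# Prints roughly: 3DMark03 +7.1%, 3DMark05 +32.5%, 3DMark06 +33.3%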

Don’t know, is that a respectable increase for an almost $200.00 outlay? I just don’t have a feel for it, so please “enlighten” me.
 
I don't think you're going to see a big gain in just basic 3DMark numbers. I would imagine the difference between the two cards would be your ability to crank up the AA and AF. If you want to see if it's worth it, I would have your friend take a game he played often on his old card, crank the details up above what they were, and see how well it runs. If he is not pleasantly surprised there, then it wasn't worth it.
 
First, I wouldn't go just by 3DMock.

Secondly, it's likely to be a little CPU limited. I made the same upgrade on one of my PCs with an A64 @ 2.6GHz, and the difference was almost night and day. I used the HIS Turbo, but a slight increase in clocks shouldn't make a significant difference.
 
it's likely to be a little CPU limited.

Might be, but it might not be. Is that the 512k cache 3000+ Sempron? I had one and they are great chips. Are you overclocking the CPU at all? You should be, on that motherboard/RAM (G.Skill, I saw it ;) ).
 
First, I wouldn't go just by 3DMock.

Secondly, it's likely to be a little CPU limited. I made the same upgrade on one of my PCs with an A64 @ 2.6GHz, and the difference was almost night and day. I used the HIS Turbo, but a slight increase in clocks shouldn't make a significant difference.

Made an oops there, as the Sempron is OC'ed to 2.4. :eek:
 
First, your problems with installing the hardware: that's the stupid ABIT layout for ya.

Second, the weak performance: that Sempron CPU is severely bottlenecking that card. You should be able to go over 10,000 points in 3DMark05 and over 5,000 in 3DMark06 with the current generation of CPUs.

However, in games you should be able to enable a lot of AA and AF at very high detail without losing much in FPS.
 
Just got the same card, the VisionTek X1950 Pro AGP, two weeks ago to upgrade a 6800GT in an Athlon 64 3400+ system that I'm running Vista Ultimate on. I did change out the memory, as for some reason the Vista Experience memory score was low; now it’s fine.

The reason I’m running Vista on such old hardware is that I want Vista drivers and hardware to mature a little bit before I build a high end Vista rig, and I want to play Halo 2 in about six weeks, so I’m thinking that this system should provide enough juice for that till I upgrade.
 
First, your problems with installing the hardware: that's the stupid ABIT layout for ya.

Second, the weak performance: that Sempron CPU is severely bottlenecking that card. You should be able to go over 10,000 points in 3DMark05 and over 5,000 in 3DMark06 with the current generation of CPUs.

However, in games you should be able to enable a lot of AA and AF at very high detail without losing much in FPS.

In discussion with the system owner, it was felt the GPU upgrade would gain him more in gameplay, as he has no desire to leave Socket A yet. So my next challenge is to figure out how to get him more CPU-wise.
 
Look at it this way--in 3DMark06 his score rose 33%. The 3DMark06 score is fairly heavily influenced by CPU performance, so considering that, the score increase is even more impressive. Would a 33% boost in game performance make a worthwhile difference in your friend's user experience? I'd think so. As others have said, crank up the actual games and see if there's a "wow" factor. If so, success.

PS I don't think there's going to be a better Socket A CPU out there than the one he already has, especially at the given OC.
 
Wow. Socket A mobo. Boy, those things ARE getting long in the tooth.
Doing a bit of digging, though, the '3000+' IS the fastest chip you can put in it. But, seriously, PRODUCTION of that chip was discontinued nearly two years ago. Definitely upgrade time this year!

MrMike has a good point, though. The video card definitely has more power than the CPU knows what to do with...so it makes sense to put more load on the video card. Increase FSAA and AF settings higher than he could before. There's no CPU hit for doing this, it's all video card work, and you can usually find a spot where the graphics quality is high enough that the video card is keeping pace with the CPU.

(Keep in mind, though, that given a CPU that old...it's running at MUCH lower FPS than it otherwise would. "Oblivion" will barely hit playable FPS with that CPU at the quality levels that video card can 'handle'.)
 
Long in the tooth isn't the word. I'm stuck with one for now too. Hey jedi, now I know where I can find you... LOL
 
In discussion with the system owner, it was felt the GPU upgrade would gain him more in gameplay, as he has no desire to leave Socket A yet. So my next challenge is to figure out how to get him more CPU-wise.

Quite true; the next step would mean changing at least the motherboard and CPU. Going with the latest generation of CPUs, you would be looking at adding new memory and a PCIe graphics card, and most likely a new PSU as well.
Not much more you can do CPU-wise with that system without getting a new mobo and CPU.
 
Wow. Socket A mobo. Boy, those things ARE getting long in the tooth.
Doing a bit of digging, though, the '3000+' IS the fastest chip you can put in it. But, seriously, PRODUCTION of that chip was discontinued nearly two years ago. Definitely upgrade time this year!

MrMike has a good point, though. The video card definitely has more power than the CPU knows what to do with...so it makes sense to put more load on the video card. Increase FSAA and AF settings higher than he could before. There's no CPU hit for doing this, it's all video card work, and you can usually find a spot where the graphics quality is high enough that the video card is keeping pace with the CPU.

(Keep in mind, though, that given a CPU that old...it's running at MUCH lower FPS than it otherwise would. "Oblivion" will barely hit playable FPS with that CPU at the quality levels that video card can 'handle'.)

Actually, a 3200+ Athlon XP would be the best chip you can get (stock-wise) for Socket A. I would search around the FS/FT section here and on eBay for a 2500+ XP-M, though. I am pretty sure the 3000+ Sempy in it now is the 512k cache Barton core, as I had one, but you'd better check to make sure before planning an upgrade. If it already is the Barton core, then I think the system is stuck till the owner decides to get a new CPU/mobo.

What is the difference between the Sempy Barton core and the XP/XP-M Barton core?
 
jbmx4life, I'm 80-20 sure it is Barton, and I know I don't know the difference. :D

The general responses I've received to this question all point in the same direction: this card with an "old" CPU isn't about how fast, but how much!

Besides, it isn't even my machine, and the owner has to play his games to see how he likes it. ;)
 
D/L the latest CPU-Z and check it out. That is the best way, unless you want to pull the chip out and jot down the numbers... :rolleyes:
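
If you'd rather not install anything first, here's a rough first check (a minimal Python sketch, assuming Python is on the box and the stock WMIC tool that ships with XP Pro and later; it only shows the marketing name, so CPU-Z is still the real answer for core and cache):

Code:
# Print the CPU name string Windows reports via the stock WMIC tool.
# This only gives the marketing name, e.g. "AMD Sempron(tm) 3000+";
# core/stepping and L2 cache size still need CPU-Z (or the OPN code
# stamped on the chip itself).
import subprocess

name = subprocess.check_output(["wmic", "cpu", "get", "name"]).decode()
print(name.strip())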
 
Shader-heavy games are where you'll see the biggest difference with that card vs. the old one.
 