5200 vs gf3 - quick and dirty question

SlimShady

[H]ard|Gawd
Joined
Aug 2, 2000
Messages
1,143
managed to pick up a cheap geforce fx 5200 128MB (250/500MHz), and i'm thinking about replacing my vanilla geforce 3 (leadtek). is it worth it, or should i return it? will i gain anything from this upgrade? thanks!
 
I'm pretty sure the 5200 would be faster, especially if you overclocked it. Do some before and after benchmarks though, just to be certain. I'd expect at least a 10-15% increase in frame rate, more if you overclock, plus it has all the fancy features and what-not.
 
Above poster is wrong. Any gf3 is faster than any 5200. I own/benched with both and the gf3 clearly outperforms the 5200. The 5200 series is crap and will always be crap. There is no sugar-coating thick enough to hide that fact. Keep on using your gf3, you will be happier with it.
 
depending on what you're doing the gf3 is faster. the gf3 is about equally fast as the 5200 in polygon rendering, but kicks the 5200's ass in shader performance for dx8, dx7, sm 1.0 or sm 1.4. can't really compare dx9 since the gf3 doesn't have it, but the 5200 sucked at dx9 anyways :p

don't go by the 10.4 mt fillrates of the 5200, it will fool you! it's crap beyond belief!
 
Same here, I own both a 5200 and a geforce 3. I decided one day to replace the geforce 3 in my secondary system with a 5200, and the 3dmark scores dropped, plus the games were actually getting fewer frames. The gf3 beats the 5200 by over a thousand marks, and mine is the regular geforce 3, not the ti500.
 
5200s suck, period. I had one, a 64 bitter, as a loaner, and it sucked hardcore. I doubt the 128 bit version would be much better. Yes, it has basic DX9 support, but it can't be used smoothly anyways. :)

I personally would get the GF3, or perhaps a cheap GF4. Great DX8 cards.
 
my money's on the gf3, and it doesn't matter what ram the 5200 has. btw, tom is a -ing sellout and his numbers are crap too, for the record

BURN IN HELL TOM, BURN

but anyways, the fx5 series was a mistake; until you hit about the fx5700 you are dealing with crap in silicon form. gf3's are very tough little workers, and you will get a lot out of a good gf3, the ti200 or ti500 in particular.
 
You won't really gain anything. DX9 on the 5200 isn't fast enough to be usable and both cards run DX8 (5200 has PS1.0-1.4, GF3 has PS1.0-1.1). If you use anisotropic filtering and/or FSAA, the 5200 is way faster.

I'd pick the 5200 over the GF3.
 
yep, the majority seems to say it's crap. this thing will be heading back to the store. thanks!
 
pxc said:
You won't really gain anything. DX9 on the 5200 isn't fast enough to be usable and both cards run DX8 (5200 has PS1.0-1.4, GF3 has PS1.0-1.1). If you use anisotropic filtering and/or FSAA, the 5200 is way faster.

I'd pick the 5200 over the GF3.

Mostly true - there are SOME games where the DX9 effects on the 5200 can be used (Doom 3, for example), and in those the GF3 is going to come up lacking.

I dunno WHAT anyone is smoking who claims a GF3 is faster in ANYTHING, though. Maybe they just don't know how to configure their PCs? Anyway...

A brief search on Futuremark's ORB shows right away the difference:
FX5200: 2088
GF3: 1514

Searching on various other review sites (FiringSquad, XBit, etc) shows similar results, although you have to compare across multiple reviews. The 5200 (128-bit version) IS faster than the GF3, period, end of story, in all cases.

Now, the 64-bit 5200....well....is a LOT slower. Usually, cutting part of something in a computer in half doesn't exactly halve performance...but that's pretty much what is happening here. The 64-bit 5200 may well be slower than a GF3 in every area, but the 'regular' 5200 is NOT.
 
i was trying to determine if this was the 128 bit version, as nothing i have seen says either way, and i couldn't tell through 3dmark either. what's the best way to determine this?
 
RivaTuner will tell you.

And, yeah, that's the ONLY way. Most manufacturer web pages have 'generic' fact sheets for the 5200 series in general (or, worse, like MSI does, generic fact sheets for the ENTIRE FX line. Yeah, like the FX5200 is a .13u part with 4x2 architecture. Uhhhh....NO.)

Get the latest version of RivaTuner; on startup, in the 'Target Adapter' section, the second box will tell you what kind of card you have (the core, revision, and memory bus width).
 
I would be more convinced by 3d2001 scores myself.

I've played with both on customer's systems, the Geforce 3 is a tad slower than a Geforce 4 Ti4200. The 5200 is on par with a Radeon 9000.

More often than not, it depends on the game, but the 5200 just doesn't feel as smooth. I did not try AA or AF tho.
 
0ldman said:
I would be more convinced by 3d2001 scores myself.

I've played with both on customer's systems, the Geforce 3 is a tad slower than a Geforce 4 Ti4200. The 5200 is on par with a Radeon 9000.

More often than not, it depends on the game, but the 5200 just doesn't feel as smooth. I did not try AA or AF tho.

Why would you be 'more convinced' by a test that is primarily CPU-bound instead of graphics-card limited? Both the GF3 and FX5200 score well over 10k in it with 'modern' systems!

In any case, check out the 3dMark03 compare links I provided - it breaks it down by game test. Remember, even in 3dMark03, you aren't looking at STRICTLY DX9 tests. Game test 1 is primarily DX7, game tests 2 and 3 are DX8, and only game test 4 is DX9. The FX5200 whips the GF3 in all 4 tests on comparable systems!

As an aside, the FX5200 DOES tend to be a bit limited in its memory bus, which means you really can't do high-res gaming with it. Keep the resolution low, though (800x600 or lower), and the substantially more powerful core than the GF3 will allow it to pull ahead quite a bit.
 
dderidex said:
Why would you be 'more convinced' by a test that is primarily CPU-bound instead of graphics-card limited? Both the GF3 and FX5200 score well over 10k in it with 'modern' systems!

In any case, check out the 3dMark03 compare links I provided - it breaks it down by game test. Remember, even in 3dMark03, you aren't looking at STRICTLY DX9 tests. Game test 1 is primarily DX7, game tests 2 and 3 are DX8, and only game test 4 is DX9. The FX5200 whips the GF3 in all 4 tests on comparable systems!

As an aside, the FX5200 DOES tend to be a bit limited in its memory bus, which means you really can't do high-res gaming with it. Keep the resolution low, though (800x600 or lower), and the substantially more powerful core than the GF3 will allow it to pull ahead quite a bit.

I'm sorry... You are very incorrect. I had a PNY 5200 that scored lower than my GF2 MX400 in 3dMark, and in Aquamark. Not only that, my friend's GF2 GTS handed that same 5200 its ass in the same benches as well. My Pixelview 5200 (faster RAM, larger bus, better OC) could almost keep up with the GTS, and was a little faster than the MX400.

I have since fried the MX400, the PNY 5200 killed itself and PNY is an asshole conservatory when it comes to RMA, and my friend's GTS is, well, at his house. :p

When it comes down to it, the GF2 series is designed for 2 or 3 things, which it does very well. The FX5200 is designed to do a LOT of things with about the same power. Not only does it have too few resources for what it is designed to do, but the horrid 128 bit bus just kills it (my Pixelview has a 256 bit bus, oddly enough). Comparing the 5200 with anything newer or older is just pointless; it's pathetic. The 128 bit 5200s can't even keep up with an MX400, and the 256 bit versions can almost keep up with a GTS. I had benches to prove it, I wish that I had SOMETHING left from that... Well, I might, let me dig around on my computers.
 
Going by numbers:

The gf3 is on a 4x2 architecture while the 5200 is on 4x1.

geforce 3:        800 Mpixels  1600 Mtexels   7.4 GB/s memory bandwidth
fx 5200 128-bit: 1000 Mpixels  1000 Mtexels   6.4 GB/s memory bandwidth
fx 5200 64-bit:  1000 Mpixels  1000 Mtexels   3.2 GB/s memory bandwidth
fx 5200 Ultra:   1300 Mpixels  1300 Mtexels  10.4 GB/s memory bandwidth


With the exception of the 5200 64 bit figures (which I got by just halving the 128 bit version's memory bandwidth), all the data came from The Tech Report.
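For anyone who wants to sanity-check those figures, here's a quick Python sketch of the arithmetic behind them (my own back-of-envelope using the commonly quoted stock clocks, not pulled from any vendor sheet):

```python
# Fill rate = pipelines * core clock; texel rate multiplies in TMUs per pipe;
# bandwidth = effective (DDR) memory clock * bus width in bytes.

def fill_rates(core_mhz, pipes, tmus_per_pipe):
    pixel = core_mhz * pipes              # Mpixels/s
    texel = pixel * tmus_per_pipe         # Mtexels/s
    return pixel, texel

def bandwidth_gbs(effective_mem_mhz, bus_bits):
    return effective_mem_mhz * 1e6 * (bus_bits // 8) / 1e9   # GB/s

# GeForce 3: 200MHz core, 4 pipes x 2 TMUs, 460MHz effective DDR, 128-bit bus
print(fill_rates(200, 4, 2), bandwidth_gbs(460, 128))   # (800, 1600) ~7.4
# FX 5200: 250MHz core, 4x1, 400MHz effective, 128-bit
print(fill_rates(250, 4, 1), bandwidth_gbs(400, 128))   # (1000, 1000) 6.4
# FX 5200 64-bit: same chip, half the bus
print(bandwidth_gbs(400, 64))                           # 3.2
# FX 5200 Ultra: 325MHz core, 650MHz effective, 128-bit
print(fill_rates(325, 4, 1), bandwidth_gbs(650, 128))   # (1300, 1300) 10.4
```

The takeaway from the raw numbers alone: the plain 5200 trades the GF3's second TMU and about 1 GB/s of bandwidth for a slightly higher pixel rate, which is why the paper fillrate flatters it.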

Another factor to consider is that the 5200 doesn't have any color or Z compression:

No color or Z compression - Unlike the rest of the NV3X line, NV34 can't do color or Z compression. The lack of color compression should hamper the chip's performance primarily with antialiasing enabled, but the lack of Z compression will hurt across the board. Without advanced lossless compression schemes, NV34 doesn't make as efficient use of the bandwidth it has available, which reduces the chip's overall effective fill rate (or pixel-pushing power).

While the geforce 3 does.

Z compression — Like the Radeon, the GeForce3 is capable of compressing and decompressing Z data on the fly. This info, which describes the depth of each pixel (its position on the Z axis), chews up a lot of memory bandwidth. NVIDIA claims a peak compression ratio of 4:1 on Z data, and that compression routine is "lossless," so visual fidelity isn't compromised.

the geforce 3 also has z occlusion culling, which I don't believe the 5200 has.

http://www.techreport.com/reviews/2003q2/geforcefx-5200/vil.gif
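To put a rough number on what the missing Z compression costs, here's a toy Python calculation (entirely my own assumptions: 32-bit Z values, one Z read plus one Z write per pixel, and the claimed 4:1 peak ratio; real scenes won't hit these peaks):

```python
def z_traffic_gbs(mpixels_per_s, z_bytes=4, compression=1.0):
    # one Z read + one Z write per pixel, shrunk by the compression ratio
    return mpixels_per_s * 1e6 * z_bytes * 2 / compression / 1e9

# GeForce 3 at its 800 Mpixel/s peak, with and without 4:1 Z compression
print(z_traffic_gbs(800))                   # 6.4 GB/s of raw Z traffic
print(z_traffic_gbs(800, compression=4))    # 1.6 GB/s at the 4:1 peak
# The 5200 always pays the uncompressed price at its 1000 Mpixel/s peak:
print(z_traffic_gbs(1000))                  # 8.0 GB/s, more than its 6.4 GB/s bus
```

So at peak fill rate the 5200's Z traffic alone would exceed its entire memory bus, while the GF3's compression leaves most of its 7.4 GB/s free for color and textures. It's only a cartoon of the real behavior, but it lines up with the benchmarks.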



in short, I would like to retract my statement of the 5200 being faster, and say that the gf3 > 5200

The geforce 3 owns by having more memory bandwidth and z compression :) The only reason a 5200 scores higher on 3d mark 03 is because it can complete the direct x 9 tests.
 
archevilangel said:
The geforce 3 owns by having more memory bandwidth and z compression :) The only reason a 5200 scores higher on 3d mark 03 is because it can complete the direct x 9 tests.

:eek: You just now figured that out?!

Do you have ANY idea how pathetic a 5200 will score in '03? IIRC, both of mine scored under 600 points... I don't think I could even get a score from the MX400 in '03, though. :p
 
知さん said:
:eek: You just now figured that out?!

Do you have ANY idea how pathetic a 5200 will score in '03? IIRC, both of mine scored under 600 points... I don't think I could even get a score from the MX400 in '03, though. :p

That's pretty amazing, given that I just linked a 5200 score above that scored over 2000 in 3dmark03. Hell, *I'm* scoring over 1400 in 3dMark03, and that's on a 64-bit version of the 5200! Pardon me while I stand in awe of your PC optimizing capabilities. :rolleyes:

Like I said, some of the people in this thread obviously have no idea how to configure their systems.

archevilangel said:
The geforce 3 owns by having more memory bandwidth and z compression The only reason a 5200 scores higher on 3d mark 03 is because it can complete the direct x 9 tests.

Hey, here's an idea, how about you try again and READ THE THREAD.

The 5200 outscores the GF3 because, as I pointed out above, it beats the GF3 in *every* *single* *test*. Not just because it can complete the DX9 test while the GF3 can't (although that certainly plays a part) - if you look at the scores I linked, you'll see that the FX5200 is faster in EVERY GAME TEST.

Here is another FX5200 that scores 2167 in 3dMark03. You can see the details on this one for the synthetic tests, and how it stacks up to the GF3.

Does it lose in multi-texturing? Well, yeah, of course, it's 4x1 instead of 4x2. But, it has a MUCH more powerful core, and so wins everything else.
 
dderidex said:
That's pretty amazing, given that I just linked a 5200 score above that scored over 2000 in 3dmark03. Hell, *I'm* scoring over 1400 in 3dMark03, and that's on a 64-bit version of the 5200! Pardon me while I stand in awe of your PC optimizing capabilities. :rolleyes:

Like I said, some of the people in this thread obviously have no idea how to configure their systems.



Hey, here's an idea, how about you try again and READ THE THREAD.

The 5200 outscores the GF3 because, as I pointed out above, it beats the GF3 in *every* *single* *test*. Not just because it can complete the DX9 test while the GF3 can't (although that certainly plays a part) - if you look at the scores I linked, you'll see that the FX5200 is faster in EVERY GAME TEST.

Here is another FX5200 that scores 2167 in 3dMark03. You can see the details on this one for the synthetic tests, and how it stacks up to the GF3.

Does it lose in multi-texturing? Well, yeah, of course, it's 4x1 instead of 4x2. But, it has a MUCH more powerful core, and so wins everything else.

and 3dmark2k3 is a useless benchmark...no need to compare a DX9 card with a DX8 card, when the DX8 card WILL NOT BE ABLE to finish the DX9 part of the test (and even calling the fx5200 a dx9 card is stretching the truth)...so comparing the 2 cards with 3dmark2k3 is useless....use 3dmark2k1, then compare.

DASHlT
 
http://service.futuremark.com/compare?2k1=7539841
show me a 5200 that outscores that please.


http://service.futuremark.com/compare?2k3=2036168
Crappy, but it's a ti200, the lowest of the gf3's. It can't even complete the last test. I never tried forcing ps1.1 on it either, so i have no idea if that makes a difference.

Rest my case, the GF3 is a MUCH BETTER card than ANY 5200. 5200 = shit.
gf3 = kickass for its time and better than the 5200.


http://service.futuremark.com/compare?2k3=3142518
If you compare the scores of my ti200 vs this 64 bit 5200, you will see why the test means jack shit. My card outscores that 5200 in every test that it can complete, but the 5200 gets a higher total since it can "run" the last test. It's bullshit that a card gets a higher score just because it can run a test. Hell, if you are getting 2k in 3dmark03, you are beating most gf4's. Are you telling me that a 5200 is better than them too?

And to the poster below: you are a fool for saying that a 6800 wouldn't score much better than 11k at that speed. It would actually be around 18k-22k at that speed. With the older cards, it isn't as cpu limited as you think. It gives you system performance, and that translates into how well your games run. The 5200 is crap, and anyone who thinks otherwise either hasn't used one or is trying to start a fight over it. I have used both, and the gf3 actually plays games well. I have never seen a 5200 play a game well, imo.
 
DASHlT said:
and 3dmark2k3 is a useless benchmark...no need to compare a DX9 card with a DX8 card, when the DX8 card WILL NOT BE ABLE to finish the DX9 part of the test (and even calling the fx5200 a dx9 card is stretching the truth)...so comparing the 2 cards with 3dmark2k3 is useless....use 3dmark2k1, then compare.

DASHlT

Look, really, can you READ?

The FX5200 *beats the GF3* in the non-DX9 tests, too! It wins in the DX7 test, it wins in both DX8 tests.

Have you actually bothered to LOOK around the web at benchmark results? The FX5200 is faster in UT2k3, it's faster in Quake 3, etc, etc. The FX5200 *is* faster than the GF3, end of story, and if you couldn't get it any faster in YOUR system, I'd seriously suggest you re-examine your configuration!

The FX5200 has its fair share of problems, it certainly doesn't stack up well against 'modern' cards, but it does JUST FINE against the rather ancient GF3.

botreaper said:
show me a 5200 that outscores that please.
Why would I even try? 3dMark2001 is completely CPU limited at this level. The graphics card has virtually nothing to do with the score at that point. Hell, 6800s hardly get higher scores than that at that CPU speed!

(EDIT: In any case, here is an FX5200 that outscores it just fine.)
 
OMG...you found a p4extreme edition system that ran a 5200. How many more mhz is that running over my old rig? i believe it was somewhere around 1.3 ghz more. I would bet that the gf3 would outscore it if it had a p4ee running it.


And i did a quick search and found that the highest gf3 score for 3dmark01 was 15904, while the highest 5200U score was 14238. Tell me why the gf3 is getting a higher score when you say it is an inferior card? I'll tell you: the gf3 is a much better card. It outperforms the 5200 in everything i use my card for. I play FPS and RPG games on my computer.
 
botreaper10 said:
OMG...you found a p4extreme edition system that ran a 5200. How many more mhz is that running over my old rig? i believe it was somewhere around 1.3 ghz more. I would bet that the gf3 would outscore it if it had a p4ee running it.


And i did a quick little search and found that the highest gf3 score for 3dmark01 was 15904 while the highest 5200U score was 14238. Tell me why the gf3 is getting a higher score when you say it is an inferior card? I'll tell you. The gf3 is a much better card. It outperforms the 5200 in every way i use my card for. I play FPS and RPG games on my computer.

its because it can run the DX9 test on 3dmark2k3, which the gf3 cannot run....plus the system setup is totally different...not a good comparison or argument on his part. We all know the gf3 is faster than an fx5200...
 
DASHlT said:
its because it can run the DX9 test on 3dmark2k3, which the gf3 cannot run....plus the system setup is totally different...not a good comparison or argument on his part. We all know the gf3 is faster than an fx5200...

Okay, look, for the last time, IT DOES NOT MATTER IF THE 5200 IS DX9 OR NOT. It beats the GF3 in tests that have NOTHING TO DO WITH DX9 AT ALL!!!

Hell, take a look here. Digit-Life did a rundown of every card released from 1999-2003. Notice that the GF3 Ti200 loses in *every* *single* *test* to the FX5200? They test in Code Creatures, Serious Sam: Second Encounter, Return to Castle Wolfenstein, UT2k3, Unreal II, RightMark3d, and Splinter Cell.

Granted, the Ti200 is usually *right* under the FX5200 in these tests, but, then, they used the 53.03 nVidia drivers rather than the newer WHQL drivers (that do continue to offer performance boosts for the GF-FX line).

botreaper10 said:
OMG...you found a p4extreme edition system that ran a 5200. How many more mhz is that running over my old rig?.....Tell me why the gf3 is getting a higher score when you say it is an inferior card?

Well, you answered your own question if you are smart enough to see it. 3dMark2001 is useless on modern graphics cards, because the test is ENTIRELY CPU bound at the moment. 3dMark03 is approaching that same place for the same reason, so it's cool they are about to put out a new version.
 
dderidex said:
Okay, look, for the last time, IT DOES NOT MATTER IF THE 5200 IS DX9 OR NOT. It beats the GF3 in tests that have NOTHING TO DO WITH DX9 AT ALL!!!

Hell, take a look here. Digit-Life did a rundown of every card released from 1999-2003. Notice that the GF3 Ti200 loses in *every* *single* *test* to the FX5200? They test in Code Creatures, Serious Sam: Second Encounter, Return to Castle Wolfenstein, UT2k3, Unreal II, RightMark3d, and Splinter Cell.

Granted, the Ti200 is usually *right* under the FX5200 in these tests, but, then, they used the 53.03 nVidia drivers rather than the newer WHQL drivers (that do continue to offer performance boosts for the GF-FX line).




Well, you answered your own question if you are smart enough to see it. 3dMark2001 is useless on modern graphics cards, because the test is ENTIRELY CPU bound at the moment. 3dMark03 is approaching that same place for the same reason, so it's cool they are about to put out a new version.

Um, you need to look at the post for the topic...he's asking about a GF3, not a ti200 (which was slower). LOL dood, we all know the ti200 is slower...but the vanilla gf3 <which is what he has> is faster than the fx5200.


DASHlT

P.S. that digit-life review also only has the ti200 with 64megs of ram...the gf3 vanilla also has 128 megs....not a VERY good compare at all LOL

P.P.S. Also, on this vga chart

http://graphics.tomshardware.com/graphic/20021218/vgacharts-05.html

the vanilla gf3 gets 800+ points more on 3dmark2k1 than a ti200 with 64megs of ram.
 
DASHlT said:
Um, you need to look at the post for the topic...he's asking about a GF3, not a ti200 (which was slower). LOL dood, we all know the ti200 is slower...but the vanilla gf3 <which is what he has> is faster than the fx5200.


DASHlT

P.S. that digit-life review also only has the ti200 with 64megs of ram...the gf3 vanilla also has 128 megs....not a VERY good compare at all LOL

Wow, you really are a little slow, aren't you?

1) Care to find me a link of a GF3 with 128mb of ram?

Oh, that's right, THEY NEVER MADE ONE. There were 128mb versions of the Ti200, but never of the regular GF3 (that was the start of marketing's discovery that people bought on 'mb of ram' numbers alone, rather than the actual performance of the card). Some manufacturers eventually did make 128mb Ti500s, but there were never many of them.

2) You don't remember the launch of the GF3 Ti series at all, do you? The Ti200 was the 'replacement' part for the regular GF3 - it was a *little* slower....very, very, little slower. Almost identical, though. The Ti500 was substantially faster.
 
dderidex said:
Wow, you really are a little slow, aren't you?

1) Care to find me a link of a GF3 with 128mb of ram?

Oh, that's right, THEY NEVER MADE ONE. There were 128mb versions of the Ti200, but never of the regular GF3 (that was the start of marketing's discovery that people bought on 'mb of ram' numbers alone, rather than the actual performance of the card). Some manufacturers eventually did make 128mb Ti500s, but there were never many of them.

2) You don't remember the launch of the GF3 Ti series at all, do you? The Ti200 was the 'replacement' part for the regular GF3 - it was a *little* slower....very, very, little slower. Almost identical, though. The Ti500 was substantially faster.

http://graphics.tomshardware.com/graphic/20021218/vgacharts-05.html

i found 1...i own a gf3 vanilla.....and THEY DID MAKE EM!!
 
dderidex said:
I SAID they made Ti200 and Ti500 128mbs, they did NOT make regular GF3 128mbs.

Hmmm (while looking at his pc with an nvidia gf3 (non-ti) with 128megs of ram) Ok, you're right, man.

DASHlT
 
How the hell do you get 370/600 clocks? Every GF5200 I've ever seen was like 250/400, and I never even got an FX5200 to OC past 300/500. Or has there been a revision to the cards since I got mine? (PNY was about 2 months after the FX line launched, Pixelview was about 5 months after that.)
 
知さん said:
Every GF5200 I've ever seen was like 250/400.
Yeah, that's the stock 5200 speed. The cores can run with passive cooling at 250MHz.

I have a PCI MD 5200 "Plus" that comes with a crystal-orb looking cooler and 4ns Samsung memory. It's called a 5200 Plus because the memory comes at 500MHz default. I've overclocked it to 300MHz core and 600MHz memory. The other cards from the same manufacturer i've owned before did up to 315MHz core and 630MHz memory, making it close to 5200 Ultra speeds.
 
ahhh, the culprit is that you all buy into 3dmark and futuremark. allow me to set myself up for some flames here, but it's time you boys felt a little cold hard truth - 3DMARK IS VERY OFFBALANCED! you can tweak that crap to tell you all sorts of amazing things you never knew, like that a 9500 can beat a 9800XT...right...or maybe it's that...the software is shit. just maybe. 2003 is better in some ways, but honestly, if you try the good old fashioned game performance test with a few high-end 3d games, you will find that 3dmark is only useful as a guide, not an absolute deciding factor in which card is king. it just got overrated because it was a cool idea when it came out, and everyone just assumed it was accurate, so now when you try to say that it's bull in some ways, you get a mass of bandwagon-jumping bastards arguing with you.


please dont bother trying to convince me that 3dmark "really is a good program" though, ive seen too many damned discrepancies to trust it anymore.
 
a 9500 can beat a 9800XT...right...or maybe its that...the software is shit.
heh my 9500 w/ 256 bit memory and 8 pipelines was owning STOCK 9800xt's left and right after my volt mod. I had it going at around 440 mhz, so yes it's possible.
 
holy shit, what have i started?? ;)

i found out this morning that it was the 64-bit edition, so i went and returned it. this was indeed the 'plus' model, as the stock speeds were 250/500 (although the front of the box said a max of 450... rivatuner said otherwise), but i returned it anyways. i didn't do any benchmarks, but it scored just over 1000 3dmarks in 3dmark2k3. i hope i made the right decision in returning it, but i'm not going to lose sleep over it.

i think ill pick up a 9800 or something similar, although i tend to prefer nvidia products.

continue flame war...
 
I think you made the right choice. Also, comparing 3dmark scores on different systems, with varying clock speeds, and using 03 which is DX9-biased, has got to be the stupidest thing I've ever seen.
 
I had a Gainward Geforce3, then I got a 6800GT and gave my dad the Geforce3 (he had a 5600XT before). From tests, the Geforce 3 outperformed the 5600XT, so I'm taking it that it will outperform a 5200.
 
archevilangel said:
well, I would say it only depends on whether it's the 64 bit or 128 bit ram 5200. a 5200 would kill a gf3 if it had 128 bit ram. http://graphics.tomshardware.com/graphic/20031229/images/image009.gif (eww, tom's, but it does show the 5200 putting out respectable numbers which I am sure are faster than a gf3's)

1. Toms sucks
2. Check your link before you post.
3. You're wrong. The 5200U is approx == GF4MX in overall speed, and the GF4MX < GF3.
 
dderidex said:
Mostly true - there are SOME games where the DX9 effects on the 5200 can be used (Doom 3, for example), and in those the GF3 is going to come up lacking.

I dunno WHAT anyone is smoking who claims a GF3 is faster in ANYTHING, though. Maybe they just don't know how to configure their PCs? Anyway...

A brief search on Futuremark's ORB shows right away the difference:
FX5200: 2088
GF3: 1514

Searching on various other review sites (FiringSquad, XBit, etc) shows similar results, although you have to compare across multiple reviews. The 5200 (128-bit version) IS faster than the GF3, period, end of story, in all cases.

Now, the 64-bit 5200....well....is a LOT slower. Usually, cutting part of something in a computer in half doesn't exactly halve performance...but that's pretty much what is happening here. The 64-bit 5200 may well be slower than a GF3 in every area, but the 'regular' 5200 is NOT.

1. Outside of synthetic benchmarks, I think you'll find the 5200 way behind. 3dmark 03 sucks.
2. Both of your 5200 scores are on the invalid drivers. Use the approved drivers to make the comparison valid.

EDIT: I'm not totally right.

http://www.digital-daily.com/video/nvidia-nv34/index03.htm
Puts the 5200 as slower than the MX480, badly...

http://www.bjorn3d.com/read.php?cID=279
I don't know about you, but those are pretty damn close to the numbers I got for my GF3.

They're about equal, I'd say, with the 5200 getting a knock for being a total piece of shit in design.
 