pxc said:You won't really gain anything. DX9 on the 5200 isn't fast enough to be usable, and both cards run DX8 (the 5200 has PS 1.0-1.4, the GF3 has PS 1.0-1.1). If you use anisotropic filtering and/or FSAA, the 5200 is way faster.
I'd pick the 5200 over the GF3.
++ QFT
0ldman said:I would be more convinced by 3DMark2001 scores, myself.
I've played with both on customers' systems; the GeForce 3 is a tad slower than a GeForce 4 Ti4200, and the 5200 is on par with a Radeon 9000.
More often than not it depends on the game, but the 5200 just doesn't feel as smooth. I did not try AA or AF, though.
dderidex said:Why would you be 'more convinced' by a test that is primarily CPU-bound instead of graphics-card limited? Both the GF3 and FX5200 score well over 10k in it on 'modern' systems!
In any case, check out the 3dMark03 compare links I provided - it breaks the score down by game test. Remember, even 3dMark03 isn't STRICTLY DX9 tests: game test 1 is primarily DX7, game tests 2 and 3 are DX8, and only game test 4 is DX9. The FX5200 whips the GF3 in all four tests on comparable systems!
As an aside, the FX5200 DOES tend to be a bit limited by its memory bus, which means you really can't do high-res gaming with it. Keep the resolution low (800x600 or lower), though, and its substantially more powerful core will let it pull ahead of the GF3 quite a bit.
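To put rough numbers on that memory-bus point, here's a back-of-the-envelope sketch. The 12 bytes of framebuffer traffic per pixel touch, the 2.5x overdraw factor, and the 6.4GB/s bandwidth figure are illustrative assumptions, and texture fetches (often the biggest bandwidth consumer) are ignored entirely:

```python
# Approximate framebuffer traffic per frame, and the frame-rate ceiling
# a fixed memory bandwidth imposes. All constants are assumptions for
# illustration; texture fetches are ignored.

BYTES_PER_TOUCH = 4 + 4 + 4   # assumed: color write + Z read + Z write
OVERDRAW = 2.5                # assumed average overdraw factor
BANDWIDTH_MB_S = 6400         # 6.4GB/s: 128-bit bus @ 400MHz effective

def frame_traffic_mb(width, height):
    """Approximate framebuffer memory traffic for one frame, in MB."""
    return width * height * OVERDRAW * BYTES_PER_TOUCH / 1e6

for w, h in [(640, 480), (800, 600), (1024, 768), (1280, 1024)]:
    mb = frame_traffic_mb(w, h)
    cap = BANDWIDTH_MB_S / mb   # fps ceiling from framebuffer traffic alone
    print(f"{w}x{h}: ~{mb:.1f} MB/frame -> ceiling ~{cap:.0f} fps")
```

Traffic grows linearly with pixel count, so dropping from 1280x1024 to 800x600 cuts the per-frame bandwidth demand by nearly two-thirds.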
No color or Z compression - Unlike the rest of the NV3X line, NV34 can't do color or Z compression. The lack of color compression should hamper the chip's performance primarily with antialiasing enabled, but the lack of Z compression will hurt across the board. Without advanced lossless compression schemes, NV34 doesn't make as efficient use of the bandwidth it has available, which reduces the chip's overall effective fill rate (or pixel-pushing power).
Z compression - Like the Radeon, the GeForce3 is capable of compressing and decompressing Z data on the fly. This info, which describes the depth of each pixel (its position on the Z axis), chews up a lot of memory bandwidth. NVIDIA claims a peak compression ratio of 4:1 on Z data, and that compression routine is "lossless," so visual fidelity isn't compromised.
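To make the compression point concrete, here's a hypothetical sketch of "effective" bandwidth. The 4:1 figure is NVIDIA's claimed *peak*; the share of traffic that is Z data and the average achieved ratio below are assumptions for illustration only:

```python
# Hypothetical effect of lossless Z compression on effective bandwidth.
# z_share (fraction of traffic that is Z data) and z_ratio (average
# achieved compression, below the 4:1 claimed peak) are assumptions.

def effective_bandwidth(raw_gb_s, z_share=0.4, z_ratio=2.0):
    """Raw bandwidth divided by the fraction of traffic remaining
    after compressing the Z portion at z_ratio:1."""
    remaining = (1 - z_share) + z_share / z_ratio
    return raw_gb_s / remaining

gf3 = effective_bandwidth(7.36)   # GF3: 128-bit @ 460MHz effective, Z compression
fx5200 = 6.4                      # FX5200 (NV34): no Z compression, raw only
print(f"GF3 effective: ~{gf3:.1f} GB/s vs FX5200 raw: {fx5200:.1f} GB/s")
```

Even under these modest assumptions, Z compression buys the GF3 a meaningful chunk of effective bandwidth that NV34, lacking the feature, never sees - the "hurt across the board" effect described above.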
archevilangel said:The GeForce 3 owns by having more memory bandwidth and Z compression. The only reason a 5200 scores higher on 3dMark03 is because it can complete the DirectX 9 tests.
m³ñ said:You just now figured that out?!
Do you have ANY idea how pathetic a 5200 will score in '03? IIRC, both of mine scored under 600 points... I don't think I could even get a score from the MX400 in '03, though.
archevilangel said:The GeForce 3 owns by having more memory bandwidth and Z compression. The only reason a 5200 scores higher on 3dMark03 is because it can complete the DirectX 9 tests.
dderidex said:That's pretty amazing, given that I just linked a 5200 above that scored over 2000 in 3dMark03. Hell, *I'm* scoring over 1400 in 3dMark03, and that's on a 64-bit version of the 5200! Pardon me while I stand in awe of your PC-optimizing capabilities.
Like I said, some of the people in this thread obviously have no idea how to configure their systems.
Hey, here's an idea, how about you try again and READ THE THREAD.
The 5200 outscores the GF3 because, as I pointed out above, it beats the GF3 in *every* *single* *test* - not just because it can complete the DX9 test while the GF3 can't (although that certainly plays a part). If you look at the scores I linked, you'll see that the FX5200 is faster in EVERY GAME TEST.
Here is another FX5200 that scores 2167 in 3dMark03. You can see the details on this one for the synthetic tests, and how it stacks up to the GF3.
Does it lose in multi-texturing? Well, yeah, of course - it's 4x1 instead of 4x2. But it has a MUCH more powerful core, and so wins everything else.
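Some quick fill-rate arithmetic behind that 4x1 vs. 4x2 point; the core clocks are the commonly cited stock speeds, taken here as assumptions:

```python
# Theoretical fill rates for a 4x1 part (FX5200) vs. a 4x2 part (GF3).
# Core clocks are commonly cited stock speeds, taken as assumptions.

def fill_rates(core_mhz, pipelines, tmus_per_pipe):
    pixel = core_mhz * pipelines                  # Mpixels/s, single-texturing
    texel = core_mhz * pipelines * tmus_per_pipe  # Mtexels/s, multi-texturing
    return pixel, texel

for name, mhz, pipes, tmus in [("FX5200", 250, 4, 1), ("GF3", 200, 4, 2)]:
    px, tx = fill_rates(mhz, pipes, tmus)
    print(f"{name}: {px} Mpixel/s single-texture, {tx} Mtexel/s multi-texture")
```

The 5200's higher clock wins the single-texture rate; the GF3's second TMU per pipe only pays off when games layer multiple textures per pass.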
DASHlT said:and 3dmark2k3 is a useless benchmark...no need to compare a DX9 card with a DX8 card, when the DX8 card WILL NOT BE ABLE to finish the DX9 part of the test (and even calling the FX5200 a DX9 card is stretching the truth)...so comparing the two cards with 3dmark2k3 is useless....use 3dmark2k1, then compare.
botreaper said:show me a 5200 that outscores that please.
Why would I even try? 3dMark2001 is completely CPU-limited at this level. The graphics card has virtually nothing to do with the score at that point. Hell, 6800s hardly get higher scores than that at that CPU speed!
botreaper10 said:OMG...you found a P4 Extreme Edition system that ran a 5200. How many more MHz is that running over my old rig? I believe it was somewhere around 1.3GHz more. I would bet that the GF3 would outscore it if it had a P4EE running it.
And I did a quick little search and found that the highest GF3 score for 3dMark01 was 15904, while the highest 5200 Ultra score was 14238. Tell me why the GF3 is getting the higher score when you say it is an inferior card? I'll tell you: the GF3 is a much better card. It outperforms the 5200 in every way I use my card for. I play FPS and RPG games on my computer.
DASHlT said:it's because it can run the DX9 test on 3dmark2k3, which the GF3 cannot run....plus the system setups are totally different...not a good comparison or argument on his part. We all know the GF3 is faster than an FX5200...
botreaper10 said:OMG...you found a P4 Extreme Edition system that ran a 5200. How many more MHz is that running over my old rig?.....Tell me why the GF3 is getting the higher score when you say it is an inferior card?
dderidex said:Okay, look, for the last time, IT DOES NOT MATTER IF THE 5200 IS DX9 OR NOT. It beats the GF3 in tests that have NOTHING TO DO WITH DX9 AT ALL!!!
Hell, take a look here. Digit-Life did a rundown of every card released from 1999-2003. Notice that the GF3 Ti200 loses in *every* *single* *test* to the FX5200? They test in Code Creatures, Serious Sam: Second Encounter, Return to Castle Wolfenstein, UT2k3, Unreal II, RightMark3d, and Splinter Cell.
Granted, the Ti200 usually lands *right* under the FX5200 in these tests - but then, they used the 53.03 nVidia drivers rather than the newer WHQL drivers (which do continue to offer performance boosts for the GF-FX line).
Well, you answered your own question, if you are smart enough to see it. 3dMark2001 is useless on modern graphics cards because the test is ENTIRELY CPU-bound at this point. 3dMark03 is approaching the same place for the same reason, so it's cool that they are about to put out a new version.
DASHlT said:Um, you need to look at the post for the topic...he's asking about a GF3, not a Ti200 (which was slower). LOL dood, we all know the Ti200 is slower...but the vanilla GF3 <which is what he has> is faster than the FX5200.
P.S. That Digit-Life review also only has the Ti200 with 64 megs of RAM...the vanilla GF3 has 128 megs....not a VERY good comparison at all LOL
dderidex said:Wow, you really are a little slow, aren't you?
1) Care to find me a link of a GF3 with 128mb of ram?
Oh, that's right, THEY NEVER MADE ONE. There were 128MB versions of the Ti200, but never of the regular GF3 (that was the start of marketing's discovery that people bought on 'MB of RAM' numbers alone, rather than the actual performance of the card). Some manufacturers eventually did make 128MB Ti500s, but there were never many of them.
2) You don't remember the launch of the GF3 Ti series at all, do you? The Ti200 was the 'replacement' part for the regular GF3 - it was a *little* slower...very, very slightly slower, almost identical. The Ti500 was substantially faster.
DASHlT said:http://graphics.tomshardware.com/graphic/20021218/vgacharts-05.html
I found 1...I own a vanilla GF3.....and THEY DID MAKE 'EM!!
dderidex said:I SAID they made 128MB Ti200s and Ti500s; they did NOT make 128MB regular GF3s.
m³ñ said:Every GF5200 I've ever seen was like 250/400.
Yeah, that's the stock 5200 speed. The cores can run with passive cooling at 250MHz.
a 9500 can beat a 9800XT...right...or maybe it's that...the software is shit.
heh, my 9500 w/ 256-bit memory and 8 pipelines was owning STOCK 9800XTs left and right after my volt mod. I had it going at around 440MHz, so yes, it's possible.
archevilangel said:Well, I would say it only depends on whether it's the 64-bit or 128-bit RAM 5200. The 5200 would kill a GF3 if it had 128-bit RAM. http://graphics.tomshardware.com/graphic/20031229/images/image009.gif (eww, Tom's, but it does show the 5200 putting out respectable numbers, which I am sure are faster than a GF3's)
dderidex said:Mostly true - there are SOME games where the DX9 effects on the 5200 can be used (Doom 3, for example), in which case the GF3 is going to come up lacking.
I dunno WHAT anyone is smoking who claims a GF3 is faster in ANYTHING, though. Maybe they just don't know how to configure their PCs? Anyway...
A brief search on Futuremark's ORB shows right away the difference:
FX5200: 2088
GF3: 1514
Searching on various other review sites (FiringSquad, XBit, etc) shows similar results, although you have to compare across multiple reviews. The 5200 (128-bit version) IS faster than the GF3, period, end of story, in all cases.
Now, the 64-bit 5200....well....is a LOT slower. Usually, cutting part of a computer component in half doesn't exactly halve performance...but that's pretty much what happens here. The 64-bit 5200 may well be slower than a GF3 in every area, but the 'regular' 5200 is NOT.
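The bus-width arithmetic behind that last point, as a quick sketch (400MHz effective memory clock is the stock figure mentioned earlier in the thread):

```python
# Memory bandwidth scales linearly with bus width, which is why the
# 64-bit FX5200 starts at exactly half the 128-bit card's throughput.

def bandwidth_gb_s(bus_bits, effective_mhz):
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(f"128-bit FX5200: {bandwidth_gb_s(128, 400):.1f} GB/s")  # -> 6.4 GB/s
print(f" 64-bit FX5200: {bandwidth_gb_s(64, 400):.1f} GB/s")   # -> 3.2 GB/s
```

And since NV34 has no Z or color compression to claw any of that back, the 64-bit card really does run at half throughput.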