5200 vs gf3 - quick and dirty question

SlimShady said:
i found out this morning that it was the 64-bit edition, so i went and returned it.
Too bad you already returned it. I was going to ask about the packaging. :(

CompUSA carries 2 versions of the card, an old version that is the "good" one and a newer crummy one (VGA port on a ribbon cable). I wanted to ask if the back of the box compared the 5200 to the 9200SE or not. I'm pretty sure that's the lame version. The old box compared to another 5200 and a 7500.
 
I believe a Ti 200 ran at 175/400 stock. My Gainward Golden Sample runs at 250/500 on stock cooling, no volt mods or anything. I'm pretty sure it's a much better card than a 5200. I can play Painkiller at 1024x768 with all the options turned up as high as they go, and it's smooth as silk (with a few very rare exceptions).
 
This is becoming a very funny and competitive topic. Alright, screw 3DMark, it is a benchmark, not a game. I have a GeForce 3 and a 5200, and the GF3 does better in games. For example, even though it's an older game and not an FPS, I play GTA3 once in a while. I can run everything in GTA3 maxed at 1280x1024 and it's smooth, I mean really smooth, with my GeForce 3, while I have to bump everything down a notch with the 5200. Even when the GeForce 3 is running stock and the 5200 is overclocked.

Okay, that's one game; it was the same story constantly in others, with the 5200 being slower. The only thing the 5200 did better in was 3DMark03 (not better in 01, though). Wow, that's a fun game, 3DMark03, I could watch it all day. Hey guys, I may not get the frames in the games, but look at 3DMark03, it beats the GF3, whoop-de-do.

And maybe I don't know how to configure my system to the max, but if the GF3 beats the 5200 before I have to spend time tinkering with my system, then guess what, that is a con for the 5200. Trust me, I spend time testing cards on all sorts of systems and I am currently working on a video card model guide on the [H]. Does this make my point stronger? No, not really, but I know what I know.

Oh yeah, and to answer the original question: no, they are too close for going from a GF3 to a 5200 to be worth it anyway. It's time to get out of these slower cards and into a cheap 9500, a cheap GF4 Ti, or a better card that can play Doom 3 and other newer games decently.
 
Ok, I had a visiontek gf3 ti200 in my brother's box. It's a 100% kickass card. Had it for 2 years, and it still is kicking ass. My best 2k1 score on it is 12k or so, can't prove it as it's long gone to make room for my other scores, but i'm sure you won't think i'm lying heh. Anyways, my friend's roommate had a 5200 in his computer, it ran ut2k3 worse than my geforce 3 did. And it was set to all medium settings. The gf3 could run high no problem with a few minor slowdowns.
 
Pick up a Ti4x00 from ebay... it'll be faster than the gf3 and give you a nice boost :)
 
pxc said:
Too bad you already returned it. I was going to ask about the packaging. :(

CompUSA carries 2 versions of the card, an old version that is the "good" one and a newer crummy one (VGA port on a ribbon cable). I wanted to ask if the back of the box compared the 5200 to the 9200SE or not. I'm pretty sure that's the lame version. The old box compared to another 5200 and a 7500.

it compares it to the 9200 on the back. it says it outperforms the 9200, and makes a good upgrade from the 9000. of course, it outperforms by like maybe 50 3dmarks, but oh well.

so yes, this is the lame version :) it was the one in the flyer this past weekend. it seemed like a good deal at the time, but of course i didn't have benchmarks to compare it against while standing in the store.
 
dderidex said:
Well, you answered your own question if you are smart enough to see it. 3dMark2001 is useless on modern graphics cards, because the test is ENTIRELY CPU bound at the moment. 3dMark03 is approaching that same place for the same reason, so it's cool they are about to put out a new version.

11000 points is cpu bound? Bullshit. If that were so, I wouldn't have 22000 on my 6800GT, a jump of 6000 points from my 9800 Pro on the SAME cpu.
 
Heh... I, uh, posted earlier using AquaMark as an example of one way where my MX400 beat my FX 5200. :p I also got much crappier framerates in CMR3, had issues with things even running in UT2k3, and for some reason, UT GotY ran really well on the FX5200.

The machine I used to benchmark the 5200s, MX400, GTS, and my new FX 5900 was as follows:
1900+ (stock speeds)
512 megs of generic crap PC2700
FIC AU13 mobo

I upgraded to a 2500+ (oced to 2.3 ghz), scores didn't change.
 
i've had 5200, mx440 (prob 4 diff ones), gf3 (3 diff ones).

with gf fx5200 i only had one and got rid of it asap. there was a good reason for that. fxkin shit performance.

mx440 was slightly faster than 5200. gf3 is way faster than 5200.
there is one clueless fxxker in this thread talkin shxt as usual..

look at these;
http://graphics.tomshardware.com/graphic/20030120/vgacharts-02.html#aquanox
u can see that gf3 is about same or bit faster than r8500 in most bench.

http://graphics.tomshardware.com/graphic/20031229/vga-charts-04.html
r8500 is way faster than fx5200 in most bench.

and let's not forget most 5200s are 64-bit, and they are also way underclocked at stock.
tom tested the 128-bit version running at full speed.
 
Chaballaman said:
look at these;
http://graphics.tomshardware.com/graphic/20030120/vgacharts-02.html#aquanox
u can see that gf3 is about same or bit faster than r8500 in most bench.

http://graphics.tomshardware.com/graphic/20031229/vga-charts-04.html
r8500 is way faster than fx5200 in most bench.

Yeah, after all, it's not like there isn't an ENTIRE YEAR between those two benchmarks. :rolleyes: The 8500 drivers were crap for a VERY long time after launch. If you'll look at the second chart you just posted, you'll see the 8500 (with later drivers) was actually faster than the GeForce4 Ti4200.

Are you going to argue now that the GF3 is faster than the GF4 Ti 4200 based on the same logic as quoted above? After all, if the GF3 is faster than the 8500 (chart 1) and the 8500 is faster than the GF4 (chart 2), then the GF3 MUST be faster than the GF4, huh? Or, maybe, admit you are wrong?
 
dderidex said:
Yeah, after all, it's not like there isn't an ENTIRE YEAR between those two benchmarks. :rolleyes: The 8500 drivers were crap for a VERY long time after launch. If you'll look at the second chart you just posted, you'll see the 8500 (with later drivers) was actually faster than the GeForce4 Ti4200.

Are you going to argue now that the GF3 is faster than the GF4 Ti 4200 based on the same logic as quoted above? After all, if the GF3 is faster than the 8500 (chart 1) and the 8500 is faster than the GF4 (chart 2), then the GF3 MUST be faster than the GF4, huh? Or, maybe, admit you are wrong?

dood just admit you're wrong..... can't ya live with that? or is that tough to deal with? a GF3 is better than an FX5200.... you are the ONLY one who thinks otherwise

DASHlT
 
Geforce 3 should be faster than the regular FX5200, and pretty close with the Ultra depending on the game. The Geforce 3 had 2 TMUs per pipeline and the FX had 1 TMU.

Geforce 3 had more texel fill rate than the FX5200 or the Ultra.
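As a rough sanity check on that TMU argument, theoretical texel fillrate is just core clock x pixel pipelines x TMUs per pipe. The clocks and pipeline counts below are commonly cited specs for these parts, not numbers from this thread, so treat this as a hedged back-of-the-envelope sketch:

```python
# Hypothetical sketch: theoretical texel fillrate = core clock * pixel pipelines * TMUs/pipe.
# Clock and pipeline figures are commonly cited specs, not taken from this thread.
cards = {
    "GeForce3 Ti200": (175e6, 4, 2),  # 4 pipes, 2 TMUs each
    "GeForce3":       (200e6, 4, 2),
    "FX 5200":        (250e6, 4, 1),  # 4 pipes, 1 TMU each
}
for name, (clock, pipes, tmus) in cards.items():
    print(f"{name}: {clock * pipes * tmus / 1e9:.2f} Gtexels/s")
```

Even at a higher core clock, the single TMU per pipe leaves the FX 5200 behind both GeForce3 variants in raw texturing throughput, which matches the fill rate claim above.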
 
DASHlT said:
dood just admit you're wrong..... can't ya live with that? or is that tough to deal with? a GF3 is better than an FX5200.... you are the ONLY one who thinks otherwise

DASHlT
i hope he can't...

ask nvidia, even they'll tell u the gf3 is faster than the fx5200.
 
man i have this clueless guy who keeps buttin in on my threads.. talkin bs.. startin to piss me off. :( and as you can see from this thread, this guy never admits that he's wrong; according to him, his bs is always right. lol.. i haven't seen any morons in this forum, except for one newbie who posted some sick photos.. but i think there is one pest in this forum.
 
Chaballaman said:
man i have this clueless guy who keeps buttin in on my threads.. talkin bs.. startin to piss me off. :( and as you can see from this thread, this guy never admits that he's wrong; according to him, his bs is always right. lol.. i haven't seen any morons in this forum, except for one newbie who posted some sick photos.. but i think there is one pest in this forum.

He stopped after another thread where I pointed out that his own source showed the GF3 beating the 5200 in one more benchmark.
 
lopoetve said:
He stopped after another thread where I pointed out that his own source showed the GF3 beating the 5200 in one more benchmark.
#1) I never stopped, I just didn't see the point in replying to it, since you obviously picked the weakest FX5200 in the list to compare against the GF3 to prove your point. Not worth carrying on an argument over. Unlike Chab, who doesn't seem to have anything better to do than flaming, if I don't see a point in continuing an argument, I don't.

#2) Your contention from the start of the thread was that a GF3 is faster than a 'plain' FX 5200 non-ultra. The link I provided proved that yes, some *regular* FX 5200s are, indeed, faster than GF3s nearly across the board (that means winning in almost every test). If you use the slower-clocked ones, or the 64-bit version, sure, they are slower. That doesn't mean EVERY FX5200 is slower, sorry.

#3) There are OTHER considerations when buying a graphics card. AFAIK, BFG was not manufacturing cards when the GF3 was out, and no other manufacturer has a 'true lifetime' warranty....so all the GF3s are out of their warranty period by now. One breaks...you are out of your money. If you buy an FX5200 and it breaks...hey, guess what, you get a replacement!
 
dderidex said:
#1) I never stopped, I just didn't see the point in replying to it, since you obviously picked the weakest FX5200 in the list to compare against the GF3 to prove your point. Not worth carrying on an argument over. Unlike Chab, who doesn't seem to have anything better to do than flaming, if I don't see a point in continuing an argument, I don't.

#2) Your contention from the start of the thread was that a GF3 is faster than a 'plain' FX 5200 non-ultra. The link I provided proved that yes, some *regular* FX 5200s are, indeed, faster than GF3s nearly across the board (that means winning in almost every test). If you use the slower-clocked ones, or the 64-bit version, sure, they are slower. That doesn't mean EVERY FX5200 is slower, sorry.

#3) There are OTHER considerations when buying a graphics card. AFAIK, BFG was not manufacturing cards when the GF3 was out, and no other manufacturer has a 'true lifetime' warranty....so all the GF3s are out of their warranty period by now. One breaks...you are out of your money. If you buy an FX5200 and it breaks...hey, guess what, you get a replacement!

Depends on the program; the FX 5200 really has issues with shader performance, whereas it is just about equal or a bit ahead in polygon transformations, but only when using the DX 8.1 path.
 
rancor said:
Depends on the program; the FX 5200 really has issues with shader performance, whereas it is just about equal or a bit ahead in polygon transformations, but only when using the DX 8.1 path.

Hence the point of my linking this review earlier in the thread. Can you find FX5200 models slower than a GF3? Sure! But, then, the FX5200 was designed as the entry-level budget card of its time, and there is a WIDE range of models of it! There are some really, REALLY slow FX5200s! But, then, you can also buy those brand-new with the warranty still good for $40.

A standard 'real' FX5200nu (not lower clocked, not 64-bit), that you can find on Newegg or Pricewatch, will outscore a GF3 in most game tests, as that review shows.
 
they only used a GF3 Ti200, and that was very close to the 5200; the regular GF3 or the Ti500 would be faster in most situations.
 
rancor said:
they only used a GF3 Ti200, and that was very close to the 5200; the regular GF3 or the Ti500 would be faster in most situations.
Code:
	Fillrate	Ops/sec
	--------	-------
Ti500	3.84 billion	960 billion
GF3	3.2 billion	800 billion
Ti200	2.8 billion	700 billion
The 'regular' GF3 was a LITTLE faster than Ti200. The Ti500 being much faster than a regular GF3. It'd probably tie an FX5200 in all tests. But to say it is a 'better card' than the FX5200 is quite the stretch.
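For what it's worth, the relative gaps in that quoted table are easy to work out. The figures below are just the table's own numbers; only the percentage arithmetic is new, as a sketch:

```python
# Fillrate figures (billions) as quoted in the table above; percentages relative to the Ti200.
specs = {"Ti500": 3.84, "GF3": 3.2, "Ti200": 2.8}
base = specs["Ti200"]
for name, fillrate in specs.items():
    print(f"{name}: +{(fillrate / base - 1) * 100:.0f}% vs Ti200")
```

So on paper the regular GF3 sits about 14% ahead of the Ti200 and the Ti500 about 37% ahead, which fits calling the regular card only a LITTLE faster.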
 
dderidex said:
Code:
	Fillrate	Ops/sec
	--------	-------
Ti500	3.84 billion	960 billion
GF3	3.2 billion	800 billion
Ti200	2.8 billion	700 billion
The 'regular' GF3 was a LITTLE faster than Ti200. The Ti500 being much faster than a regular GF3. It'd probably tie an FX5200 in all tests. But to say it is a 'better card' than the FX5200 is quite the stretch.


You can't go by fillrates and ops/sec.

If that were the case, the FX 5900 would destroy the 9800s.

Shader performance has a lot to do with which card is better; with higher shader performance there is a roughly proportional increase in end performance at higher fill rates.

And this is where the FX 5200 falls down: with its poor shader performance there is a proportional decrease in its performance compared to the previous two lines.


http://www.xbitlabs.com/articles/video/display/nv31-nv34_10.html

Really, it's only about equal to a GF4 MX440.
 
dderidex said:
#1) I never stopped, I just didn't see the point in replying to it, since you obviously picked the weakest FX5200 in the list to compare against the GF3 to prove your point. Not worth carrying on an argument over. Unlike Chab, who doesn't seem to have anything better to do than flaming, if I don't see a point in continuing an argument, I don't.

#2) Your contention from the start of the thread was that a GF3 is faster than a 'plain' FX 5200 non-ultra. The link I provided proved that yes, some *regular* FX 5200s are, indeed, faster than GF3s nearly across the board (that means winning in almost every test). If you use the slower-clocked ones, or the 64-bit version, sure, they are slower. That doesn't mean EVERY FX5200 is slower, sorry.

#3) There are OTHER considerations when buying a graphics card. AFAIK, BFG was not manufacturing cards when the GF3 was out, and no other manufacturer has a 'true lifetime' warranty....so all the GF3s are out of their warranty period by now. One breaks...you are out of your money. If you buy an FX5200 and it breaks...hey, guess what, you get a replacement!

1. Wrong. I showed the results against ALL of the FX5200s in the list. The wins for the GF3 were wins over ALL of the 5200s (128-bit), except the Ultra. The losses were generally to one or two of the 5200s.

2. No, you showed that for some programs the 5200 was faster, and I showed that for many others (more, in fact), the GF3 was faster, and normally by a much larger margin. Sorry.

3. Other considerations were never a part of this discussion, don't try to bring them in to sway your point. We were discussing SPEED, nothing else. The GF3 is faster in more things than the 5200.
 
dderidex said:
Code:
	Fillrate	Ops/sec
	--------	-------
Ti500	3.84 billion	960 billion
GF3	3.2 billion	800 billion
Ti200	2.8 billion	700 billion
The 'regular' GF3 was a LITTLE faster than Ti200. The Ti500 being much faster than a regular GF3. It'd probably tie an FX5200 in all tests. But to say it is a 'better card' than the FX5200 is quite the stretch.

The 'regular' was about 5% faster for normal ops, IIRC.
 
dderidex said:
Hence the point of my linking this review earlier in the thread. Can you find FX5200 models slower than a GF3? Sure! But, then, the FX5200 was designed as the entry-level budget card of its time, and there is a WIDE range of models of it! There are some really, REALLY slow FX5200s! But, then, you can also buy those brand-new with the warranty still good for $40.

A standard 'real' FX5200nu (not lower clocked, not 64-bit), that you can find on Newegg or Pricewatch, will outscore a GF3 in most game tests, as that review shows.

WRONG AGAIN.

To quote from the other thread:

dderidex said:
I've yet to see a 'brand new' Ti4200 for $50 was all I was saying. You CAN get a brand new, unopened, warranty still not even started yet, FX5200 for $50. That matters to some people. (And the GF3 is NOT faster than the FX5200, despite what some on the forum seem to think).

Still, the 4200 is a better buy overall, even if you do have to get it used or refurb or anything else - it's just that much faster. And he's talking about a 4400, which is even faster.

And thanks for the link. Did you read it? The results:
Code Creatures -> 5200, by 3 fps
Serious Sam SE -> GF3, by 40 fps (by 30 fps at 1024x768)
RTCW -> GF3, by 40 fps (30 fps at 1024x768)
UT2k3 -> beaten by the top 5200, but beats the others by 15 fps; at 1024x768 it beats all but the Ultra
Unreal 2 -> 5200, by 10 fps (5 at 1024x768)
Splinter Cell -> same as UT2k3, but less than 1 fps difference at 1024x768

So, SOME 5200s can match (or barely beat) the GF3 in some games, but when the GF3 wins, it does so by a LARGE margin (30+ fps). And that's the Ti200, unoverclocked, not the regular GF3 or the Ti500.

2 for the 5200, 2 ties, 3 for the GF3. If you use the faster 5200s for the comparisons, the GF3 still wins the ones it won, but by a smaller margin. The 5200's wins are unchanged, and the ties are still ties (mixed win/loss).

GF3 wins.

Argument over, from your OWN source. Thanks for playing.
 
You must have missed some FX5200 scores, then?

From what I read:

Code Creatures
FX 5200: 19
GF3: 9.3
Win: FX 5200 (stomps the GF3 with more than twice its score!)

Serious Sam: TSE 800x600
FX 5200: 145.7
GF3: 138.9
Win: FX 5200

Serious Sam: TSE 1024x768
FX 5200: 98.5
GF3: 94.6
Win: FX 5200

Return to Castle Wolfenstein: 800x600
FX 5200: 133.1
GF3: 130.0
Win: FX 5200

Return to Castle Wolfenstein: 1024x768
FX 5200: 83.4
GF3: 85.0
Win: HOLY CRAP! The GF3 finally wins one!!!

Unreal Tournament 2003: 800x600
FX 5200: 68.3
GF3: 64.8
Win: FX 5200

Unreal Tournament 2003: 1024x768
FX 5200: 47.2
GF3: 47.6
Win: GF3 (by a WHOPPING .4 FPS! And, at these framerates, the game is only 'playable' at 800x600, where the FX5200 wins)

Unreal II: 800x600
FX 5200: 39.5
GF3: 33.5
Win: FX 5200

Unreal II: 1024x768
FX 5200: 30.0
GF3: 27.3
Win: FX5200

RightMark3d: 800x600
FX 5200: 40.3
GF3: 16.1
Win: FX 5200 (stomps on the GF3 again - working on triple its score!)

RightMark3d: 1024x768
FX 5200: 35.2
GF3: 15.0
Win: FX 5200

Splinter Cell: 800x600
FX 5200: 20.0
GF3: 15.9
Win: FX 5200

Splinter Cell: 1024x768
FX 5200: 17.9
GF3: 14.0
Win: FX 5200

Scores taken as a Prolink PixelView GeForce FX 5200 vs an ABIT Siluro GF3 Ti200.

Total Wins
FX 5200: 11
GF3: 2

Seems like a pretty clean sweep for the FX5200 over a Ti200, doesn't it? Now, granted, as noted, the 'regular' GF3 manages a good 5-10% performance difference over the Ti200 in certain cases, which is plenty enough to pull ties out of virtually all of the tests, even if it doesn't really get any more wins.

Finally, as I had already pointed out previously, the FX 5200 wins over the regular GF3 in Unreal Tournament 2003 and 3dMark03, as well.

I don't know how you can argue the GF3 is a 'better' card! Equal, MAYBE, but surely not 'better'! And the FX 5200 will have better manufacturer support, ship with newer software and games, and let's not forget have dual-head capability! The GF3 only has a single RAMDAC, so can only output to one monitor at a time!
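The win tally above can be reproduced mechanically from the quoted review numbers. The scores in this sketch are copied straight from the post; only the counting logic is new:

```python
# Prolink PixelView FX 5200 vs ABIT Siluro GF3 Ti200, fps as quoted above.
scores = {  # test: (fx5200, gf3)
    "Code Creatures":        (19.0, 9.3),
    "SS:TSE 800x600":        (145.7, 138.9),
    "SS:TSE 1024x768":       (98.5, 94.6),
    "RTCW 800x600":          (133.1, 130.0),
    "RTCW 1024x768":         (83.4, 85.0),
    "UT2003 800x600":        (68.3, 64.8),
    "UT2003 1024x768":       (47.2, 47.6),
    "Unreal II 800x600":     (39.5, 33.5),
    "Unreal II 1024x768":    (30.0, 27.3),
    "RightMark3D 800x600":   (40.3, 16.1),
    "RightMark3D 1024x768":  (35.2, 15.0),
    "Splinter Cell 800x600": (20.0, 15.9),
    "Splinter Cell 1024x768": (17.9, 14.0),
}
fx_wins = sum(fx > gf3 for fx, gf3 in scores.values())
gf3_wins = sum(gf3 > fx for fx, gf3 in scores.values())
print(f"FX 5200: {fx_wins}, GF3: {gf3_wins}")  # matches the 11-2 tally above
```

Same 11-2 result, so the disagreement upthread is really about which 5200 model and which GF3 you compare, not about the arithmetic.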
 
dderidex said:
You must have missed some FX5200 scores, then?

From what I read:

Code Creatures
FX 5200: 19
GF3: 9.3
Win: FX 5200 (stomps the GF3 with more than twice its score!)

Serious Sam: TSE 800x600
FX 5200: 145.7
GF3: 138.9
Win: FX 5200

Serious Sam: TSE 1024x768
FX 5200: 98.5
GF3: 94.6
Win: FX 5200

Return to Castle Wolfenstein: 800x600
FX 5200: 133.1
GF3: 130.0
Win: FX 5200

Return to Castle Wolfenstein: 1024x768
FX 5200: 83.4
GF3: 85.0
Win: HOLY CRAP! The GF3 finally wins one!!!

Unreal Tournament 2003: 800x600
FX 5200: 68.3
GF3: 64.8
Win: FX 5200

Unreal Tournament 2003: 1024x768
FX 5200: 47.2
GF3: 47.6
Win: GF3 (by a WHOPPING .4 FPS! And, at these framerates, the game is only 'playable' at 800x600, where the FX5200 wins)

Unreal II: 800x600
FX 5200: 39.5
GF3: 33.5
Win: FX 5200

Unreal II: 1024x768
FX 5200: 30.0
GF3: 27.3
Win: FX5200

RightMark3d: 800x600
FX 5200: 40.3
GF3: 16.1
Win: FX 5200 (stomps on the GF3 again - working on triple its score!)

RightMark3d: 1024x768
FX 5200: 35.2
GF3: 15.0
Win: FX 5200

Splinter Cell: 800x600
FX 5200: 20.0
GF3: 15.9
Win: FX 5200

Splinter Cell: 1024x768
FX 5200: 17.9
GF3: 14.0
Win: FX 5200

Scores taken as a Prolink PixelView GeForce FX 5200 vs an ABIT Siluro GF3 Ti200.

Total Wins
FX 5200: 11
GF3: 2

Seems like a pretty clean sweep for the FX5200 over a Ti200, doesn't it? Now, granted, as noted, the 'regular' GF3 manages a good 5-10% performance difference over the Ti200 in certain cases, which is plenty enough to pull ties out of virtually all of the tests, even if it doesn't really get any more wins.

Finally, as I had already pointed out previously, the FX 5200 wins over the regular GF3 in Unreal Tournament 2003 and 3dMark03, as well.

I don't know how you can argue the GF3 is a 'better' card! Equal, MAYBE, but surely not 'better'! And the FX 5200 will have better manufacturer support, ship with newer software and games, and let's not forget have dual-head capability! The GF3 only has a single RAMDAC, so can only output to one monitor at a time!

What clocks on that 5200? I have that same card, stock clocks were 250/400.

Oh, one other question... How many of those games had issues? I know in CMR3, my MX400 would usually run at 1024x768 with the options turned to the middle, and it would run smoothly at 30-35 fps... however, my PNY 5200 would run really choppy at 45 fps (unplayably choppy) with the exact same settings. My PixelView 5200 would run less choppy, but unless I turned the resolution down, it was impossible to get rid of the mad chop. This happened on more than 5 different sets of drivers, including beta drivers.

Also, in Flight Simulator 2002, the MX400 would only get 25 fps but it would be playable (barely). The 5200 would get 35 fps, but would pause in dense areas. I had to turn a number of features off (mostly DX9 related) to get rid of that.
 
The 5200 is quite frankly the worst card in Nvidia's lineup EVER. Not for performance reasons, but because its name is so misleading. People think that it's like the Ti4200 was. It gets mistaken for a higher-end part.

I've heard salesmen tell customers that the 5200 Ultra 256MB is a badass card, at which point I must correct them and tell them that a sub-$100 card is not going to play Doom III well.

Plainly, the Ti4200 kicks its ass. People assume that since 5200 is a bigger number, it must be better.

The words Ultra and 256MB fool people into thinking that it's a good product and a gamer's card, when it's not. It performs only slightly better, and in some cases worse, than the Geforce 4 MX440, which is essentially a Geforce 2 core.

The 5200 in any flavor is a piece of shit. It's not good at anything, but it has a lot of features that are executed so poorly that the card is barely usable for anything other than 2D work.

With prices for cards such as the Ti4600 and ATi Radeon 9600 being less than $100, I think the Geforce 5200 is a poor choice. For anyone.

In short, just about every card save for the MX4000 is better than that piece of shit.
 
rancor said:
http://www.sharkyextreme.com/hardware/videocards/article.php/3211_925271__7


There is quite a big difference between the gf 3 ti 200 and the gf 3 when you get to higher resolutions.

http://www.hardocp.com/article.html?art=MTI2LDQ=

There is a good 10 fps difference between the ti 200 and a regular gf 3, at 1024x768, the GF 3 will win out in all benchmarks over the 5200 ultra going by that.

Bingo.

And dderidex, 3dmark:
1, isn't a game,
2, is useless for performance comparisons between architectures/designs, and
3, is stupid. I count it for NOTHING.

I'd take the good old GF3 over the 5200 any day.

And indeed, go with the old GF3, and it's more than fast enough.
 
lopoetve said:
Bingo.

And dderidex, 3dmark:
1, isn't a game,
2, is useless for performance comparisons between architectures/designs, and
3, is stupid. I count it for NOTHING.

I'd take the good old GF3 over the 5200 any day.

And indeed, go with the old GF3, and it's more than fast enough.

Kinda funny, since most of the experience behind my accusations against the FX5200 came not only from 3DMark, but from games. Not only did both of my 5200s score exceptionally poorly in 3DMark, they also turned in abhorrent scores in AquaMark, had exceptionally bad framerates, and were always a pain in the ass to game on.
 
知さん said:
Kinda funny, since most of the experience behind my accusations against the FX5200 came not only from 3DMark, but from games. Not only did both of my 5200s score exceptionally poorly in 3DMark, they also turned in abhorrent scores in AquaMark, had exceptionally bad framerates, and were always a pain in the ass to game on.

Well, sometimes it's right :p
 
dderidex said:
You must have missed some FX5200 scores, then?
...
Return to Castle Wolfenstein: 1024x768
FX 5200: 83.4
GF3: 85.0
Win: HOLY CRAP! The GF3 finally wins one!!!
:D

What people "know" about the 5200 doesn't always mesh with reality. Sure some people got crummy 64-bit memory bus 5200 cards (do you judge a TNT2 by how a TNT2 M64 performs... or a real 9600Pro by how a 9600Pro EZ performs?), or got ripped off by buying an overpriced 256MB 5200. Then there's the majority who never owned one, but "heard" about it. I like the 5200. When it's often on sale at retail, it's the best budget gaming card available for $50.

Yes, it's a very weak DX9 card. No, it's not the worst (the x300SE gets that title... plus the x300SE is more expensive). DX8.x-level performance is actually pretty good on the 5200.

-------
http://www.hardforum.com/showthread.php?t=813082 <--- 5200 PCI (generally much slower than the AGP version) benchmarks in Doom3/various Q3 engine games, 3DMark2001SE/03/aquamark, Far Cry, CS:S VST, TR:AOD, UT2003, Halo at stock speed and overclocked.

The results pretty much speak for themselves. IMO, it's pretty impressive for a $50 card. I was playing Doom3 with all effects enabled at 512x384 with the ARB2 path and 2xAF pretty well. It also plays CS:S decently (DX8 path is forced by Valve) at 800x600 with all effects enabled (reflect world, trilinear default) and Far Cry also plays well with default quality settings (medium) at 800x600.
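The 64-bit vs 128-bit point deserves emphasis: halving the memory bus halves peak bandwidth. A minimal sketch, assuming the commonly quoted 400 MHz effective memory clock for the FX 5200 (an assumption, not a figure from this post):

```python
def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock * bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Assumed 400 MHz effective DDR for both variants; only the bus width differs.
print(bandwidth_gbs(400, 128))  # full 128-bit FX 5200: 6.4 GB/s
print(bandwidth_gbs(400, 64))   # cut-down 64-bit card: 3.2 GB/s
```

That 2x bandwidth gap is why benchmarks of a 128-bit card say little about the 64-bit versions many people actually got stuck with.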
 
pxc said:

You get what you pay for... Sometimes less. :p

All I can say is, I was expecting a big difference in performance when I upgraded from my MX400 to a 5200. I can honestly say I was really disappointed. Compared to my friend's 5600 and GTS, my 5200 just didn't have any oomph. It had more power than a GTS, but it spread it out too much. They should have never made the 5200. Or at least left some of the DX9 stuff out.

They have a 5700 and 5500 now... What about a 5400? How about a 5300?
 
You know what? It's fucking moot. The card sucks. For $50 you can get a Ti4200, which is hella faster, period.

And pxc, most people with a 5200 won't have an Athlon64 @ 2.4GHz, etc. On a more midrange system, the other benchmarks show the GF3 winning more often than the 5200, but it doesn't matter. The Ti4200 is faster and just as cheap.
 
lopoetve said:
You know what? It's fucking moot. The card sucks. For $50 you can get a Ti4200, which is hella faster, period.

And pxc, most people with a 5200 won't have an Athlon64 @ 2.4GHz, etc. On a more midrange system, the other benchmarks show the GF3 winning more often than the 5200, but it doesn't matter. The Ti4200 is faster and just as cheap.
Where do you find Ti4200s for $50? Especially with a warranty?

The video card was the limiting factor in virtually all my game tests (UT2003 botmatch in lower resolution is CPU limited, of course). That 5200 PCI is permanently installed into a P4 2.4GHz now and game performance is almost identical. Synthetic tests got different scores, but those aren't games.

Pwnt.
 
pxc said:
Where do you find Ti4200s for $50? Especially with a warranty?
They aren't really worth arguing with. He and Chab both seem to have their intarw3b p3n0s size determined by how crappy the FX5200 is, so they are constantly bashing it in every thread it comes up in.

It's not a *great* card by any stretch of the imagination, but it IS a *good* card, and if Joe Consumer said he was going to Best Buy tomorrow and wanted to know the cheapest graphics card he could buy that could do minimal amounts of gaming acceptably, I wouldn't lose any sleep recommending an FX 5200 wholeheartedly.

Remember that Joe Consumer doesn't even know what FSAA *IS*, he's CERTAINLY never going to use it whether he drops $50 on a graphics card, or $500 (no, really, has anybody here worked at a retail computer shop? Yes, people DO come in and drop $500 on a GeForce 6800 Ultra and NEVER USE FSAA because they DON'T KNOW WHAT IT IS! They just know they have buckets of money and want to buy 'the best there is' and don't even care what they are really getting.) And, no, he doesn't really care if he gets 80fps or 60fps or 30fps. Hell, he doesn't even know what a 'frame per second' IS!

Joe Consumer-on-a-budget would have no problem with an FX5200. It can't do FSAA well, but he doesn't care, he'll never use it. He is used to Playstations, which do 640x480 graphics (sort of) on a TV screen. Something that can do 800x600 on his monitor is going to look godly in comparison, and there isn't a game out (well, maybe Far Cry) that can't auto-detect detail settings that make it playable and enjoyable at 800x600 on an FX5200.

Yes, a Ti4200 is faster for the same price, but:
A) Virtually all of them have had their warranties run out by now - something Joe Consumer is KEENLY interested in.
B) Good luck finding one retail. Only way to get them is online, and that's something else 90% of people are unwilling to do - hell, even read through *these very forums* and see how many people are looking for which graphics card to buy "from a store", since they won't shop online!

To say the FX5200 is crap and bash it at every opportunity is simply unreasonable. It's a good card. Hell, if you qualify the statement, it's a GREAT card for *what it sets out to do*. Someone can walk into Best Buy or Circuit City or Office Max or Staples, or....see an FX5200 on the shelf for $80 with $30 mail-in-rebate, buy it, take it home, plug it into their computer, and get a kick ass Doom3 experience over their crappy Intel Integrated graphics (or nForce2 integrated graphics).
 