Haha what the hell Kyle

This article is pretty unintentionally hilarious.

[graph image from the original review]

GF2 absolutely destroying a Voodoo... wait.

Video card reviews sure have come a long way.
 
How to lie with statistics 101... a 0.7 FPS difference... lol.
 
This made me laugh.

I remember sometime a few years ago when I got my Core 2 Duo ([email protected]) and 8800 GTS, I fired up Quake 3, set it to 1680x1050, and maxed out all the options. For shits and giggles I wanted to run the timedemo to see what kind of results I'd get.

It was something along the lines of 800 FPS.

I wonder what I would get on my quad @ 3.2GHz and GTX 260 now. Well, guess I know what I'm installing tonight ;-)
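For anyone who wants to rerun that benchmark, the Quake 3 timedemo is started from the in-game console. A minimal sketch follows; note that the demo name is an assumption that varies by point release ("four" is the demo shipped with the 1.32 point release, while older versions shipped demo001/demo002 instead):

```
// In the Quake 3 console (toggle with ~):
timedemo 1   // report average FPS instead of playing the demo in real time
demo four    // demo name varies by version; "four" ships with point release 1.32
```

After the demo finishes, the console prints the frame count, elapsed time, and average FPS.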

The saddest part of all of this, I remember running Q3 in 320x240 and everything on low to get it working on my voodoo3. Then I remember how amazing it was when I got my GF3 Ti 500 and could do 1024x768 with everything maxed at like 100fps....
 
:p haha

Again we have our beauty of a graph showing a pretty big difference. One day I will learn to make proper graphs and end this nonsense. :) ONLY a 0.7 frame per second difference between the Voodoo5 and GeForce2. And while the Voodoo5 did NO BETTER than the nVidia neither came close to the 60FPS mark.
 
I still have my original leadtek geforce 2 gts........and geforce 3 also. these cards are sure small....
 
I can't even believe it's been ten years since the GeForce 2. Seems like only yesterday I was...oh, sorry kids. My mind tends to wander.
 
Like my conclusion statement, even I did not like my graphs. Think it was the first day I had ever used Excel. :eek:

 
Kyle was just ahead of his time... Doesn't nVidia use a similar scale today? :p
 
Pretty good stuff there. At first I was thinking one card killed the other, and it was a whole 2 FPS.
 

I just put a GeForce 2 MX400 in an old box of mine to use with Win98 for some old school games. I was surprised. I actually had 3 of them lying around. I don't even know where they came from.
 
I've got an MX400 in an Athlon XP system running XP Pro. It's the "guest" computer.
 
And I feel bad sticking my guests with a system that only has an E4400 with an 8800GT in it.
 
Eh, they've got no intention of gaming, it's usually for checking email, facebook, things of that nature. Most of them click on every link and hit yes to every question... A few of my friends who know their way around regularly use my personal machine.
 
Seems like it would be more cost-, energy-, and clutter-efficient just to have a virtual PC image on your system or laptop, set to not save state on exit, or whatever the option is.
 

Certainly less clutter; not so much energy, since I turn my computers off when not in use.
 
I think today's heatsinks cost more to make than that entire card would, lol... The simpler days, and arguably more exciting too, as PC gaming was at the forefront.
 

Just think of how small the GPU dies would be if they were made today on current processes. Also, just think of how much higher the clock speeds could go. Sure, the actual performance would suck since the architecture is practically worthless by now, but it would be interesting to see something like that.

 
I have a GeForce2 on my desk in front of me :O I have no idea what to do with it.
 
It took about 3 glances at the graph for me to realize what the joke was, lol /facepalm
 
Wonder if Nvidia/ATI followed that graph's lead for their marketing slides. They could owe some royalty money if that's the case. :p
 

My guests have to bring their own computer. No one touches mine unless I'm watching them very closely.
 