The GPU Nvidia would rather forget – GeForce FX

nVidia GeForce FX 5800 Ultra

“Bizarrely, GeForce FX 5800 Ultra cards now fetch decent prices among collectors, thanks to their rarity and the story that surrounded them at the time. If you were unlucky enough to buy one and still have it lurking in a drawer, it might be worth sticking it on eBay.

We hope you’ve enjoyed this personal retrospective about GeForce FX. For more articles about the PC’s vintage history, check out our retro tech page, as well as our guide on how to build a retro gaming PC, where we take you through the trials and tribulations of working with archaic PC gear.

We would like to say a big thank you to Dmitriy ‘H_Rush’ who very kindly shared these fantastic photos of a Gainward GeForce FX 5800 Ultra with us for this feature. You can visit vccollect to see more of his extensive graphics card collection.”

View: https://youtu.be/LVEOL4BYqcQ?si=uuJJWpS8nn55jkra
Source: https://www.pcgamesn.com/nvidia/geforce-fx
 
The FX series may have been a hairdryer, but it was competitive with the ATi 9800 Pro at the time. FX was better if you used AA.
https://hothardware.com/reviews/ati-256mb-radeon-9800-pro-vs-geforce-fx-5900-ultra?page=2
For the record, I owned the ATi card. I go with whatever offers the best price/performance.
"Terratec Nvidia Geforce FX 5800 Ultra 128MB GPU Retro" https://www.ebay.com/itm/2354958799...PSCYiP7c7zqYIX+DZ+8QdhLNc=|tkp:Bk9SR6aLs4vSYw

 
I almost won one in a UT 2004 tournament

Came in 2nd and got a T-shirt instead, which was still probably faster in DX9 games

Ha! I played the hell out of that game competitively. We may have even matched each other at some point. I can still hear the rusty creak from those giant glass windows on DM-Rankin in my dreams...

I got first at a college LAN event and won a 1GB thumb drive back when that was an exciting number. I was definitely happier with that than I would have been with an Nvidia card from that generation. The only good FX back then was Athlon 64!
 
I remember that the FX cards had issues with DX9 because they ran pixel shaders at either FP16 or FP32 precision, and realistically only FP16 was fast enough for games. ATI got away with it because they used FP24 and only FP24. Nvidia avoided the problem for a while by pushing game developers to use DX8.1 over true DX9. That worked until Half-Life 2 was released and actually used true DX9. FX cards were shit for a reason.
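For a rough sense of why that precision gap mattered, here is a small C++ sketch comparing the machine epsilon and approximate decimal digits of the three shader formats mentioned above. The mantissa widths (10 explicit bits for FP16, 16 for ATI's FP24, 23 for FP32) are the commonly cited figures for those formats, so treat the exact numbers as assumptions rather than spec quotes.

```cpp
// precision_demo.cpp -- rough comparison of shader float formats circa 2003.
// Mantissa widths are the commonly cited ones (assumptions, not spec quotes):
//   FP16 (NV3x partial precision): 10-bit mantissa
//   FP24 (ATI R300 full precision): 16-bit mantissa
//   FP32 (NV3x full precision):     23-bit mantissa
#include <cmath>
#include <cstdio>

int main() {
    struct Format { const char* name; int mantissa_bits; };
    const Format formats[] = { {"FP16", 10}, {"FP24", 16}, {"FP32", 23} };

    for (const Format& f : formats) {
        double epsilon = std::pow(2.0, -f.mantissa_bits);          // smallest relative step
        double decimal_digits = f.mantissa_bits * std::log10(2.0); // ~ digits of precision
        std::printf("%s: epsilon ~ %.2e, ~%.1f decimal digits\n",
                    f.name, epsilon, decimal_digits);
    }
    // FP16 works out to roughly 3 decimal digits, which is why long shader math
    // (texture coordinates, specular terms) showed visible banding, while FP24
    // was enough for most DX9-era effects.
    return 0;
}
```

Running it shows FP16 at roughly three significant decimal digits versus almost five for FP24 and about seven for FP32, which is the gap the whole FX precision debate revolved around.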


View: https://youtu.be/J3ijyDOTGOc?si=nDHzpsooeyddGi_k
 
The FX series may have been a hairdryer, but it was competitive with the ATi 9800 Pro at the time. FX was better if you used AA.
https://hothardware.com/reviews/ati-256mb-radeon-9800-pro-vs-geforce-fx-5900-ultra?page=2
For the record, I owned the ATi card. I go with whatever offers the best price/performance.
As I recall, they weren't really all that competitive until the end of their run with the GeForce FX 5950 Ultra. Prior to that, the 9800 Pro/XT etc. had a decisive lead over them in most cases. It wasn't as if you got bad performance with the high-end GeForce FX cards; the biggest issue was that you had to listen to that hair dryer of a fan. Despite the ATi Radeon 9800 Pro and XT being a bit faster than the FX series halo products, ATi's drivers were a shit show at the time. I had games that flat-out weren't playable for various reasons. Typically, objects wouldn't render at all or the game would crash completely.
 
As I recall, they weren't really all that competitive until the end of their run with the GeForce FX 5950 Ultra. Prior to that, the 9800 Pro/XT etc. had a decisive lead over them in most cases. It wasn't as if you got bad performance with the high-end GeForce FX cards; the biggest issue was that you had to listen to that hair dryer of a fan. Despite the ATi Radeon 9800 Pro and XT being a bit faster than the FX series halo products, ATi's drivers were a shit show at the time. I had games that flat-out weren't playable for various reasons. Typically, objects wouldn't render at all or the game would crash completely.
The 9800 Pro was the last ATi/AMD card I ever owned. If only they could recreate that. I picked it up the day my now-16-year-old daughter was born, as my 9700 Pro fried that day.
 
The 9800 Pro was the last ATi/AMD card I ever owned. If only they could recreate that. I picked it up the day my now-16-year-old daughter was born, as my 9700 Pro fried that day.
What people need to understand about the 9700 Pro and later 9800 Pro/XT models is that they were the result of ATi's acquisition of a company called ArtX, which developed the technology that made those GPUs what they were. ATi did not develop that on their own. ATi has never been able to fully recreate that success or dominate NVIDIA to that degree again. Sure, they've had a few cards that were competitive here and there, but overall it's been extremely rare since NVIDIA became dominant in the market. Interestingly, AMD, which acquired ATi some years ago, had a similar history until recently, as the majority of AMD's successes came more from purchasing other companies and repurposing their products than from outright innovation of their own.

The biggest problem for AMD's graphics division is that NVIDIA is shockingly good at what it does, not only from a technological standpoint but from a business perspective. It's extremely hard to compete with. Intel has wanted a piece of that pie for a couple of decades and is still nowhere near achieving parity with NVIDIA, or even AMD for that matter. The days of some small startup creating technology that gets it bought out overnight and propels technology forward, allowing the buyer to leapfrog a powerhouse like NVIDIA, are probably long gone.
 
As I recall, they weren't really all that competitive until the end of their run with the GeForce FX 5950 Ultra. Prior to that, the 9800 Pro/XT etc. had a decisive lead over them in most cases. It wasn't as if you got bad performance with the high-end GeForce FX cards; the biggest issue was that you had to listen to that hair dryer of a fan. Despite the ATi Radeon 9800 Pro and XT being a bit faster than the FX series halo products, ATi's drivers were a shit show at the time. I had games that flat-out weren't playable for various reasons. Typically, objects wouldn't render at all or the game would crash completely.
If memory serves me correctly, ATI drivers were shit, but that was because they were locking away features for newer cards. The Omega driver guys at the time were putting those features into drivers for older ATI cards, which pushed ATI to include them in their own drivers for older cards too. Nvidia, on the other hand, filed a lawsuit against the Omega developers, since the Omega guys did the same thing for Nvidia drivers.

Games that had Nvidia's logo, like Thief: Deadly Shadows, were only really good on FX cards because they ran on DX8.1 features even though they were advertised as DX9. That's why HL2 was so controversial: it actually used DX9 features. With DX8.1 or a mix of DX8.1 and DX9 features, the FX cards could default to FP16, but when actual DX9 features were used they had to fall back to FP32, which drastically slowed these cards down. The hair dryer noise was the least of the GeForce FX's problems.
 
What people need to understand about the 9700 Pro and later 9800 Pro/XT models is that they were the result of ATi's acquisition of a company called ArtX, which developed the technology that made those GPUs what they were. ATi did not develop that on their own. ATi has never been able to fully recreate that success or dominate NVIDIA to that degree again. Sure, they've had a few cards that were competitive here and there, but overall it's been extremely rare since NVIDIA became dominant in the market. Interestingly, AMD, which acquired ATi some years ago, had a similar history until recently, as the majority of AMD's successes came more from purchasing other companies and repurposing their products than from outright innovation of their own.

The biggest problem for AMD's graphics division is that NVIDIA is shockingly good at what it does, not only from a technological standpoint but from a business perspective. It's extremely hard to compete with. Intel has wanted a piece of that pie for a couple of decades and is still nowhere near achieving parity with NVIDIA, or even AMD for that matter. The days of some small startup creating technology that gets it bought out overnight and propels technology forward, allowing the buyer to leapfrog a powerhouse like NVIDIA, are probably long gone.
Thank you, I didn’t know this history
 
That 128-bit memory interface didn’t help either
 
If memory serves me correctly, ATI drivers were shit, but that was because they were locking away features for newer cards. The Omega driver guys at the time were putting those features into drivers for older ATI cards, which pushed ATI to include them in their own drivers for older cards too. Nvidia, on the other hand, filed a lawsuit against the Omega developers, since the Omega guys did the same thing for Nvidia drivers.

Games that had Nvidia's logo, like Thief: Deadly Shadows, were only really good on FX cards because they ran on DX8.1 features even though they were advertised as DX9. That's why HL2 was so controversial: it actually used DX9 features. With DX8.1 or a mix of DX8.1 and DX9 features, the FX cards could default to FP16, but when actual DX9 features were used they had to fall back to FP32, which drastically slowed these cards down. The hair dryer noise was the least of the GeForce FX's problems.
Yeah… Half-Life 2's rendering path was tied to the mat_dxlevel parameter, which dictated a number of settings tied to the feature set for each tier of support. At release I believe it was "70" for DirectX 7 hardware like the original Radeons, the GeForce 256 through GeForce2, and the GeForce4 MX; "80" for GeForce3 and 4 Ti hardware as well as GeForce FX 5200-5700 cards; "81" for the Radeon 8500-9250 and GeForce FX 5700+ kit; and "90" for vanilla Direct3D 9 hardware, which was gatekept to Radeon 9500 and up because performance on GeForce FX kit was dire. Valve experimented with creating a mixed-precision path to help FX hardware but found it was a huge hassle, and the Radeon hardware still kicked Nvidia's parts down the stairs, so they junked it.

Why do I still know all of this 20 years later? I earned a master’s degree in a hard science and that STILL didn’t push this crap out.
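As a minimal sketch of the tiering idea described above (this is not Valve's actual code; the tier numbers just follow the mat_dxlevel convention, and in a real Direct3D 9 app the shader version would come from IDirect3D9::GetDeviceCaps via D3DCAPS9::PixelShaderVersion rather than plain ints):

```cpp
// dxlevel_sketch.cpp -- hypothetical illustration of picking a render tier
// from the pixel shader version a card reports. Not Valve's implementation.
#include <cstdio>

// In a real D3D9 app these would come from D3DCAPS9::PixelShaderVersion
// (queried with IDirect3D9::GetDeviceCaps); here they are plain ints.
struct ShaderCaps { int ps_major; int ps_minor; };

int PickDxLevel(const ShaderCaps& caps, bool slow_sm2) {
    if (caps.ps_major >= 2)
        return slow_sm2 ? 81 : 90;  // FX cards reported SM2.0 but ran it too slowly
    if (caps.ps_major == 1 && caps.ps_minor >= 4)
        return 81;                  // Radeon 8500-9250 class (PS 1.4)
    if (caps.ps_major == 1)
        return 80;                  // GeForce3/4 Ti class (PS 1.1-1.3)
    return 70;                      // fixed-function DX7 hardware
}

int main() {
    std::printf("Radeon 9700 (PS 2.0):      dxlevel %d\n", PickDxLevel({2, 0}, false));
    std::printf("GeForce FX 5900 (PS 2.0a): dxlevel %d\n", PickDxLevel({2, 0}, true));
    std::printf("Radeon 8500 (PS 1.4):      dxlevel %d\n", PickDxLevel({1, 4}, false));
    std::printf("GeForce4 Ti (PS 1.3):      dxlevel %d\n", PickDxLevel({1, 3}, false));
    return 0;
}
```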
 
That's how I remember it as well: more a story of Nvidia going for 32-bit precision a generation too early and ATI being all-in on the sweet 24-bit spot.
I think the rumor was that Nvidia had proposed their own version of DX9, and ATI did as well. Nvidia was confident that Microsoft would go with theirs, but because of the problems with the Xbox and Nvidia never giving Microsoft a break, Microsoft went with ATI's version of DX9. Since ATI was always using FP24, that pushed the Nvidia FX cards to run at FP32 to match the precision. Of course, this was just a rumor I remember hearing.
Yeah… Half-Life 2's rendering path was tied to the mat_dxlevel parameter, which dictated a number of settings tied to the feature set for each tier of support. At release I believe it was "70" for DirectX 7 hardware like the original Radeons, the GeForce 256 through GeForce2, and the GeForce4 MX; "80" for GeForce3 and 4 Ti hardware as well as GeForce FX 5200-5700 cards; "81" for the Radeon 8500-9250 and GeForce FX 5700+ kit; and "90" for vanilla Direct3D 9 hardware, which was gatekept to Radeon 9500 and up because performance on GeForce FX kit was dire. Valve experimented with creating a mixed-precision path to help FX hardware but found it was a huge hassle, and the Radeon hardware still kicked Nvidia's parts down the stairs, so they junked it.

Why do I still know all of this 20 years later? I earned a master’s degree in a hard science and that STILL didn’t push this crap out.
Half-Life 2 was very accommodating to almost any hardware. I remember one guy ran it on a DX5/6-era GPU, and it looked like ass because there was no bump mapping. For GeForce FX owners it just meant worse water and glass; unless you looked at it side by side, you couldn't tell.

Half-Life 2 also landed in the middle of another controversy as DX9 evolved into DX9.0a, 9.0b, and 9.0c. The FX cards and ATI Radeon 9500+ cards were all 9.0a parts, while Nvidia had moved on to 9.0c and so had many games. Which sucked ass, because some games like the original BioShock required DX9.0c. I did buy an ATi Radeon X850 XT because it was cheap at MicroCenter, but that couldn't play the game either because it was DX9.0b, which was just ATI's attempt at half-assing its way toward DX9.0c. Someone made a patch to let you run the game on DX9.0b hardware, but it looked like ass. I ended up buying a GeForce 6800 and unlocking all the pipes. Valve at the time was trying to prove you didn't need DX9.0c to get features like HDR, which is why they made that demo. Unfortunately the industry moved on to DX9.0c, mostly because Nvidia pushed it there.

View: https://youtu.be/__mU8d3TwdQ?si=lrCknOFQ62c0uSWC
 
Half-Life 2 was very accommodating to almost any hardware. I remember one guy ran it on a DX5/6-era GPU, and it looked like ass because there was no bump mapping. For GeForce FX owners it just meant worse water and glass; unless you looked at it side by side, you couldn't tell.

Half-Life 2 also landed in the middle of another controversy as DX9 evolved into DX9.0a, 9.0b, and 9.0c. The FX cards and ATI Radeon 9500+ cards were all 9.0a parts, while Nvidia had moved on to 9.0c and so had many games. Which sucked ass, because some games like the original BioShock required DX9.0c. I did buy an ATi Radeon X850 XT because it was cheap at MicroCenter, but that couldn't play the game either because it was DX9.0b, which was just ATI's attempt at half-assing its way toward DX9.0c. Someone made a patch to let you run the game on DX9.0b hardware, but it looked like ass. I ended up buying a GeForce 6800 and unlocking all the pipes. Valve at the time was trying to prove you didn't need DX9.0c to get features like HDR, which is why they made that demo. Unfortunately the industry moved on to DX9.0c, mostly because Nvidia pushed it there.

View: https://youtu.be/__mU8d3TwdQ?si=lrCknOFQ62c0uSWC

I'm not sure the DX7 path used bump mapping either, but the DX6 path definitely didn't. DX6 also forced downsampled textures, and I'm pretty sure dynamic lights were borked. There are videos on YouTube of various older cards struggling to manage the game; the Kyro II and Voodoo5 6000 struggle for different reasons, the Kyro simply not having that much bandwidth left after accounting for its efficient rendering and its inability to compress alpha textures, and the Voodoo struggling at various points that probably run afoul of its quad-SLI architecture and memory limitations.
 
I remember them getting into some hot water for bypassing intended rendering methods in applications like 3DMark. IIRC, this is what started the whole "Approved Driver" program or whatever it was called.
The benchmark called for rendering an object a certain way, and the drivers were coded with application-specific paths to do it a different way. That's fine for games, but in an apples-to-apples comparison, when one vendor is rendering the entire apple and the other vendor is only rendering the part of the apple that's visible to the camera, well, that creates a problem.
I remember quite a bit of hoopla surrounding that driver-gate.
 
The good old days, when 3DMark was a big deal. (I gather automated in-game benchmarks were rare, or a market for running a bunch of them and keeping them updated had yet to exist?)
 
Man, I remember being a broke kid when these were around, and I could only afford a shitty PCI FX 5500 256MB. It wasn't great, but the games I ran were playable, and it pulled long hours in the cheap Compaq that housed it until it eventually died an early heat death ~3 years later.
 
I had a 5900 XT, which was like a 5700 Ultra with a wider memory bus. Even though it was only GDDR memory instead of GDDR2, the wider bus made it faster at higher resolutions than the 5700 Ultra. In DX8/8.1 games it could keep up with the 5900 Ultra.
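Back-of-the-envelope math shows why the wider bus won out at higher resolutions. The sketch below just multiplies bus width by effective data rate; the clocks (roughly 700 MT/s for the 5900 XT, roughly 900 MT/s for the 5700 Ultra) are the commonly listed retail specs and should be treated as approximate assumptions.

```cpp
// membw_sketch.cpp -- rough memory bandwidth comparison, FX 5900 XT vs FX 5700 Ultra.
// Clocks are the commonly listed effective (DDR) data rates; treat as approximate.
#include <cstdio>

double BandwidthGBs(int bus_bits, double effective_mhz) {
    // bytes per transfer * transfers per second, expressed in GB/s
    return (bus_bits / 8.0) * effective_mhz * 1e6 / 1e9;
}

int main() {
    std::printf("FX 5900 XT    (256-bit @ ~700 MT/s): %.1f GB/s\n", BandwidthGBs(256, 700));
    std::printf("FX 5700 Ultra (128-bit @ ~900 MT/s): %.1f GB/s\n", BandwidthGBs(128, 900));
    return 0;
}
```

Under those assumptions the 5900 XT ends up with roughly 22 GB/s against roughly 14 GB/s for the 5700 Ultra, despite the slower memory chips.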
 
I had a 5900 XT, which was like a 5700 Ultra with a wider memory bus. Even though it was only GDDR memory instead of GDDR2, the wider bus made it faster at higher resolutions than the 5700 Ultra. In DX8/8.1 games it could keep up with the 5900 Ultra.
Quoting, because I had an MSI-manufactured one that I flat-out adored. Within its limits the card was rock solid, and I loved running all my old games with angle-independent anisotropic filtering and antialiasing forced on. I think I sold it around the time PCI Express became standard, but I played through so many games with it during my last couple of years in college. So many hours of UT2004, Doom 3, Aliens Versus Predator 2...

Years later I snagged a Quadro FX 1300, a card that essentially mounted an FX 5950 onto a PCB featuring the same memory configuration as a 5800 Ultra, with a bridge chip for PCIe support. If you wanted to build a Core 2 era machine with support for early features like palettized textures and table fog, it would be genuinely hard to beat.
 
Sure, they've had a few cards that were competitive here and there, but overall it's been extremely rare since NVIDIA became dominant in the market.
I would like to give an honorable mention to the HD 4850/4870. The GTX 260/280 were another NVIDIA blunder from a price-to-performance standpoint, and ATI made them pay. When was the last time NVIDIA had to cut prices 23% just a couple of weeks after launching a product? Those were the days! It's amazing how quickly NVIDIA recovered from all these cockups to become what they are today. Wish AMD would pull this off.
 
I’m pulling for Intel at this point - AMD’s GPU development may have been intractably stunted by their starvation days, and their software team just hasn’t been able to scale to broad competitiveness. I’m not downplaying how good their hardware team has been or the improvements they’ve worked hard to make either, but Intel has the resources and motivation to become #2 in the space, and OneAPI looks like a much stronger overall effort than ROCm has been even at this early phase.
 
I would like to give an honorable mention to the HD 4850/4870. The GTX 260/280 were another NVIDIA blunder from a price-to-performance standpoint, and ATI made them pay. When was the last time NVIDIA had to cut prices 23% just a couple of weeks after launching a product? Those were the days! It's amazing how quickly NVIDIA recovered from all these cockups to become what they are today. Wish AMD would pull this off.

I'm pretty sure I remember finding some planetary alignment of multiple rebates on a 260 right after it got discounted and I got it for like $200 or something silly.

I might have even found the deal on these forums

Served me well
 
I would like to give an honorable mention to the HD 4850/4870. The GTX 260/280 were another NVIDIA blunder from a price-to-performance standpoint, and ATI made them pay. When was the last time NVIDIA had to cut prices 23% just a couple of weeks after launching a product? Those were the days! It's amazing how quickly NVIDIA recovered from all these cockups to become what they are today. Wish AMD would pull this off.
AMD had acquired ATI by that point. They were just still using the ATI branding. I think the HD 3000 series was launching around the time the acquisition completed if I recall.
 
I would like to give an honorable mention to the HD 4850/4870. The GTX 260/280 were another NVIDIA blunder from a price-to-performance standpoint, and ATI made them pay. When was the last time NVIDIA had to cut prices 23% just a couple of weeks after launching a product? Those were the days! It's amazing how quickly NVIDIA recovered from all these cockups to become what they are today. Wish AMD would pull this off.
There was an odd phenomenon with wafers around 2008-2009 where the cost of the 65nm process used in the first second-generation Tesla products increased over time instead of decreasing, while the 55nm process used by ATi was a lot cheaper. You could see this reflected in the price of the GTX 285, which came out at $360 compared to the $650 of the GTX 280 a mere six months after the latter launched; the GTX 285 was on 55nm. It's also important to recall that the half pitch on 55nm was the same as on 65nm, so there was no improvement in transistor density from one process to the other, meaning they were not getting more chips off a 55nm wafer than a 65nm one.

As I keep regularly banging this drum, the cost of wafers is directly reflected in the price of consumer products made from them.
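To make that drum-banging concrete, here is a small sketch using the classic dies-per-wafer estimate. The die areas are the commonly cited figures for GT200 (~576 mm²) and GT200b (~470 mm²); the wafer prices are invented placeholders purely to show how wafer cost flows into per-chip cost, not real foundry numbers.

```cpp
// die_cost_sketch.cpp -- classic dies-per-wafer estimate, used here only to
// illustrate how wafer price flows into per-die cost. Wafer prices are made up.
#include <cmath>
#include <cstdio>

// Gross dies per wafer for a square-ish die of `die_area_mm2` on a wafer of
// `wafer_diameter_mm`, ignoring yield loss and scribe lines.
int DiesPerWafer(double wafer_diameter_mm, double die_area_mm2) {
    const double kPi = 3.14159265358979;
    double r = wafer_diameter_mm / 2.0;
    double dies = (kPi * r * r) / die_area_mm2
                - (kPi * wafer_diameter_mm) / std::sqrt(2.0 * die_area_mm2);
    return static_cast<int>(dies);
}

int main() {
    const double wafer_mm = 300.0;
    // Die areas are commonly cited figures; wafer prices are invented placeholders.
    struct Chip { const char* name; double area_mm2; double wafer_price_usd; };
    const Chip chips[] = {
        {"GT200  (65nm, GTX 280)", 576.0, 5000.0},
        {"GT200b (55nm, GTX 285)", 470.0, 4000.0},
    };

    for (const Chip& c : chips) {
        int dies = DiesPerWafer(wafer_mm, c.area_mm2);
        std::printf("%s: ~%d gross dies/wafer, ~$%.0f per die before yield\n",
                    c.name, dies, c.wafer_price_usd / dies);
    }
    return 0;
}
```

Even with placeholder prices, the point stands: a smaller die on a cheaper wafer gets you meaningfully more chips per dollar, and that shows up at retail.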
 
What people need to understand about the 9700 Pro and later 9800 Pro/XT models is that they were the result of ATi's acquisition of a company called ArtX, which developed the technology that made those GPUs what they were. ATi did not develop that on their own. ATi has never been able to fully recreate that success or dominate NVIDIA to that degree again. Sure, they've had a few cards that were competitive here and there, but overall it's been extremely rare since NVIDIA became dominant in the market. Interestingly, AMD, which acquired ATi some years ago, had a similar history until recently, as the majority of AMD's successes came more from purchasing other companies and repurposing their products than from outright innovation of their own.

The biggest problem for AMD's graphics division is that NVIDIA is shockingly good at what it does, not only from a technological standpoint but from a business perspective. It's extremely hard to compete with. Intel has wanted a piece of that pie for a couple of decades and is still nowhere near achieving parity with NVIDIA, or even AMD for that matter. The days of some small startup creating technology that gets it bought out overnight and propels technology forward, allowing the buyer to leapfrog a powerhouse like NVIDIA, are probably long gone.
One small exception… the X1900 XTX/X1950 XTX cards did achieve that level of success. They didn't top the charts for as long, but they did top the charts.
 
One small exception… the X1900 XTX/X1950 XTX cards did achieve that level of success. They didn't top the charts for as long, but they did top the charts.
Most benchmarks from most sites at the time indicated that the X1950 XTX was slower than the GeForce cards of the day. However, the [H]ard|OCP article on the X1950 XTX indicated that in real-world gaming, the X1950 XTX provided the better experience.

I remember this well, as I had a pair of X1950 XTXs in CrossFire. Truly one of the best cards ATi ever put out.
 