NVIDIA has lost a fanboy (rant)

That is the same exact argument I sold myself on with my X800XT regarding SM2.0. In the end, it bit me in the ass.
How can you miss what you quoted: "That shouldn't be a problem for someone who upgrades fairly often."

I mentioned that because the cards he listed in the first post show that he upgrades often. :p If you don't upgrade often, then what I wrote won't apply to you.
 
Edit: To clarify, I don't give a fuck about DX10.1, but NVIDIA just shows how lazy and unwilling to innovate they are by not including it in G92.

In the last three years, I have owned a 6800GT, two 7800GTXs, two 7800GTs, a 7600GS, a 7600GT, two nForce 4 motherboards, a 680i motherboard, and an 8800GTS. I bought so many of NVIDIA's damn products that they sent me a customer appreciation pack with stickers and junk like that in the mail (I shit you not). As someone who was really enthusiastic about their products, I've recently become pretty disappointed with NVIDIA as a company.

Point #1: 680i launch. Enough said.

Point #2: NVIDIA still doesn't support DirectX 10.1 in their new products. For shame! We've known about this specification change for many months now, and NVIDIA should have had time to add it to their new 65nm chips. Instead they went the lazy, bare-minimum route (as usual) and released a card that slightly edges out ATI's card in performance but lags in features. I'm deeply disappointed that they didn't come out with something that actually replaces the 8800 series like they should have. Why bother innovating when they don't have what they consider to be serious competition?

I'd rather buy two slightly slower cards that together give me more performance and more features for less than NVIDIA's ancient "flagship" card. The flagship card that they won't bother replacing with something significantly faster like they should have this month, since they can just milk money out of their current outdated technology.
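(Rough math on that, with every number an illustrative assumption rather than a benchmark: if one mid-range card manages about 80% of the flagship's frame rate and dual-card scaling averages around 70%, the pair still comes out ahead for less money.)

[code]
#include <stdio.h>

int main(void)
{
    /* All numbers are illustrative assumptions, not benchmarks. */
    double midrange_perf = 0.80;  /* one mid-range card vs. flagship = 1.0 */
    double scaling       = 0.70;  /* assumed dual-card scaling efficiency  */
    double midrange_cost = 220.0; /* assumed street prices in dollars      */
    double flagship_cost = 550.0;

    double pair_perf = midrange_perf * (1.0 + scaling); /* 0.8 * 1.7 = 1.36 */
    double pair_cost = 2.0 * midrange_cost;             /* 440              */

    printf("Pair: %.2fx flagship performance for $%.0f (flagship: $%.0f)\n",
           pair_perf, pair_cost, flagship_cost);
    /* Caveat: scaling varies per game, and some games see close to none. */
    return 0;
}
[/code]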

That's how NVIDIA does business. The bare minimum. I'm aware that it makes sense from a financial standpoint. But for a former enthusiast like me, that's a real kick in the balls.

Riiight, so you can point to a list of DX10.1-only games that are on sale then, can you?

DX10 is overhyped and NV are just playing the wait-and-see game. Almost a year after launch there are no DX10 games, and those that patch in "DX10 effects" don't do much over DX9.

NV aren't silly. If they see consumers starting to cotton on to being conned over DX10's potential, and thus not going for DX10 hardware, why should they waste dev and production time and money?

Simple economics.
 
I don't get why OpenGL doesn't come out with a new standard and specifically make it work on XP just to spite those microbastards. :D Then just maybe AMD will jump on the ball and beat NVIDIA to it.
 
I don't get why OpenGL doesn't come out with a new standard and specifically make it work on XP just to spite those microbastards. :D Then just maybe AMD will jump on the ball and beat NVIDIA to it.

Usually NVIDIA has (and always has had) better OpenGL drivers/support than AMD (ATi).
 
I don't get why OpenGL doesn't come out with a new standard and specifically make it work on XP just to spite those microbastards. :D Then just maybe AMD will jump on the ball and beat NVIDIA to it.

You mean OpenGL 3.0, which got released a while back and which NVIDIA's 9xxx will be the first hardware to support?
http://www.opengl.org/

OGL3 does what DX10's 3D side is "supposed" to be able to do and more, and will be available for Windows XP. id's new game is hinted at using OGL3 (it is written using OGL 2.1 atm).
 
You mean OpenGL 3.0, which got released a while back and which NVIDIA's 9xxx will be the first hardware to support?
http://www.opengl.org/

OGL3 does what DX10's 3D side is "supposed" to be able to do and more, and will be available for Windows XP. id's new game is hinted at using OGL3 (it is written using OGL 2.1 atm).

RTCW2? YES IT BETTER BE lol
 
You mean OpenGL 3.0, which got released a while back and which NVIDIA's 9xxx will be the first hardware to support?
http://www.opengl.org/

OGL3 does what DX10's 3D side is "supposed" to be able to do and more, and will be available for Windows XP. id's new game is hinted at using OGL3 (it is written using OGL 2.1 atm).

OpenGL 3.0 is not out yet.
 
DX10.1 isn't even out yet. It won't be out until Vista SP1 ships, so I agree: at this point DX10.1 support is merely a bullet point and wholly irrelevant.

Exactly. It was like when NVIDIA was first with SM2, was it? How long did it take before any game used it? A good year plus.

Not buying a card because it doesn't have DX10.1 is a lame excuse.
 
Usually NVIDIA has (and always has had) better OpenGL drivers/support than AMD (ATi).

ATI's OpenGL support increased greatly, I recall, back with the X8** series release; they did a large revamp of their GL support, putting it almost on par with NVIDIA's.
 
Exactly. It was like when NVIDIA was first with SM2, was it? How long did it take before any game used it? A good year plus.

Not buying a card because it doesn't have DX10.1 is a lame excuse.

It was SM3.0 ;)

And I got a benefit in Far Cry... HDR ;)

But you are right, it wasn't widespread.

ATI's OpenGL support increased greatly, I recall, back with the X8** series release; they did a large revamp of their GL support, putting it almost on par with NVIDIA's.
Almost being the keyword :)
 
The world is over; I hate nVidia for not releasing a useless implementation at breakneck speed! Find us a game that uses 10.1. Until then, moot point.
 
And the flip side to that coin is that his point was that you're limiting your choices due to your company loyalty, which is stubborn and narrow-minded. I find people who brag about being a fanboy pretty funny... you willingly paid more money for a product that clearly did not perform as well. Fanboyism... yes. Stupid... absolutely. As a consumer, why would you do that when you have other options? Do you not appreciate being a consumer, or did you not have a choice in the matter?

Wow, where to begin. I like nVidia cards because I like their drivers, I like their board partners, I like that they work well in Linux and have for a long time, and I like that they are typically the faster card.

When I bought the TNT1 it was the best card on the market, same for the GeForce 1, the GeForce 3 Ti 500, the 6800 GT, the 7950 GT (which I bought because it was the fastest fanless card available), and most recently the 8800 GT.

Granted, the 5900 Ultra turned out to be a poor decision, but I bought it based on the best information available at the time, which was that it was appreciably faster in Doom 3, and that's what I bought it for.

For you to just jump in and accuse me of being an idiot fanboy is more than a little presumptuous on your part.
 
It was SM3.0 ;)

And I got a benefit in Far Cry... HDR ;)

But you are right, it wasn't widespread.

Almost being the keyword :)

SM3.0 had nothing to do with HDR. Support for both happened to appear simultaneously on the 6800 series, but you can support 16-bit FP HDR without supporting SM3.0.
 
ATI's OpenGL support increased greatly, I recall, back with the X8** series release; they did a large revamp of their GL support, putting it almost on par with NVIDIA's.

ATi publicly stated that they had no intention of rewriting their OpenGL drivers back when they were looking like chumps in Doom 3... they've switched to a new architecture since then and have done a good job optimizing for the ever-decreasing number of OpenGL games... but if you want evidence of nVidia's OpenGL superiority, you need look no further than their total domination of the professional graphics market.
 
LOL...

Riva 128
Hercules TNT, $239 (OC'd out of the box at 98/125, way before BFG started selling OC'd cards)
TNT2 Ultra, 175/183, $299 (Diamond), and yes, I still have the box :)
GeForce 256 32MB SDR, Hercules (Guillemot), $429 CAD (still have the box!!)
Months later, GeForce 256 DDR 64MB, $449 USD (Dell)
GeForce 2 GTS 64MB, $459 (Creative)
A month later, GeForce 2 Ultra = 917 Canadian dollars!! (Hercules, still have the box!!)
GeForce 4 Ti 4400 from PNY, $499 each @ EB (bought two of them, one for me, one for the wife)
GeForce 5800 Ultra @ $779 CAD :eek:
GeForce 5900XT, MSI, $359 (went through four of them!! still have the box)
GeForce 7800GT, $503 times two, one for me, one for the wife, plus taxes (Asus EN7800GT, still have the boxes)
GeForce 8800 GTS 640, $519, eVGA (Dec 12/06, box is sitting behind me on a shelf)
GeForce 8800 GT 512, $329, Gigabyte (Nov 20th 07)


Not to mention we own shares in the company, going back 5 years now.

haha, nice! you're one of the only people I've ever heard publicly admit to buying a 5800 Ultra...:)
 
There's what, like 5 games that use DX10 in a somewhat noteworthy sense? And of those 5, how many really make a big enough graphical impact to warrant buying new cards, new operating systems, and new PCs all around?

Then you have the 10.1 conundrum. In a time when we can hardly see any sort of benefit from going to DX10, why be in such a rush to get to DX10.1, a whole whopping minor update set? The problem here is early adopters, fanboys, enthusiasts, morons, etc. People hear 'new' and immediately assume 'better.' Let me tell you, if I ever started up as a hardware developer, software developer, game developer, etc., half of my budget would go into marketing, especially viral marketing to increase hype. Just remember that all the higher-ups and executives want to do is make an extra buck, even if it's at the cost of your own sanity.

Personally, when game developers get off their asses and utilize hardware better in games, then we will see the benefits of DX10. But as it stands right now, DX10 is a bunch of marketing hokum. Why buy a new PC and a new operating system for 'the great DX10' when you can see and realize the same quality out of OpenGL?

Don't get me wrong, some day DX10 will make a difference in gaming. But as it stands right now, it was a piss-poor decision for MS to make DX10 Vista-only (seems like MS knew Vista wouldn't sell and wanted some DX10 help with that).

/My rant/
 
SM3.0 had nothing to do with HDR. Support for both happened to appear simultaneously on the 6800 series, but you can support 16-bit FP HDR without supporting HDR.

You mean less accurate ways, right?

And the last part doesn't make sense to me... supporting HDR... but not supporting HDR... at the same time?
 
OGL3 does what DX10's 3D side is "supposed" to be able to do and more, and will be available for Windows XP. id's new game is hinted at using OGL3 (it is written using OGL 2.1 atm).
I kind of doubt that Rage is going to have an ARB3 path. It might happen, but it'd be very unusual for John to actually take that route. Now, for later Tech 5 licensees, that may be a somewhat different story.

OpenGL 2.0 still has advantages over D3D10 in some respects, too.

And the last part doesn't make sense to me... supporting HDR... but not supporting HDR... at the same time?
You don't need SM3.0 support to do HDR, at least not always. SM3.0 support is not inherently linked to "HDR support", and not all SM3.0 cards support all HDR methods.
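To make that concrete, here's a minimal sketch (assuming GLEW and the ARB/EXT extensions of that era; the make_fp16_target helper is hypothetical, illustrative only, and not tested against any particular card) of creating the FP16 render target that HDR rendering draws into. Whether it works comes down to format and extension support, not to which shader model the card reports:

[code]
/* Minimal sketch: a 16-bit floating-point (half-float) render target in
   OpenGL. Assumes GLEW and an existing GL context. */
#include <GL/glew.h>
#include <stdio.h>

GLuint make_fp16_target(int w, int h)
{
    if (!GLEW_EXT_framebuffer_object || !GLEW_ARB_texture_float) {
        fprintf(stderr, "FP16 render targets not supported\n");
        return 0;
    }

    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* GL_RGBA16F_ARB: 16 bits of floating point per channel -- the kind
       of buffer an HDR renderer draws into before tone mapping. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, w, h, 0,
                 GL_RGBA, GL_HALF_FLOAT_ARB, NULL);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);

    /* Completeness depends on the card/driver supporting the format;
       FP16 *blending* is a further, separate capability. */
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
        GL_FRAMEBUFFER_COMPLETE_EXT) {
        fprintf(stderr, "FP16 framebuffer incomplete on this card\n");
        return 0;
    }
    return fbo;
}
[/code]

Nothing in there asks about shader models; that's the whole point.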
 
You don't need SM3.0 support to do HDR, at least not always. SM3.0 support is not inherently linked to "HDR support", and not all SM3.0 cards support all HDR methods.

You did notice that I only mentioned Far Cry, eh? ;)
 
...but if you want evidence of nVidia's OpenGL superiority, you need look no further than their total domination of the professional graphics market.

An excellent point. I once did a test for a school that teaches 3D animation, and I found the FireGLs to be terrible at OpenGL and their drivers problematic. There is a certain amount of irony in that situation that didn't go unnoticed by me.
 
Linux and 3D rendering (3ds Max/Maya/LW, etc.) =p?

Copied from Wikipedia.

Some notable games that include an OpenGL renderer:

* America's Army
* Baldur's Gate 2 – Defaults to D3D
* Call of Duty
* City of Heroes
* City of Villains
* Counter-Strike
* Darwinia
* Doom 3
* Dwarf Fortress
* Enemy Territory: Quake Wars
* Far Cry – Defaults to D3D
* Frets On Fire
* Half-Life (not Half-Life 2)
* Homeworld 2
* Neverwinter Nights
* Prey
* Quake series
* Rage
* Serious Sam
* Serious Sam 2 – Defaults to D3D
* Starsiege: Tribes
* Ultima IX: Ascension
* Unreal series
* Warcraft 3 – Defaults to D3D in Windows
* Wolfenstein: Enemy Territory
* World of Warcraft – Defaults to D3D in Windows
 
I don't understand what you're bitching about. Why don't YOU try making a 1TF card? I'm sure it's not so easy.
 
You did notice that I only mentioned Far Cry, eh? ;)
Yeah, you tied SM3.0 and HDR together using Far Cry as an example. Technically, SM3.0 didn't give you HDR in Far Cry -- the patch did, and it could have done that without SM3.0.
 
I don't understand what you're bitching about. Why don't YOU try making a 1TF card? I'm sure it's not so easy.

I'd rather have a really efficient 0.5 TF card than an inefficient 1 TF card.

PR numbers are fluff; real-world performance is king.
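Case in point, back-of-the-envelope only: how those peak numbers get counted in the first place. Take the 8800 GTX's oft-quoted figure (128 SPs at a 1.35 GHz shader clock). Whether you count the co-issued MUL that real shaders rarely get to use changes the "peak" by 50%:

[code]
#include <stdio.h>

int main(void)
{
    double sps      = 128;    /* stream processors, 8800 GTX */
    double clock_hz = 1.35e9; /* shader clock                */

    /* Marketing counting: MAD + co-issued MUL = 3 flops/SP/clock. */
    double peak_pr  = sps * clock_hz * 3.0 / 1e9;
    /* MAD-only counting: 2 flops/SP/clock. */
    double peak_mad = sps * clock_hz * 2.0 / 1e9;

    printf("PR peak:  %.0f GFLOPS\n", peak_pr);   /* ~518 */
    printf("MAD peak: %.0f GFLOPS\n", peak_mad);  /* ~346 */
    /* Achieved throughput in real shaders is lower still, once
       scheduling, texture stalls, etc. are accounted for. */
    return 0;
}
[/code]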
 
Could somebody explain what's wrong with the 680i? I don't have that motherboard, only nForce 4, which I can't say I'm disappointed with. A quick Google led me to the AnandTech review of it, and it came out on top of all other boards in almost every category performance-wise.
 
Interesting about the 680i boards...

I've pretty much always used VIA and NVIDIA chipset boards, but that's because I almost always have AMD stuff.

I usually never buy the "latest and greatest," just because the mid-range stuff I can get real cheap can usually be made to run close to, or faster than, stock high-end stuff.

Right now my 7900GS has about a 40% overclock on the GPU and RAM and runs stuff just fine. Seeing the price on the ATI 3850s made me smile, though, and I will prolly go with one if my 7900GS dies or gets too slow.

Never really been an Intel fan, just because of the much higher failure rate I have seen from them compared to AMD over the last 12 or so years.

But I am getting a G0 Q6600, motherboard, RAM, and a 7600GT card for free, so I guess I will be using Intel stuff for a while.

My current CPU, board, RAM, and the 7600GT will most likely be going into my sister-in-law's computer, which is currently running an Athlon XP 2500+. It will be a super nice upgrade for her and should last her quite a while.

As for Nvidia needing to release a DX10.1 card really soon... yeah right! How many games actually have DX10 support at all? And of those that do... a lot of them are barely different than when running them in DX9.

Once Vista goes more mainstream, I can see a lot more games supporting DX10 out of the box... but for now, it is not financially responsible for Nvidia to come out with all-new tech when they really don't need to.
 
Interesting about the 680i boards...

I've pretty much always used VIA and NVIDIA chipset boards, but that's because I almost always have AMD stuff.

Those were about the only choices you had for motherboard chipsets, so that makes sense.

I usually never buy the "latest and greatest," just because the mid-range stuff I can get real cheap can usually be made to run close to, or faster than, stock high-end stuff.

Sorry, I disagree. With processors this is true; with video cards it certainly isn't. You can't make a 7900GS run as fast as an 8800-series card. It just can't happen.

Right now my 7900GS has about a 40% overclock on the GPU and RAM and runs stuff just fine. Seeing the price on the ATI 3850s made me smile, though, and I will prolly go with one if my 7900GS dies or gets too slow.

Fair enough, it would certainly be a worthwhile upgrade. Though the 8800GT merits consideration as well.

Never really been an Intel fan, just because of the much higher failure rate I have seen from them compared to AMD over the last 12 or so years.

Now here is where I call BS. I've been a service technician at high-end service centers, as well as an IT professional and even a system builder, during that time. I've never seen a DOA Intel processor. Out of the thousands of them I've worked with, I've never had one failure. I've seen a handful of them die over the years, usually when the motherboard died and took the CPU with it, rarely on their own.

I've seen three DOA AMD processors in that time as well. I can't believe that anyone could possibly think Intel processors would have a higher defect rate than AMD's.

But I am getting a G0 Q6600, motherboard, RAM, and a 7600GT card for free, so I guess I will be using Intel stuff for a while.

It's a good setup, and that processor, stock for stock, outruns AMD's best. Overclocked, there is no comparison. The 7600GT is excellent for free, so I won't say anything about that. I don't know what board you got, but hopefully it's something decent and reliable.

My current CPU, board, RAM, and the 7600GT will most likely be going into my sister-in-law's computer, which is currently running an Athlon XP 2500+. It will be a super nice upgrade for her and should last her quite a while.

Most definitely.

As for Nvidia needing to release a DX10.1 card really soon... yeah right! How many games actually have DX10 support at all? And of those that do... a lot of them are barely different than when running them in DX9.

Precisely.

Once Vista goes more mainstream, I can see a lot more games supporting DX10 out of the box... but for now, it is not financially responsible for Nvidia to come out with all-new tech when they really don't need to.

I wouldn't call it financially irresponsible, but it's not necessarily a good move either. I think NVIDIA is probably working to make sure that G80's successor is at least what the 8800GTX was to the 7950GX2 or 7900GTX.
 
Could somebody explain what's wrong with the 680i? I don't have that motherboard, only nForce 4, which I can't say I'm disappointed with. A quick Google led me to the AnandTech review of it, and it came out on top of all other boards in almost every category performance-wise.


This is the post you should be reading for what's wrong with the 680i, IMO. Nothing compares to shooting yourself in the foot when it comes to achieving that ever-elusive upgradeability.

http://hardforum.com/showthread.php?t=1244652
 
How can you miss what you quoted: "That shouldn't be a problem for someone who upgrades fairly often."

I mentioned that because the cards he listed in the first post show that he upgrades often. :p If you don't upgrade often, then what I wrote won't apply to you.


Good point, but one man's often is another man's not so often. :D

I think the great performance of the X800XT, or at least of mine, an All-In-Wonder with 2ns RAM that let me overclock the card to near-X1800XT performance levels, contributed to that card being a usable asset long after one would have thought to upgrade.
 
Well, for NVIDIA cards, I have owned a TNT2 (I don't know what brand, but it was an OEM card for a Gateway).
A Jaton FX5200, which replaced the TNT2 and eventually died in a different computer; the caps blew out.
A Leadtek 6800GT (430/1120), voltmodded, and it still runs today. Never had a problem with it, not with the drivers, not with anything. I thought I had killed it, but it was just the motherboard; the PCIe slot blew. That's how I got my X1950 Pro, which is garbage.

I also have an nForce 2 IGP mobo (Biostar) with a 2800+ Sempron, which is a piece of crap, but I don't think that's NVIDIA's fault. It can't run at 100% load for more than 3 minutes without freezing, and the integrated graphics drivers only work for 3 reboots; then you have to reinstall the drivers to get it to work again. That could be NVIDIA screwing up.

I also have an nForce 4 SLI mobo, the one in the sig. It works great and overclocks great; the only problem is the Marvell Gbit LAN has been fried since day one (this is my 3rd replacement board). Other than that it works great. As for the other two boards: the first one I ESD'd and killed, and on the second one, the OCZ VX memory and the mobo murder-suicided on me.

Also, on my recommendation, my brother has a Foxconn nForce 590 SLI mobo and an eVGA 8800GTS 320. If anything, NVIDIA rocks.

ZINN, your reasons for hating are childish at best. DX10.1 is meaningless; I don't see you bitching out ATI for not supporting SM3.0 on the X800 series. The 680i chipset has had its problems, but when you think about it, what bleeding-edge technology doesn't have problems? So what if they didn't have flawless product execution this time around; does that mean they can never get it right again? Apparently you think so, Zinn.

nForce 2 through 5 on the AMD side were simply the best chipsets you could buy, period. I believe nForce 7 for AMD will be the same. nForce 5 and 6 on the Intel side have been the best for general performance, but at the cost of stability, which is something NVIDIA is going to have to work out; I am confident they will.
 
Edit: To clarify, I don't give a fuck about DX10.1, but NVIDIA just shows how lazy and unwilling to innovate they are by not including it in G92.

In the last three years, I have owned a 6800GT, two 7800GTXs, two 7800GTs, a 7600GS, a 7600GT, two nForce 4 motherboards, a 680i motherboard, and an 8800GTS. I bought so many of NVIDIA's damn products that they sent me a customer appreciation pack with stickers and junk like that in the mail (I shit you not). As someone who was really enthusiastic about their products, I've recently become pretty disappointed with NVIDIA as a company.

Point #1: 680i launch. Enough said.

Point #2: NVIDIA still doesn't support DirectX 10.1 in their new products. For shame! We've known about this specification change for many months now, and NVIDIA should have had time to add it to their new 65nm chips. Instead they went the lazy, bare-minimum route (as usual) and released a card that slightly edges out ATI's card in performance but lags in features. I'm deeply disappointed that they didn't come out with something that actually replaces the 8800 series like they should have. Why bother innovating when they don't have what they consider to be serious competition?

I'd rather buy two slightly slower cards that together give me more performance and more features for less than NVIDIA's ancient "flagship" card. The flagship card that they won't bother replacing with something significantly faster like they should have this month, since they can just milk money out of their current outdated technology.

That's how NVIDIA does business. The bare minimum. I'm aware that it makes sense from a financial standpoint. But for a former enthusiast like me, that's a real kick in the balls.

ok *shrugs*
 