AMD Accuses NVIDIA's Gameworks Of Being 'Tragic' And 'Damaging'

The "tragic and damaging" headline sounds like fighting words...until you actually read the quote below.

“Nvidia Gameworks typically damages the performance on Nvidia hardware as well, which is a bit tragic really,” he told PCR. “It certainly feels like it’s about reducing the performance, even on high-end graphics cards, so that people have to buy something new.”
 
Tragic? More like smart.

I mean, they are, after all, a company that sells graphics cards...

So making tech that requires more of their product to be sold is kinda the point.
 
Apple does the same thing with iOS updates. The latest one made my iPad 2 useless because of how slow it runs. So long as fanboys do not hold their messiahs accountable, companies will continue to do this.
 
AMD is really on the ropes. It's amazing how often there is this pivot to focus on GameWorks to avoid looking at the real issue, which is that AMD is awful.
 
I have no problem with GameWorks. Witcher 3 runs flawlessly for me. GameWorks adds a lot to the immersion.
 
A forum thread about AMD is like blood in the water for the usual trolls.

Amazing.
 
I have no problem with GameWorks. Witcher 3 runs flawlessly for me. GameWorks adds a lot to the immersion.

Well, I should hope so, considering you are rocking a multi-thousand-dollar system.
While your system is totally [H]ard, I believe the article is aimed at gamers in general, those who are running mid-range GPUs and CPUs.

Even the GameWorks options have had adverse effects on my GTX 770, and as with PhysX, unless one is running the highest-end GPU or a multi-GPU setup, the performance decrease and the cost in computational processing power just aren't worth it for most.
I do think AMD is dancing around the issue of their existing hardware being lackluster, but that doesn't mean what they are saying in this article isn't the truth as well.
 
I have no problem with GameWorks. Witcher 3 runs flawlessly for me. GameWorks adds a lot to the immersion.

You are running SLI'd Titan X's... it had better bloody well run flawlessly...

It would be nice if Nvidia were to drop GameWorks and embrace OpenCL. AMD is right: it would help move everything forward.
 
What AMD is saying is definitely true. If you look at AMD Gaming Evolved games like Dragon Age: Inquisition, Alien: Isolation, and Civilization: Beyond Earth, they are much better optimized and provide a good experience at launch. Most importantly, these games run very well on both Nvidia and AMD hardware.

http://www.hardocp.com/article/2014..._video_card_performance_review/8#.VbuBHrUXaZc

"Alien: Isolation performs extremely well across a wide swath of GPUs, which many times means the graphics are subpar, but that is not the case here. This game is a showroom of DX11 efficiency; Watch Dogs eat your heart out. We haven't seen the likes of this much attention to detail using DX11 on the PC in a very long time. We hope to see more titles focus on this. Less console-itis please.

We think that Alien: Isolation is a template that other game designers should take notice of. We love the use of open technologies based on DirectCompute and the absence of proprietary technologies that make Alien: Isolation truly a great title. Team Green and game devs please take notice of this, as this is what pushes the gaming industry forward as a whole. And that is simply good for all of us."

The GameWorks titles have been buggy, poorly optimized, and downright pathetic at launch, and they take months of fixes to run reasonably well. Even after months and multiple patches, these games cannot provide the same kind of optimized performance that you get from a Gaming Evolved title. GameWorks features like HairWorks use tessellation in a very unoptimized way: the tessellation factor is set to a maximum of x64 when it has been shown that there is very little loss of image quality at x16 but vastly better performance (see the sketch at the end of this post).

http://wccftech.com/amd-announces-w...driver-coming-boost-tessellation-performance/

AMD users were enjoying better performance than Nvidia users thanks to the tessellation override in AMD's CCC. Nvidia users were stuck, and only now, with the latest 1.07 patch (more than 2 months after launch), has a slider been introduced for the HairWorks tessellation level.

http://forums.anandtech.com/showthread.php?t=2439792

Why was this slider not available at launch? :D This shows how careless GameWorks licensees have been about balancing performance and image quality, or maybe Nvidia woke up after their own Kepler owners bashed them for poor performance both with and without HairWorks. :p
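
To put the x64-vs-x16 point in concrete terms, here is a minimal C++ sketch of what a driver-side tessellation override effectively does; the function and its names are hypothetical, purely for illustration:

[code]
#include <algorithm>

// Hypothetical sketch of a driver-side tessellation override like the one in
// AMD's CCC: whatever factor the game requests (HairWorks asks for up to x64),
// clamp it to a user-chosen cap before the hardware tessellator sees it.
// Capping at x16 costs very little visible detail but sharply cuts the number
// of triangles the GPU has to generate and shade.
float apply_tessellation_override(float requested_factor, float user_cap = 16.0f)
{
    return std::min(requested_factor, user_cap);
}

// Example: a x64 request gets clamped to x16, while a x8 request is untouched.
// apply_tessellation_override(64.0f) -> 16.0f
// apply_tessellation_override(8.0f)  -> 8.0f
[/code]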
 
I really wish that NVIDIA would implement some form of tessellation control in the drivers like AMD has.
After upgrading my 570 to a 960, I still can't enable the feature in many games (e.g., Tomb Raider 2013) and hope to keep a constant 60 FPS.

Same thing for GameWorks features.
GameWorks features seem designed for people who find 30 FPS gaming acceptable.
 
Crysis 2? You had to go that far back to find "bad" tessellation optimizations?
 
Gameworks: DRM for game optimizations. The consumer will decide how much rootkit they want in their lives.
 
At least die with some dignity, AMD.

AMD is really on the ropes. It's amazing how often there is this pivot to focus on GameWorks to avoid looking at the real issue, which is that AMD is awful.

Mantle proves that AMD can provide very well for its own hardware. It does not even have to cripple performance on Nvidia hardware to achieve it.

If AMD is so awful, explain DX12/Vulkan. Or did MS/Khronos take their cue from Nvidia on those?
 
We have seen new IQ tech come to market for the entire time I have been running HardOCP that was not immediately "usable" by all cards at all resolutions. The argument that something is "wrong" with that is simply idiotic. I like Huddy; he is a good man, but this is marketing FUD at its best.
 
No major problem with GameWorks here, either. Running poorly on Nvidia hardware is a bit sad, but so far the effects haven't been mandatory, so having them is just a nice bonus.
 
No major problem with GameWorks here, either. Running poorly on Nvidia hardware is a bit sad, but so far the effects haven't been mandatory, so having them is just a nice bonus.

Yeah,

The problem, however, is that even if AMD gets their act together in the next generation of GPUs, end users who want to play the latest titles with all the effects enabled will have to buy Nvidia GPUs, since Nvidia has provided a development SDK, free of charge, to game developers.

It's an unethical business practice at best, and possibly illegal at worst.

The GameWorks SDKs reduce the needed work (and thus the cost) for game developers, so it's essentially "bribing developers to lock out the competition," or pretty damned close to it.

The PC gaming market (and the PC market in general) has always been great because open standards and modularity let us choose the components we want from among the different competitors. When Nvidia does something like this, it hurts us all, even if we happen to own their products and benefit from the GameWorks effects.
 
You are running SLI'd Titan X's... it had better bloody well run flawlessly...

It would be nice if Nvidia were to drop GameWorks and embrace OpenCL. AMD is right: it would help move everything forward.

What does GameWorks have to do with OpenCL?
 
Really wish that AMD's GPU division would break off into its own company at this point.
Would be nice to go back to where we were a decade ago.

One can always dream~
 
Zarathustra[H] said:
Yeah,

The problem, however, is that even if AMD gets their act together in the next generation of GPUs, end users who want to play the latest titles with all the effects enabled will have to buy Nvidia GPUs, since Nvidia has provided a development SDK, free of charge, to game developers.

It's an unethical business practice at best, and possibly illegal at worst.

The GameWorks SDKs reduce the needed work (and thus the cost) for game developers, so it's essentially "bribing developers to lock out the competition," or pretty damned close to it.

The PC gaming market (and the PC market in general) has always been great because open standards and modularity let us choose the components we want from among the different competitors. When Nvidia does something like this, it hurts us all, even if we happen to own their products and benefit from the GameWorks effects.

Illegal? How? By courting the companies that provide software for NV's clients? That doesn't sound shady at all; it sounds like smart business.

There is literally nothing stopping AMD from doing the same, except AMD themselves. That's who you should be blaming, not NV.
 
Zarathustra[H] said:
Yeah,

The problem, however, is that even if AMD gets their act together in the next generation of GPUs, end users who want to play the latest titles with all the effects enabled will have to buy Nvidia GPUs, since Nvidia has provided a development SDK, free of charge, to game developers.

It's an unethical business practice at best, and possibly illegal at worst.

The GameWorks SDKs reduce the needed work (and thus the cost) for game developers, so it's essentially "bribing developers to lock out the competition," or pretty damned close to it.

The PC gaming market (and the PC market in general) has always been great because open standards and modularity let us choose the components we want from among the different competitors. When Nvidia does something like this, it hurts us all, even if we happen to own their products and benefit from the GameWorks effects.

Agreed. There are a lot of people in denial that GameWorks is harmful to the PC gaming industry. The worst is the tech press, which is not advocating against such practices. It looks like most of them are happy with their kickbacks, Nvidia event invites, free hardware, and monthly ad revenue from Nvidia.
 
Well, he can't talk about Witcher 3, because he put his foot in his mouth by saying they had been working with the dev from the beginning, when GameWorks was in the game before it was even announced back in 2013.

GameWorks was announced early on, but I remember hearing from AMD that their performance in The Witcher 3 was competitive until the HairWorks binaries dropped into the game build about 2 months before launch. Inclusion of a feature on a slide is not the same as having it actually implemented in the game. That came later, and when it did come, Nvidia got a major head start on optimizations, because they have access to the HairWorks code and could have been making tweaks all that time while adding it into the game.

So for AMD, HairWorks was a double whammy: late to the optimization party, because it was dropped into the game builds at a later date, which delayed even the optimizations they could do without looking at source code; and unable to look at the source code directly due to Nvidia's license terms.

Anyone who finds this "fair" is an Nvidia apologist. That code is a built-in Nvidia performance advantage; nothing running it should be used as some performance indictment against AMD. If they at least had access to the code and Nvidia just performed better, that would be one thing. On tessellation-heavy effects that would still be the case, but that is not enough for Nvidia: they want to delay AMD's ability to optimize at all for day-one benchmarks, and to make the more fine-tuned optimizations that source code allows impossible.

@#$% Nvidia, and their fans for defending this or for being silent as if this style of business practice is OK. Silence or indifference is a tacit endorsement of these practices.
 
I really wish that NVIDIA would implement some form of tessellation control in the drivers like AMD has.
After upgrading my 570 to a 960, I still can't enable the feature in many games (e.g., Tomb Raider 2013) and hope to keep a constant 60 FPS.

Same thing for GameWorks features.
GameWorks features seem designed for people who find 30 FPS gaming acceptable.

What the hell is wrong with 30 fps in a game (unless you are playing twitch-reflex games like CoD or CS)? Tomb Raider '13 plays great at a locked 30 on my machine. I remember the times when you were lucky if a 3D game ran at 20 fps on your PC at 640x480, if that.
 
It is not about something being wrong with it; it is about AMD not being able to access the source code so they can optimize for it. That his stance on this is somewhat of an exaggeration is not uncommon in PR.

Do you think AMD gets the source code for every game released in order to optimize their drivers?

Either way, the issue was HairWorks using a lot of tessellation. AMD has poor tessellation performance, while Maxwell handles it much better. It was fixed on older NVIDIA cards and AMD cards with later patches/drivers, whatever.

AMD is trying to distract from the fact that their cards are underperforming across the board. They called Fury "an overclocker's dream" and the card for 4K. You can read HardOCP's review to see how much pure BS that was.

AMD is in full panic mode right now. If their stock drops another dollar, they could be delisted from the stock exchange, and that would be the last nail in a nearly sealed coffin.
 
Mantle proves that AMD can provide very well for its own hardware.

Oh lordy. Mantle is used in what, 4 games, and it's better than DX12 how? AMD's Windows drivers have a poor reputation, and they're slow to be released.

AMD's sound logic: so, uh, instead of fixing that Windows driver problem (you know, the problem that affects all games, especially newer ones), how about we create a custom API instead? It's bound to be popular. More popular than DX, I just know it! That's a good use of our resources.

... aw :eek: the quote was bait, huh? :(
 
What the hell is wrong with 30 fps in a game (unless you are playing twitch-reflex games like CoD or CS)? Tomb Raider '13 plays great at a locked 30 on my machine. I remember the times when you were lucky if a 3D game ran at 20 fps on your PC at 640x480, if that.

It isn't 1992 anymore.

60 is the minimum standard, with very few exceptions.

If I wanted a slideshow, I would open PowerPoint.
 
I have no problem with GameWorks. Witcher 3 runs flawlessly for me. GameWorks adds a lot to the immersion.

Witcher 3 pulls the same bullshit they described in the article.

I ran Witcher 3 through a profiler and found that it renders foliage and foliage shadows even when they are not visible.
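
For context on what that profiler result implies, here is a minimal C++ sketch of the kind of visibility test (frustum culling) that would normally let an engine skip that work; the types and names here are made up for illustration, not taken from the game:

[code]
#include <array>

// Illustrative types: a plane in ax + by + cz + d = 0 form, and a bounding
// sphere around an object such as a clump of foliage.
struct Plane  { float a, b, c, d; };
struct Sphere { float x, y, z, radius; };

// Classic sphere-vs-frustum test: if the sphere lies entirely behind any of
// the six frustum planes, the object cannot appear on screen, so its draw
// calls (and its shadow passes, if shadows are culled against the light's
// frustum the same way) can be skipped.
bool is_visible(const Sphere& s, const std::array<Plane, 6>& frustum)
{
    for (const Plane& p : frustum) {
        // Signed distance from the sphere's center to the plane.
        float dist = p.a * s.x + p.b * s.y + p.c * s.z + p.d;
        if (dist < -s.radius)
            return false; // fully outside this plane: cull it
    }
    return true; // inside or straddling the frustum: render it
}
[/code]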
 
SNIP

@#$% Nvidia, and their fans for defending this or for being silent as if this style of business practice is OK. Silence or indifference is a tacit endorsement of these practices.

It's a business, not a fucking charity. Please stop deep throbbing your Awesome Masturbating Device.

Thank you,
Corporate world
 
It isn't 1992 anymore.

60 is the minimum standard, with very few exceptions.

If I wanted a slideshow, I would open PowerPoint.

30 FPS isn't a slideshow, although I understand the psychological need to justify spending a thousand dollars on video cards for only a marginal improvement over consoles.
 
I still like AMD. Have since the ATI days. I'm about ready to switch to NVIDIA, though, just because I have no allegiance and will go with whoever has the best performance for my money.

I don't know if I really agree, though. Is this technology released before its prime, or is it just not optimized? I love seeing tech demos and tech that runs like shit on a video card, but bring in the new generation of cards and it's much faster. I've always been a fan of eye candy and image quality, sometimes sacrificing framerate for a better-looking image (not too much, though... although Skyrim with a shit ton of mods looked amazing, I only got 3-4 FPS at the time: unplayable, but damn it looked good!).

I love it when they give us stuff that brings current hardware to its knees. It gives me a reason to upgrade, and when I do, I can crank up some settings and it feels like a bigger and better upgrade. Makes it worth it. I remember being able to turn Quake 3 up to its max settings after a video card upgrade... That was nice. It'll be good to be able to do that again.

It's not tragic to me. I enjoy it. I still love AMD, but I see this as more mudslinging than anything.
 
Oh lordy. Mantle is used in what, 4 games, and it's better than DX12 how? AMD's Windows drivers have a poor reputation, and they're slow to be released.

AMD's sound logic: so, uh, instead of fixing that Windows driver problem (you know, the problem that affects all games, especially newer ones), how about we create a custom API instead? It's bound to be popular. More popular than DX, I just know it! That's a good use of our resources.

... aw :eek: the quote was bait, huh? :(

https://en.wikipedia.org/wiki/Mantle_(API)#Video_games

Quite a bit more than that.

The "custom" API did something no other API had done for the PC market. It works well, and it got MS and Khronos off their asses.

The Windows driver has nothing to do with it. It is a closed system that you cannot optimize for, regardless of who does it.
 
Oh lordy. Mantle is used in what, 4 games, and it's better than DX12 how? AMD's Windows drivers have a poor reputation, and they're slow to be released.

AMD's sound logic: so, uh, instead of fixing that Windows driver problem (you know, the problem that affects all games, especially newer ones), how about we create a custom API instead? It's bound to be popular. More popular than DX, I just know it! That's a good use of our resources.

... aw :eek: the quote was bait, huh? :(

There was no DX12 until Mantle showed it could be done. MS had no plans for DX12 before that. At least a little honesty, please?
 
It is not about something being wrong with it; it is about AMD not being able to access the source code so they can optimize for it. That his stance on this is somewhat of an exaggeration is not uncommon in PR.

It's about AMD whining because Nvidia makes something they don't, and now they are suffering for it.

Nvidia makes a gaming toolkit that speeds up game development and adds features that push the limits of graphics cards.
AMD needs to stop complaining and do the same.
 
30 Hz gives me a headache after 5-10 minutes. It looks like a strobe light to me.

I think that developers using GameWorks are just incompetent. They simply are unable to write a complete game. They literally suck at game development. I only blame Nvidia for writing bad code and for the black box. It's not their fault that development peons want to skip steps in the development cycle and are willing to implement shit code to finish their games faster.

Naught from naught leaves nothing. That's the performance you get from a shit developer using shit tools: nothing. Can't wait for the next debacle from the usual suspects like Ass Creed / Batman / Witcher 3 HairWorks / Project Cars / etc. Should be right around the corner, going by Nvidia's track record.
 