NVIDIA & Multi-display Gaming Editorial @ [H]

Just food for thought:
I'm not a technology writer and I don't pretend to be someone who is. I'm an enthusiast who gets excited about the hardware/technology and loves seeing it in action on other people's rigs, if not my own.

I may be wrong and very ignorant on this, but is it possible that when a developer adds 3D effects to a game, it may not be possible or cost-effective for them to support other features as well? I guess what I'm trying to say is, the 3D part may not be a simple driver fix provided by NVIDIA; it may be something that has to be coded into the game specifically with the help of NVIDIA. If that's true, then yes... as a consumer, I would expect certain features to be locked out of the game.

I wouldn't buy an HD-DVD player expecting to run all the movies that existed on the format; I would know that there is a competing brand of HD player on the market that has the support of different movie studios. It's up to me as the consumer to be aware of the product differentiation between the two companies and make the best decision for myself.

I fully expect to be flamed for this and flagged as an NVIDIA fanboy or in the minority of opinions; just realize that I happily own a 5850 from Diamond and anxiously await the thin-bezel monitors from Samsung before I splurge on the monitors.

Anyways, thanks for the great article Kyle; as always, you say it like it is and whether or not I agree with you... I always enjoy reading your news/articles. Keep up the great work.


I can't imagine that enabling 3D gaming will require another API. In theory, if it works with Batman: AA it should work with any UE3 game, though of course the developer must have to do something, since they only listed a few games. All the uproar will be from nV making TWIMTBP games only able to do multi-display on nV hardware once they enable this feature; 3D is something else entirely.

Let's face it though, 3D is nothing new and has been touted for a while now. Heck, I even remember a few games on my Packard Bell that were advertised as 3D, and this was 1992. 3D is nothing new for nV, and I imagine they are using multi-display gaming to push it, which is fine as long as it doesn't leave us consumers without choice. But really, if 3D Vision drops your performance in half, that would mean if you buy another nV part you're back up to 60 FPS....hmmmm
 
Question:

Being able to display a 3D image on a 2d display has been around for a very very long time.

How would NVIDIA be able to stop ATI from using this technology also?

NVIDIA doesn't own the patents to reproduce a 3D image on a 2D device. This technology has been everywhere, the only difference is how to make it work on their 3 displays.
I would think that ATI could perform this easily and then just come up with their own way to span it across their eyefinity setup.
 
This is fantastic news. As a gaming goob who has been chasing the "ultimate" gaming setup for decades, I am hopeful it might finally arrive.

My first jump into this arena was when I purchased my first LCD 3D shutter glasses for my Amiga in the late 1980s. Playing Spud Wars (I think that was the name) was an amazing 3D gaming experience. My Matrox Parhelia was cool for early triplehead, but lack of game support plus weak 3D acceleration doomed that product.

I returned to 3D several years back with 3dfx and then Nvidia, which was absolutely fantastic when it worked correctly. WoW in 3D really needs to be seen to be appreciated.

Two years or so ago I dumped my CRT in exchange for three LCDs so I could run a Matrox TripleHead2Go. Triplehead gaming truly is everything Kyle makes it out to be, but I have to say it is not necessarily better than 3D.... just a different experience.

The thought of having BOTH simultaneously is quite exciting, yet having experienced the lack of native support for both technologies and the hoops one needs to jump through to get around those issues, I'm a bit apprehensive.

However, if developers do start natively supporting 3d Vision/Surround, PC gaming will be the ultimate gaming experience.
 
We are not talking about PS3, 360, or anything like that; we are talking about standard features like anti-aliasing and, knowing NVIDIA's behavior, potentially multi-monitor support being locked out for other vendors in games through the TWIMTBP program. How about NVIDIA locking out PhysX when you have an ATI card as your main display card and an NVIDIA card as your PhysX card? Yeah, real fair to all the people who bought NVIDIA cards and want to use ATI cards to push their graphics. Also, NVIDIA and Rocksteady got A LOT of bad press for locking ATI cards out of AA in Batman, which is probably why the next patch they released allowed ATI cards to enable AA in the game options. Defend NVIDIA all you want, no one else here will.

I just checked, and rocksteady has not released a patch enabling in game AA....do you have a link/source for this?
 
Question:

Being able to display a 3D image on a 2d display has been around for a very very long time.

How would NVIDIA be able to stop ATI from using this technology also?

NVIDIA doesn't own the patents to reproduce a 3D image on a 2D device. This technology has been everywhere, the only difference is how to make it work on their 3 displays.
I would think that ATI could perform this easily and then just come up with their own way to span it across their eyefinity setup.

You misinterpreted the final paragraph. 3D Surround is NVIDIA's multi-monitor solution, which is basically Eyefinity, while 3D Vision is the actual 3D portion. Kyle was talking about the TWIMTBP program possibly locking Eyefinity out of games and only allowing 3D Surround, which has nothing to do with 3D at all.
 
Thanks for reminding me why I come to this site.

I agree completely with the thought that NVIDIA would try to take advantage of the situation with TWIMTBP games.

I'm a long-time returning customer for NVIDIA, and since I put other things first I haven't bought a shiny new card from AMD just yet. I may wait until Fermi benchmarks hit the web a few months from now. But NVIDIA's lack of business ethics lately leaves me very cautious about putting money into another one of their products.

Thanks for the great read Kyle.
 
I just checked, and rocksteady has not released a patch enabling in game AA....do you have a link/source for this?

I think you still have to enable it in CCC, but the game now supports it. I thought they patched it somehow; I might be mistaken.
 
You misinterpreted the final paragraph. 3D Surround is NVIDIA's multi-monitor solution, which is basically Eyefinity, while 3D Vision is the actual 3D portion. Kyle was talking about the TWIMTBP program possibly locking Eyefinity out of games and only allowing 3D Surround, which has nothing to do with 3D at all.


Oh, Ok. I see what you guys are saying. I did misinterpret.

I would think ATI would have a substantial lawsuit against NVIDIA if they tried to do that. They would basically be saying that the PC is now their proprietary gaming console.
 
Hmm, I don't care about the whole "new" 3D gimmick for theatre/blu-ray/cable/sat, or 3D glasses gaming, or even gaming on multi-monitors. Gaming on multi-monitors looks nice, but I just can't get past the bezels (too distracting), so I'd rather have a single large screen or projector.

I'm convinced that the current 3D/480Hz/Smell-o-vision garbage is a way to keep TV prices high via creeping bullshit "features". 3D has always looked like shit and will continue to look like shit for the foreseeable future.

Gaming on multi-monitors is awesome. The best part about Eyefinity is that it's free and comes standard on even the cheap cards. If you want to do it, cool. If you don't, that's cool too. My kids saw my setup and were asking how much they could get one for. Very cheaply if you shop smart (shop S-Mart). After a while, you don't even notice the bezels unless they are made to stand out due to a gimpy overlap.
 
My torch is soaking in kerosene. Count me in if NV decides to F us!

I think you still have to enable it in CCC but the game now supports it, I thought they patched it somehow I might be mistaken.

The CCC will force AA, but it's a bigger performance hit than faking the VID, since the AA is being forced by the driver rather than done in-game. The patch that removes the VID lock has not yet been released, to my knowledge.
 
Great read Kyle. Agree with your points.

Any chance your camera crew can drop by the AMD booth and get some pics + specs on the new thin bezel displays from Samsung?
 
I think a better name would have been "NVision" instead of "3D Surround", oh well. Thanks for the article Kyle :).
Might be good news for people with prior SLI setups, though, if we will be able to take advantage of triple monitors too without having to use SoftTH or the like. Glad to hear they are thinking of us little guys that can't afford to run right out and grab the latest and greatest video cards :). This is good news for everyone :).
 
I'm upset that a website is biased and admits to leaning towards NVIDIA in the first place. Now there is NO "trusted" website I can get legit reviews from without wondering how much they're getting paid to play with NVIDIA. Oh yeah, I hate NVIDIA MUhahahahahahah:D
 
You shouldn't hate NVIDIA. They've brought a lot of excellent technology to the forefront, just as ATI has over the last decade.
 
Kyle doesn't want 3D Vision to be open, only the multi-monitor support portion.

Well, considering they have said it will work in 3D Vision and 2D modes as well, I sorta doubt they'll be able to do that at all. But the 3D Vision gaming thing might be an NVIDIA-only thing, as I haven't heard or seen anything of the like from the ATI camp.
 
Wow, I do not understand why everyone seems to be in bed with ATI recently and trying to dog NVIDIA at every turn. When ATI is late with a product that can compete, you don't see all this doomsday press.

Did you try NVIDIA's setup at CES? Then how can you already write an article playing down 3D? I tried it and it was truly awesome; it is not some cheesy 3D crap like at the theaters. Notice that the glasses have to sync with the displays. This is not Disney's 'Honey, I Shrunk the Audience' with those fucking throwaway glasses. I also don't see why you are talking shit when they are bringing in support for multi-screen gaming. They are bringing something more to their products. You say they could have done this a year ago; well, maybe, and so could ATI. Look how great it worked for Matrox, which they both had as an example. ATI just tried it again with faster hardware and it worked.

You are worried they needed SLI when ATI can run 3 screens just fine...... OK, two things: first, ATI's setup is not running 3D, which requires twice the power of 2D; second, IT IS A TRADE SHOW... come on people, of course you are going to use overkill hardware for your demos, and yes, they all ran smooth even though it was 3D... on 3 screens... on old hardware. I can't wait to see what next-gen hardware can do.

What is this talk about NVIDIA now making multi-screen gaming proprietary? ATI came out with it first, so I would be watching ATI. Seriously though, they will probably just both have their proprietary methods (like GPGPU computing; remember, ATI had their own proprietary shit too) until DirectX makes a standard.

This article is just sad. The [H] has always been one of my favorite tech sites because they seem to be down to earth and controversial if necessary, but now I have to wonder where some of this seemingly biased bullshit is coming from. I see anti-NVIDIA articles all over the net and have to wonder why. It's not the typical article about a manufacturer being late with a product we want; it always seems to be, oh shit, NVIDIA is late, they are in trouble, they are going to close shop, doomsday, brimstone, and hellfire.

P.S.

I love my Radeon 5850 :cool: . If NVIDIA's new part doesn't perform and they lose this time, oh well, we will see them again next time around.

Are you suggesting that [H] does not bust AMD/ATI's balls when they are late?
 
In no way, shape, or form have I been talking about anything that actually has to do with NVIDIA's 3D Vision (which is the 3D visual thing with the glasses you wear), only 3D Surround....which also has nothing to do with anything 3D, but is about multi-monitor play.
 
Are you suggesting that [H] does not bust AMD/ATI's balls when they are late?

We are always "in bed with" the company that we think represents the best value for the gamer and hardware enthusiast. ;) Some folks don't get it.
 
I hope Nvidia won't need 2 cards or an expensive adapter to use 3 monitors for gaming. If they throw in bezel management on top of that they could have a huge advantage over ATI. It could help offset the higher cost of the larger Fermi chips, if you don't need a $100 active display port adapter. Here's hoping they take the high road and keep multi monitor gaming open platform.
 
I remember reading somewhere that quadro multi-display, which I'm assuming this technology is based off, can handle displays of different resolutions. If so, that would be a boon for those of us using 30inchers right now (sorry, but two more of these will break the bank).
 
I remember reading somewhere that quadro multi-display, which I'm assuming this technology is based off, can handle displays of different resolutions. If so, that would be a boon for those of us using 30inchers right now (sorry, but two more of these will break the bank).

Different resolutions would be great and a large competitive advantage. If I had a choice I would use a 16:10 or 16:9 front display and two 16:12 side displays. I feel the side displays are more for spotting and doubt wider side displays would help the immersion factor, since your peripheral vision is blurry. Unfortunately 16:12 (4:3) displays are going the way of the dinosaur. It would sure be nice to be able to use our existing displays and not have to buy 3 new ones.
 
So you admit to being sellouts when you should remain neutral and blast either company on a need be basis. Nice.

We advise you to keep your sarcasm hat on for the duration of your visit here at HardOCP.
 
Apparently I am mistaken and have copied and pasted the name from an incorrect source.


"NVIDIA Surround" is the official name as per the VP of GPU with NVIDIA. I will update the article to reflect. this.
 
I have to agree with Kyle and stopped reading at Page 4 since it was starting to get repetitive with the responses.

I can definitely see the issue where Nvidia's "The Way It's Meant to be Played" (TWIMTBP) is already causing issues for ATI cards-- Resident Evil 5, Batman AA, and many more with that logo. Games have performance issues and/or graphics anomalies and glitches.

It absolutely doesn't make sense, from the consumer's end, for a game to unfairly have subpar performance because the consumer bought the "wrong" video card.

Should I, as a consumer, be forced to buy Nvidia GPUs so that I can play a game to its fullest because I only have an ATI card? No.

Does it make sense to do that? No.

Do I have to buy a specific card for each game from now on so they all run at their fullest? No.

Why should I be forced to buy one brand of video card because a game refuses to run properly on the brand I already have installed? I shouldn't have to.

The same goes for 3D Surround Vision from Nvidia. I wouldn't want a game developer to lock out multi-monitor support and extreme display resolutions because it detected an ATI card. I wouldn't want to see under "Requirements" and "Recommended" on the back of the box, "Nvidia 9000-series, 200-series, GT100-series for multimonitor solutions. ATI cards not supported for multimonitor solutions."

Would you want that? Would anyone want to see that?

That is the whole point of Kyle's argument in that editorial. Games should not be loyal or exclusive to any one company-- Nvidia or ATI. Performance should be determined by the physical cards themselves and the drivers, not by exclusive features pushed or forced on a game by AMD or Nvidia that make it proprietary. Make a better video card that performs as it should and as-is for a game. Nvidia shouldn't strong-arm their way into some game developer's office to introduce proprietary code so they can have a performance advantage over another company's card. Let your cards determine that-- and that applies to both ATI and mostly to Nvidia. If your card doesn't perform well, then improve upon its design and try again.

The gaming industry and the way they program and create games should not and must not be controlled or influenced by the money of any video card company, especially Nvidia's. It doesn't bring innovation. It doesn't bring fairness to the market. If Nvidia continues on this path, sooner or later games will only show performance improvements on Nvidia-only cards. TWIMTBP is already a clear sign of that starting, and hopefully it doesn't continue with this 3D Surround Vision thing. There should never be proprietary code in a game that will lock out or dumb down features and performance in a game so it performs better on one card than another. It should never be that way in the first place.

Just like with PhysX. Physics calculations and programming them shouldn't have to use proprietary code, and have it disabled because I use an ATI card. It absolutely doesn't make any fucking sense. Should I have to buy an Nvidia card now just to see enhanced physics and special effects from now on? Absolutely not. The same applies to Kyle's argument and this 3D Vision multimonitor feature from Nvidia. Should I have to buy an Nvidia card because the game developer locked out multimonitor and 3D support because I have an ATI card? Hell no.

From a legal view, it borders on monopoly behavior and being criminal. Then we'll be where Intel vs. AMD are in the CPU market-- with one company over-pricing a CPU just because they can and because they have dominance in the market.

This TWIMTBP is stupid, silly, ridiculous, and unfair to the consumer. If Nvidia manages to swing its wads of cash around to a game developer to purposely disable multimonitor support or disable performance enhancements in a game if an ATI card is detected, then the gaming industry is dead when that happens.

Aggressive exclusivity does not foster competition.

Healthy competition fosters innovation and change.

Unhealthy competition brings incompatibility and unfairness to the market, and doesn't help bring innovation and change.
 
I totally agree with octoberasian.

At this rate, I will have to buy a 5790 and a GTX 295/Fermi for 2 of my PCIe 16x slots and switch between them in Windows before each game. :mad:
 
Bolded = 100% true IMO, and honestly they haven't had to make a move with it until now. (Really, I don't blame them from a biz aspect, because we all know that Quadro is a cash cow, and you can't tell me that you really would have done it either.) Not to mention that Matrox has had a solution to "hold over" that .01% of customers that wanted it, albeit res-capped.

Plus, since they have had this FOREVER on Quadro, it's going to be a BREEZE to implement in GeForce.

NOW, TWIMTBP. Personally, 98% of the time I love it. I love the fact that I can buy that game on launch day and have all of these abilities without issue (with a driver install, of course). I like that NV is pushing devs to implement new tech that they:
A) May not have been able to afford
B) Don't have the engineers or time to implement
C) Didn't know about

I really don't think that NVIDIA is cold-hearted enough to ask "how can we break the way ATI does it?" (PhysX, I think, was more of a "we don't want ATI blaming NV for X not working in this game," and vice versa.)

My non-PC-world example: BMW/Ford have patents on exhaust/turbo exiting the INSIDE of the head in a V motor (DOHC/pushrod respectively) for better heat energy consumption. Does this mean that they should open-source it for GM to use? HELL NO!! That's why patents are in place, to one-up your competition. No different in the PC world.




IMHO this is 99% true. I think that Kyle on occasion is biased, and I think in this case his "hatred" for 3D kinda pushes him the red way; NVIDIA is late too...


Now, to my thoughts on "nFinity":

I have two GTX 285s, so this is awesome; how many 4800-series owners have Eyefinity?????

I don't care for the 3D; it's cool, but not for me.

My 285s are fast; there isn't a game I cannot currently play at my 1920x1200 or 2560x1600 res at full settings.

I was going to go ATI just for Eyefinity, but I use CUDA stuff too much, and I haven't been impressed with "bug fixes" by ATI in the past.

The fact that in the middle of that post you called someone else biased is just... wow. Plus any high res triple monitor setup on a 4800 series or GTX200 cards would be painful to watch.
 
Well, whatever the various issues are, if Nvidia will support triple monitors, bezel management, and have solid drivers for the 200 series, my GTX 285 SLI/X58 i7 setup will be sporting triple 120hz thin-bezel monitors soon, hopefully! My one hope is that Nvidia's "Surround" will support 20/30/20 configurations too!

I'm a flight-sim/driving-sim enthusiast and I've seen triple monitors in action. It is THE way to go.

C.
 
Glad I read that write-up. And I'm really proud to read HardOCP after reading Kyle's comments. Someone has to stand up to companies when they try to make technologies too proprietary. I honestly don't think nV's current stances on their tech benefit them (I refuse to buy their products already due to it). I support innovation, not AA lockouts (as if that's proprietary).

Thank you Kyle.
 
Glad I read that write-up. And I'm really proud to read HardOCP after reading Kyle's comments. Someone has to stand up to companies when they try to make technologies too proprietary. I honestly don't think nV's current stances on their tech benefit them (I refuse to buy their products already due to it). I support innovation, not AA lockouts (as if that's proprietary).

Thank you Kyle.

Oy vey, more of that B:AA AA stuff. Eidos made the decision not to allow ATI to reuse the code Nvidia developed for a game engine THAT NEVER FUCKING HAD SUPPORT FOR AA FROM THE START. ATI is completely free to do their own code and help them verify it works without issue. I could see this as an issue if the game engine had AA support from the start, but in this case, UE3 never freaking had it 'til Nvidia wrote the damn code for it.
 
Thank you Kyle for such an _Amazing_ read! That's why I LOVE this site :D

As for the mob part, you can count on me (as long as ATi comes up with GOOD Linux support) ;)
 
I think nVidia should skip the 'hybrid' middle ground and just go for gold with head-mounted displays in 3D.

For example:
http://www.vrealities.com/hmd.html

But what if they drove the market to adopt HMDs in a larger way? That would drive down cost very quickly. If a head-mounted display were developed where each eye were able to see, say, a 5760 x 3240 pixel stereoscopic display, and it had an accelerometer to change the view with the user's head movements, that would trump just about anything else out there. As it is, it seems there is a push for more portable and compact devices anyway. How many gamers can convince the other people in their lives that having 3-6 22-24" LCDs is a 'good investment' for gaming? Not many, I bet. This route would also take advantage of recent advances in OLED technology.
 
Oy vey, more of that B:AA AA stuff. Eidos made the decision not to allow ATI to reuse the code Nvidia developed for a game engine THAT NEVER FUCKING HAD SUPPORT FOR AA FROM THE START. ATI is completely free to do their own code and help them verify it works without issue. I could see this as an issue if the game engine had AA support from the start, but in this case, UE3 never freaking had it 'til Nvidia wrote the damn code for it.

The problem is that it's not proprietary code. It's standard stuff implemented through DirectX. It was not some proprietary code path which implemented its own MSAA algorithm. If that were the case, no harm, no foul. But this is not the case with this game. Simply changing the vendor ID gives you perfect AA without issue, since DirectX is what is providing it. It would be like Seagate providing code which locked out the ability to format Western Digital hard drives in Windows.
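To make that concrete, here's a rough C++ sketch of what a vendor-ID gate on otherwise standard Direct3D 9 multisampling could look like. This is purely my own hypothetical illustration (the helper names AllowInGameMSAA and PickMSAA are made up), not the game's actual code, but the API calls and PCI vendor IDs are real:

// Hypothetical illustration only -- not Rocksteady's or NVIDIA's actual code.
// The MSAA being requested is plain Direct3D 9 multisampling, available on any
// capable GPU, but the in-game option is only exposed when the adapter reports
// NVIDIA's PCI vendor ID.
#include <d3d9.h>

static const DWORD VENDOR_NVIDIA = 0x10DE;  // public PCI vendor IDs
static const DWORD VENDOR_ATI    = 0x1002;

// Hypothetical helper: decide whether to expose the in-game AA option.
bool AllowInGameMSAA(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    // The "lock" is nothing more than this comparison. Spoofing the vendor ID
    // makes the check pass, and the standard DirectX multisampling path then
    // works fine on ATI hardware -- which is exactly the point above.
    return id.VendorId == VENDOR_NVIDIA;
}

// Hypothetical helper: pick an MSAA level with an ordinary capability check
// that any vendor's card can satisfy.
D3DMULTISAMPLE_TYPE PickMSAA(IDirect3D9* d3d)
{
    if (!AllowInGameMSAA(d3d))
        return D3DMULTISAMPLE_NONE;  // ATI users are left forcing AA in CCC instead

    if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            TRUE, D3DMULTISAMPLE_4_SAMPLES, NULL)))
        return D3DMULTISAMPLE_4_SAMPLES;

    return D3DMULTISAMPLE_NONE;
}

Nothing in the gated path is vendor-specific; that's why it reads as a lockout rather than a proprietary feature.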
 
The new 3D technology that Nvidia has is great, has a fantastic effect and doesn't produce headaches.

I'd give up my 3 monitors and 5870 in a heartbeat, repurchase 3 120hz monitors and an Nvidia card if they can provide a single card solution that drives 1920x1200x3 in 3D ;)
 
The problem is that it's not proprietary code. It's standard stuff implemented through DirectX. It was not some proprietary code path which implemented its own MSAA algorithm. If that were the case, no harm, no foul. But this is not the case with this game. Simply changing the vendor ID gives you perfect AA without issue, since DirectX is what is providing it. It would be like Seagate providing code which locked out the ability to format Western Digital hard drives in Windows.

But that doesn't answer the question: why is it NVIDIA's fault for what the publisher decides to do?
 
But that doesn't answer the question: why is it NVIDIA's fault for what the publisher decides to do?

Because Nvidia probably paid them? What do you think happens when a game has the "Plays best on.." stuff on it? They got paid to put it there, buddy.
 