DoubleParadoxx
n00b
- Joined
- Oct 1, 2009
- Messages
- 31
So tempted to RT@bburke_nvidia
Just food for thought:
I'm not a technology writer and I don't pretend to be someone who is. I'm an enthusiast who gets excited about the hardware/technology and loves seeing it in action on other people's rigs, if not my own.
I may be wrong and very ignorant on this, but is it possible that when a developer adds 3D effects to a game, it may not be possible or cost-effective for them to support other features? I guess what I'm trying to say is, the 3D part may not be a simple driver fix provided by NVIDIA; it may be something that has to be coded into the game specifically with NVIDIA's help. If that's true, then yes... as a consumer, I would expect certain features to be locked out of the game.
I wouldn't buy an HD-DVD player expecting to run all the movies that existed on the format; I would know that there is a competing brand of HD player on the market that has the support of different movie studios. It's up to me as the consumer to be aware of the product differentiation between the two companies and make the best decision for myself.
I fully expect to be flamed for this and flagged as an NVIDIA fanboy or in the minority of opinions; just realize that I happily own a 5850 from Diamond and anxiously await the thin-bezel monitors from Samsung before I splurge on the monitors.
Anyways, thanks for the great article Kyle; as always, you say it like it is and whether or not I agree with you... I always enjoy reading your news/articles. Keep up the great work.
We are not talking about the PS3, 360, or anything like that; we are talking about standard features like anti-aliasing and, knowing Nvidia's behavior, potentially multi-monitor support being locked out for other vendors in games through the TWIMTBP program. How about Nvidia locking out PhysX when you have an ATI card as your main display card and an Nvidia card as your PhysX card? Yeah, real fair to all the people who bought Nvidia cards and want to use ATI cards to push their graphics. Also, Nvidia and Rocksteady got A LOT of bad press for locking ATI cards out of AA in Batman, which is probably why the next patch they released allowed ATI cards to enable AA in the game options. Defend Nvidia all you want; no one else here will.
Question:
Being able to display a 3D image on a 2D display has been around for a very, very long time.
How would NVIDIA be able to stop ATI from using this technology too?
NVIDIA doesn't own the patents on reproducing a 3D image on a 2D device. This technology has been everywhere; the only difference is how to make it work on their 3 displays.
I would think that ATI could perform this easily and then just come up with their own way to span it across their eyefinity setup.
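For what it's worth, the core math is indeed old and vendor-neutral. A minimal sketch of the off-axis (asymmetric-frustum) stereo projection that any vendor could implement — the function name and parameters here are illustrative, not any vendor's actual API:

```python
# Hypothetical sketch: deriving a stereo pair from one scene camera.
# Off-axis projection: each eye's camera is shifted sideways by half the
# eye separation, and its frustum is skewed so both views converge at a
# chosen convergence distance. Nothing here is vendor-specific.
def stereo_eye_offsets(eye_separation, convergence_dist, near_plane):
    """Return (camera_x_offset, near_plane_frustum_shift) for the left
    eye; negate both for the right eye."""
    half_sep = eye_separation / 2.0
    # Shift of the frustum on the near plane, by similar triangles:
    # the skew scales down from the convergence plane to the near plane.
    frustum_shift = half_sep * near_plane / convergence_dist
    return half_sep, frustum_shift
```

Applying the offsets to the view matrix and the frustum's left/right planes for each eye yields the two images; the remaining work (shutter-glasses sync, spanning displays) is presentation, not geometry.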
I just checked, and Rocksteady has not released a patch enabling in-game AA... do you have a link/source for this?
You misinterpreted the final paragraph. 3D Surround is Nvidia's multi-monitor solution, which is basically Eyefinity, while 3D Vision is the actual 3D portion. Kyle was talking about the TWIMTBP program possibly locking Eyefinity out of games, only allowing 3D Surround, which has nothing to do with 3D at all.
Hmm, I don't care about the whole "new" 3D gimmick for theater/Blu-ray/cable/sat, or 3D glasses gaming, or even gaming on multi-monitors. Gaming on multi-monitors looks nice, but I just can't get past the bezels (too distracting), so I'd rather have a single large screen or projector.
I think you still have to enable it in CCC, but the game now supports it. I thought they patched it somehow; I might be mistaken.
Kyle isn't asking for the 3D Vision part to be open, only the multi-monitor support portion.
Wow, I do not understand why everyone seems to be in bed with ATI recently and trying to dog Nvidia at every turn. When ATI is late with a product that can compete, you don't see all this doomsday press.
Did you try Nvidia's setup at CES? Then how can you already write an article playing down 3D? I tried it and it was truly awesome; it is not some cheesy 3D crap like at the theaters. Notice that the glasses have to sync with the displays. This is not Disney's "Honey, I Shrunk the Audience" with those fucking throwaway glasses. I also don't see why you are talking shit when they are bringing in support for multiscreen gaming. They are bringing something more to their products. You say they could have done this a year ago; well, maybe, and so could ATI. Look how it worked out for Matrox, which they both had as an example. ATI just tried it again with faster hardware and it worked.
You are worried they needed SLI when ATI can run 3 screens just fine... OK, two things. First, ATI's setup is not running 3D, which requires twice the power of 2D. Second, IT IS A TRADE SHOW... come on, people, of course you are going to use overkill hardware for your demos. And yes, they all ran smooth even though it was 3D... on 3 screens... on old hardware. I can't wait to see what next-gen hardware can do.
What is this talk about Nvidia now making multiscreen gaming proprietary? ATI came out with it first, so I would be watching ATI. Seriously, though, they will probably just both have their proprietary methods (like GPGPU computing; remember, ATI had their own proprietary shit too) until DirectX makes a standard.
This article is just sad. The [H] has always been one of my favorite tech sites because they seem to be down to earth and controversial if necessary, but now I have to wonder where some of this seemingly biased bullshit is coming from. I see anti-Nvidia articles all over the net and have to wonder why. It's not the typical article about a manufacturer being late with a product we want; it always seems to be: oh shit, Nvidia is late, they are in trouble, they are going to close shop, doomsday, brimstone and hellfire.
P.S.
I love my Radeon 5850. If Nvidia's new part doesn't perform and they lose this time, oh well; we will see them again next time around.
Are you suggesting that [H] does not bust AMD/ATI's balls when they are late?
Great read Kyle. Agree with your points.
Any chance your camera crew can drop by the AMD booth and get some pics + specs on the new thin bezel displays from Samsung?
I remember reading somewhere that Quadro multi-display, which I'm assuming this technology is based on, can handle displays of different resolutions. If so, that would be a boon for those of us using 30-inchers right now (sorry, but two more of these will break the bank).
So you admit to being sellouts when you should remain neutral and blast either company on a need be basis. Nice.
Bolded = 100% true IMO, and honestly they haven't had to make a move with it until now. (Really, I don't blame them from a business aspect, because we all know that Quadro is a cash cow, and you can't tell me that you really would have done it either.) Not to mention that Matrox has had a solution to "hold over" that .01% of customers that wanted it, albeit res-capped.
Plus, since they have had this FOREVER on Quadro, it's going to be a BREEZE to implement in GeForce.
Now, TWIMTBP: personally, 98% of the time I love it. I love the fact that I can buy a game on launch day and have all of these abilities without issue (with a driver install, of course). I like that NV is pushing devs to implement new tech that they:
A) May not have been able to afford
B) Don't have the engineers or time to implement
C) Didn't know about
I really don't think that NVIDIA is cold-hearted enough to ask "how can we break the way ATI does it?" (PhysX, I think, was more of a "we don't want ATI blaming NV for X not working in this game, and vice versa" situation.)
My non-PC-world example: BMW/Ford have patents on the exhaust/turbo exiting the INSIDE of the head in a V motor (DOHC/pushrod respectively) for better heat energy consumption. Does this mean they should open-source it for GM to use? HELL NO! That's why patents are in place: to one-up your competition. No different in the PC world.
IMHO this is 99% true; I think that Kyle on occasion is biased. I think in this case his "hatred" for 3D kinda pushes him the red way. NVIDIA is late too...
Now to my thoughts on "nFinity":
I have two GTX 285s, so this is awesome. How many 4800-series owners have Eyefinity?
I don't care for the 3D; it's cool, but not for me.
My 285s are fast; there isn't a game I cannot currently play at my 19x12 or 2560x16 res at full settings.
I was going to go ATI just for Eyefinity, but I use CUDA stuff too much and I haven't been impressed with "bug fixes" by ATI in the past.
Glad I read that write-up. And I'm really proud to read HardOCP after reading Kyle's comments. Someone has to stand up to companies when they try to make technologies too proprietary. I honestly don't think nV's current stances on their tech benefit them (I refuse to buy their products already because of it). I support innovation, not AA lockouts (as if that's proprietary).
Thank you Kyle.
Oy vey, more of that B:AA AA stuff. Eidos made the decision not to allow ATI to reuse the code Nvidia developed for a game engine THAT NEVER FUCKING HAD SUPPORT FOR AA FROM THE START. ATI is completely free to write their own code and help them verify it works without issue. I could see this as an issue if the game engine had AA support from the start, but in this case, UE3 never freaking had it until Nvidia wrote the damn code for it.
The problem is that it's not proprietary code. It's standard stuff implemented through DirectX. It was not some proprietary code path implementing its own MSAA algorithm; if that were the case, no harm, no foul. But that is not the case with this game. Simply changing the vendor ID gives you perfect AA without issue, since DirectX is what is providing it. It would be like Seagate providing code which locked out the ability to format Western Digital hard drives in Windows.
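To make the vendor-ID point concrete, here is a hypothetical sketch of the kind of check being described — a standard capability hidden based on who made the GPU rather than on an actual capability query. The function and logic are purely illustrative (not the game's real code); only the PCI vendor IDs 0x10DE (NVIDIA) and 0x1002 (ATI/AMD) are real:

```python
# Real PCI vendor IDs; everything else here is an illustrative mock-up.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def msaa_option_exposed(pci_vendor_id, hardware_supports_msaa):
    """Mimic an in-game menu check that gates a standard feature."""
    if not hardware_supports_msaa:
        return False
    # The contested part: gating a DirectX-standard capability on the
    # vendor ID instead of the capability check above.
    return pci_vendor_id == VENDOR_NVIDIA
```

Spoofing the vendor ID flips the result even though the hardware capability is identical, which is why the check reads as policy rather than a technical limitation.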
But that doesn't answer the question: why is it Nvidia's fault for what the publisher decides to do?