New Inquirer article about the future of 3D cards

Brent_Justice said:
FP16 frame buffer blending can be added to any future ATI card as well; it is NOT an SM 3.0 feature.


It's not just that. Without looping, you have to do HDR in 5 or so passes.

With looping and branching it's down to 2.

This is a multipass shader, like many of the new shaders that are going to be used, and SM 3.0 has a distinct advantage in cutting down the number of passes.
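To put rough numbers on why the pass count matters, here is a toy Python sketch. All of the figures (base frame rate, cost per pass) are assumptions for illustration, not measurements from any card or game; the point is only that each extra full-screen pass eats a similar chunk of frame time, so going from 5 passes down to 2 matters a lot.

# Back-of-the-envelope sketch: assume a frame costs its base render time plus
# one roughly equal chunk of that time per extra HDR pass. Numbers are assumed.
def estimated_fps(base_fps, extra_passes, cost_per_pass=0.35):
    base_frame_time = 1.0 / base_fps
    total_time = base_frame_time * (1.0 + extra_passes * cost_per_pass)
    return 1.0 / total_time

for passes in (2, 5):  # looped/branched SM 3.0 path vs. fixed multipass path
    fps = estimated_fps(60.0, passes)
    loss = 100.0 * (1.0 - fps / 60.0)
    print(f"{passes} extra passes: ~{fps:.0f} fps ({loss:.0f}% loss)")

With those assumed numbers the 2-pass path lands around a 40% hit and the 5-pass path around a 65% hit; that is the shape of the argument being made here, not a prediction of real benchmark results.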
 
The guy that wrote that is just a fool, plain and simple. He reminds me of an 8-year-old whose daddy bought him an NVIDIA 6800 GT to play Doom 3, and when he was done with the game and read the early HL2 benchmarks he got pissed off because ATI is faster in that game. Just plain garbage ideas he has. Oh, and calling Doom 3 an old engine, lol... point made.
 
It just seems to me that ATI is trying to tell everyone that future-proof tech is not needed, and that you'll never have a need for it.
It also seems to me that ATI is saying they'll give you new tech when they think you need it.

To me it looks like they don't care about the customer and/or their partners.
 
rancor said:
It's not just that. Without looping, you have to do HDR in 5 or so passes.

With looping and branching it's down to 2.

Again, if it can do it just as fast in 5 passes as the other card can in 2, what's the difference? ;)

Right now it's just too early to tell on all of this.

Give it time, we'll see how it all works out. I honestly can't tell you the future; I'm very curious to see what happens...
 
Brent_Justice said:
Again, if it can do it just as fast in 5 passes as the other card can in 2, what's the difference? ;)

Right now it's just too early to tell on all of this.

Give it time, we'll see how it all works out. I honestly can't tell you the future; I'm very curious to see what happens...


It can't, lol. The X800s are getting 10 frames per second and less (they're losing over 70%), while the 6800s only lose 40-50% and are still in the playable range.
 
Shane said:
The guy that wrote that is just a fool, plain and simple. He reminds me of an 8-year-old whose daddy bought him an NVIDIA 6800 GT to play Doom 3, and when he was done with the game and read the early HL2 benchmarks he got pissed off because ATI is faster in that game. Just plain garbage ideas he has. Oh, and calling Doom 3 an old engine, lol... point made.

Please don't act out the fanboyism. That is uncalled for; we're trying to have a civilized convo here without resorting to fanboyism, so please cut that out.
 
rancor said:
It can't, lol. The X800s are getting 10 frames per second and less (they're losing over 70%), while the 6800s only lose 40-50% and are still in the playable range.

On what, a beta patch that isn't released to the public yet? ;)

Wait until the final version of the patch to determine how the cards compare.

Betas aren't betas for nothing.
 
Brent_Justice said:
On what, a beta patch that isn't released to the public yet? ;)

Wait until the final version of the patch to determine how the cards compare.


Brent, I have the next patch and I'm working with it right now. I've also tried HDR in my engine and the same thing happens, but the loss is a bit more pronounced since my engine is OpenGL. Also, in the patch you can't force ATI to do HDR; I had to rewrite some of the path selection to test it out. It won't even be an option for ATI users. :)
 
I agree somewhat... he was doing a little too much bashing, but he is correct about ATI not having PS 3.0, and ATI will never win a benchmark again as long as NVIDIA is the only company with SLI.
 
ZenOps said:
Doom 3 is an old engine; it would have been revolutionary if released two years ago. And as far as I know, it doesn't come close to using the vast majority of PS 2.0 capabilities, never mind 3.0. As a matter of fact, IMO, other than the resolution, FPS, and forced AA/AF, it barely looks any different on a Radeon 8500/GF4-class card than on today's newest cards. Carmack said he planned it for the widest video card support, and it shows.

Doom 3 is a "good" engine, but honestly I don't know if I'd favor it over a generic DX9 engine with much smoother sound and networking integration. It could have been "great," but there are a lot of things that just don't seem right.

And in a triple-buffered game, input lag would tend to make me think you would "feel" 240 fps even if your monitor could only show 120 Hz (it's almost always one full frame behind in rendering; you are always shooting ahead of your target in FPS shooters). PCI-E will get rid of this annoyance, as it can communicate back to the processor mid-frame whether or not something has happened. In a perfect video engine on PCI-E it would only need to be single-buffered, with a physics hit-detection engine on the video card. I.e., 120 fps will "feel" like 120 fps because there is no input-lag buffering.

I still say the killer application for PCI-E is going to be web-surfing video memory buffering (instant back-button screen refreshes on HTML 1.1 code and images). For mid- and low-end users, true bidirectional 2D screen buffering will shave seconds off every action. And trust me, surfing for pr0n with no back-button delay may push PCI-E further than any 3D game will. If NVIDIA believes they can sit on a bridged half-duplex solution, I'd be very worried.


What is back-button delay? It seems instant to me...
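For anyone curious about the scale of ZenOps' triple-buffering point above, here is a toy Python sketch of the arithmetic. The frame rates are assumed numbers for illustration, not measurements; the only claim is that being one rendered frame behind costs one render-frame time of extra input lag.

def added_lag_ms(render_fps, frames_behind=1):
    # Extra input lag, in milliseconds, from buffering whole rendered frames.
    return frames_behind * 1000.0 / render_fps

for fps in (120, 240):
    print(f"Rendering at {fps} fps: one buffered frame adds ~{added_lag_ms(fps):.1f} ms of lag")

At 240 fps the buffered frame costs about 4 ms instead of 8 ms, which is the sense in which a higher render rate can still "feel" faster even on a 120 Hz monitor.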
 
Some good points are raised in the article. Perhaps it's too scathing, but it's the Inquirer, so what do you expect? :)
 
Hey RIP(Zues), where in my post is there so-called fanboyism? If you learn to read/understand English properly you will see that nowhere did I state that NVIDIA or ATI is the better card to own. I simply stated that what the author (the Inquirer) posted is ignorant and the standard of fanboyism, as you would say. Let's keep it like your name says: RIP.
 
rancor said:
Brent, I have the next patch and I'm working with it right now. I've also tried HDR in my engine and the same thing happens, but the loss is a bit more pronounced since my engine is OpenGL. Also, in the patch you can't force ATI to do HDR; I had to rewrite some of the path selection to test it out. It won't even be an option for ATI users. :)


Unfortunately I don't know when it's going to be released, but it should be sometime soon. The patch seems very solid.
 
Shane said:
Hey RIP(Zues), where in my post is there so-called fanboyism? If you learn to read/understand English properly you will see that nowhere did I state that NVIDIA or ATI is the better card to own. I simply stated that what the author (the Inquirer) posted is ignorant and the standard of fanboyism, as you would say. Let's keep it like your name says: RIP.

The author of the article really isn't a "fan" of either side; he's written scathing editorials against both companies. I wouldn't take it too seriously, though (e.g., most publishers going for the D3 engine instead of HL2), as his information is often off-base.
 
What reviewer is gonna compare two video cards in SLI to a single card? I.e., comparing two 6600s in SLI to a non-SLI X700 in a Doom 3 benchie, I personally don't see the point.
 
Quote from the article: "ATI has no PS3.0 part, and never will."

Charlie Demerjian is a TARD, plain and simple. That one line proves it 100%.

Then he goes on to complain about how expensive the top-end cards are... then shifts into how two cards in SLI will be so great... blah blah blah, and they will kill ATI... blah blah blah.

Once again he doesn't realize that two cards will cost more than one... and if people are having so much trouble paying for one card, how are they going to feel about buying two?

And then look at this trash... here comes another great TARD quote:

"I know where my money will be going, and an X800 seems like money badly spent if you don't plan on buying an new card every three months."

OK... I have had my X800 XT PE for at least 2, if not almost 3, months... I'd better run out and buy all the next-gen cards that just came out... OH WAIT, THERE ARE NONE!!!
And there won't be for a while; the X800/6800 are going to be here with their slower OEM/mainstream versions for quite some time. THIS WILL NOT CHANGE!

They just came out with these cards; they are not going to come out with a new core in 3 more months unless they want to kill their current product lines, which aren't even out 100% yet.

Articles like these are what give the Inquirer a bad name...
 
Brent_Justice said:
Yes, monitors can display it, by disabling VSYNC

Holy fuck, how can some people be so stupid.

EDIT: Umm, sorry if I was offensive... but that was my HONEST reaction to the post... come on dude, time to do some homework!
 
ATI has to play catch-up with the R500... only they must also design the Xbox 2 subsystem, right?... So they have to tackle chipset problems, design a whole new chipset for Microsoft, and design two graphics cores (which will probably be somewhat similar)... stretching their resources pretty thin, if you ask me. The NVIDIA/Xbox partnership hurt NVIDIA, and it was probably the reason they fell behind to begin with.
 
Verge said:
ATI has to play catch-up with the R500... only they must also design the Xbox 2 subsystem, right?... So they have to tackle chipset problems, design a whole new chipset for Microsoft, and design two graphics cores (which will probably be somewhat similar)... stretching their resources pretty thin, if you ask me. The NVIDIA/Xbox partnership hurt NVIDIA, and it was probably the reason they fell behind to begin with.

I doubt it. Even if it were true, they made a lot of bank doing it. They even attributed their rise in profits to Xbox in 2003.
 
Shane said:
Hey RIP(Zues), where in my post is there so-called fanboyism? If you learn to read/understand English properly you will see that nowhere did I state that NVIDIA or ATI is the better card to own. I simply stated that what the author (the Inquirer) posted is ignorant and the standard of fanboyism, as you would say. Let's keep it like your name says: RIP.

Right here:
He reminds me of an 8-year-old whose daddy bought him an NVIDIA 6800 GT to play Doom 3, and when he was done with the game and read the early HL2 benchmarks he got pissed off because ATI is faster in that game

And also, my name is not RIP(Zues).
Learn to read my name right.


[RIP] = the team I am on for the games I play
Zeus = the name I use, and it's not Zues...

Get it right.
 
Well, it sounded like it to me, but oh well.

And as for you, gordon151:
You're being a smartass, lol. :D

Hehehe
 
Brent_Justice said:
More like, when it's needed.

This is what the video card forum cannot understand, and it is responsible for 3/4 of the threads.
 
^eMpTy^ said:
And I think his point was that ATI would never have a PS 3.0-only card... that they would skip it and go straight to PS 4.0... and I dunno if even that is true...

Sorta like what NV did with 3.0. Can you say DX8.1 on the FX in HL2 VST?
 
R1ckCa1n said:
Not at [H]... it's the highest settings while still being a great gaming experience.

There is a connection between how many FPS a card can push and how high you can set the settings while still having a good experience.

If Card A at high settings can push twice the FPS of Card B at high settings, then I think there is a slight chance that Card A can have a good experience with higher settings than Card B.
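To make that argument concrete, here is a toy Python sketch of the "highest playable settings" idea. The settings names, fps figures, and the 40 fps floor are all invented for illustration, not benchmark data:

# Hypothetical settings ladder: fps numbers are made up for illustration.
SETTINGS_FPS = {"Medium": 90, "High": 60, "High + 4xAA": 38, "High + 4xAA/8xAF": 25}
PLAYABLE_FLOOR = 40  # assumed minimum fps for a "great experience"

def highest_playable(fps_table, scale=1.0):
    # Return the most demanding setting whose (scaled) fps still clears the floor.
    playable = [name for name, fps in fps_table.items() if fps * scale >= PLAYABLE_FLOOR]
    return playable[-1] if playable else None

print("Card B:", highest_playable(SETTINGS_FPS))                        # "High"
print("Card A (2x the fps):", highest_playable(SETTINGS_FPS, scale=2))  # "High + 4xAA/8xAF"

Same methodology, different cards: under these assumed numbers the faster card simply ends up being tested at heavier settings.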
 
R1ckCa1n said:
Not at [H]... it's the highest settings while still being a great gaming experience.

Lol... I'm sorry, I forgot you can only read things 5 words at a time... I'll try to be more concise next time... READ THE REST OF THE POST... :D
 
"In My Personal Opinion That article is a very flamatory, and very fanatical."

This is how I feel about the article...seems more like a childish tirade.

Nvidia was further behind when the R300 hit the market and people were making less drama out of the event. Both cards are close in a vast majority of games...the real issue is both are in short supply, preventing a solid price war.

Ati's game plan was to take a good portion of the Xbox2 hit sooner then later. When PS3.0 games become standard they should have a part ready. It was a solid plan, given the amount of projects they are juggling. The mistake they made was misjudging the low-highend market. One might say it was not having a new openGL driver, but who knows if it will show solid performance increases.

In the end both cards are solid and it really depends on the price one pays.
 
R1ckCa1n said:
Sorta like what NV did with 3.0. Can you say DX8.1 on the FX in HL2 VST?

Actually... yeah... just like that... good point... they'll half-ass around with 2.0b till then...
 
Mutalys said:
I also think he's contradicting himself when he first talks about the high-end market being only a small fraction, then tries to lay on how SLI is going to be a 'nuke'. Come on... while it'll look good in benches and I'm sure it'll be popular at the high end, do you think people are going to throw down $800-$1000 for an SLI configuration regularly? I've read that at the lower end, say 2x $200, you're better off with the $400 card. But that may change...

Do realize that computing is one of the absolute cheapest hobbies in the world. Quite honestly, in what other hobby can you obtain the very fastest, highest-end parts for every element of the activity for a mere $3000?

I know that in photography, my last Hasselblad body cost me over $3000 used... that's without a lens... which costs about the same as the body... and I tend to use a variety of lenses.

I don't mean to say this in a derogatory way, but once you have a job that pulls in $60,000/year, given that you don't have any out-of-the-ordinary expenses, $1,000 for two 6800 GTs and a new mobo isn't a big deal. In fact, you could do that once a year and not even miss it.

If you are a student still, working $12/hour jobs part time... I see what you're saying.

But ATI/NVidia will be making most of their money from people who are working full time. Students comprise a minority of their income.

When you're working, you will be plenty glad that NVIDIA's SLI solution exists, just so you can spend that money you have and play Doom 3/Far Cry with 8xAA/16xAF at 70 fps.

Without SLI, your $1000 is worthless, because nothing will even exist to satisfy your craving for performance.

*edit

Never mind, your $1000 wouldn't be worthless, because you'd be spending it fighting over the few X800 XT PEs going on eBay for half the performance of an SLI rig.
 
" don't mean to say this in a derogatory way, but once you have a job, that pulls in $60,000/year, given that you don't have any other expenses that are out-of-the-ordinary, $1,000 for the 2x 6800GT's and a new mobo isn't a big deal. In fact, you could do that once/year and not even miss it."

Not everyone adult makes 60,000+ a year. Having said that you seemed to miss the main point...he complains that one card is too expensive, but then praises the ability buy two expensive cards. If he is worried about the cost of one, why would he then cheer the ability to buy two?

Oh you can get a solid camera setup for 3,000 dollars...Depends on the type of work. Outdoor/studio etc.
 
Ponder said:
" don't mean to say this in a derogatory way, but once you have a job, that pulls in $60,000/year, given that you don't have any other expenses that are out-of-the-ordinary, $1,000 for the 2x 6800GT's and a new mobo isn't a big deal. In fact, you could do that once/year and not even miss it."

Not everyone adult makes 60,000+ a year. Having said that you seemed to miss the main point...he complains that one card is too expensive, but then praises the ability buy two expensive cards. If he is worried about the cost of one, why would he then cheer the ability to buy two?

Oh you can get a solid camera setup for 3,000 dollars...Depends on the type of work. Outdoor/studio etc.

Thank you.

Honestly, you may be right that someone who lives comfortably can lay $1k on the table for a gaming rig... maybe even once a year. Fortunately, most of the folks I know in that income bracket are more practical in their choices. That's not to say I wouldn't want one, but with as many people screaming about spending $500 for a video card right now, you can't actually expect SLI to be that popular.
 