USMC2Hard4U
Supreme [H]ardness
- Joined
- Apr 4, 2003
- Messages
- 6,157
Sometimes you do what you gotta do.
Killer69 said:Alright, alright. But let's not make that a habit, USMC2Hard4U.
The fan actually runs very quietly during operation. It does seem to have at least two different RPM modes, however. At system startup, the fan did not start spinning until a few seconds into the boot process; this worried us at first, but it did kick in, starting at a high RPM and dropping to a slower RPM once in Windows. While operating in the slower RPM mode, the fan could be considered a silent solution, as it was very quiet. Even under full load, we never heard the fan spin up to full RPM.
Killer69 said:Guessing? Or do you know something that we don't?
Brent_Justice said:Release 80 ForceWare is supposed to alleviate the restrictions on inter-mixing different branded video cards, meaning the BIOSes do not have to match. There is also something about greater multi-monitor options with SLI.
Pip said:This section of the review caught my eye, as I am used to hearing about how screaming loud modern high-end video cards are. Could the reviewers expand on this a bit? Is this truly a quiet solution, or are we speaking in relative terms? I generally strive for a fairly quiet computer: big HSF, low-speed big fans, quiet PSU... I do not overclock, so I am wondering just how quiet this card will be in that context?
Thanks much!
Arklight said:Serious?
So, essentially, if I wanted to (lol) I could run a 7800 GTX and a 6800 Ultra?
Or do you mean, for example, a BFG 7800 GTX and a Leadtek 7800 GTX?
Thx!
Skrying said:Do you actually think core mixing would be possible? Come on now.
He means a BFG with a Leadtek like you said.
Brent_Justice said:different brands, I didn't say different generations
USMC2Hard4U said:I wonder if they are going to make a board with two GPUs on it, like that one Gigabyte card or those MSI SLI-on-one-card designs?
Sure, it will be $1,000, but if you have the money to spend, who cares!
Yeah, I caught that too.
Devistater said:To:
Brent Justice and or Kyle Bennett
"so fast that at 720p (1024x768), 4x AA provides little or no degradation of performance."
"that true1600x1200 (1080p) with 4x AA gameplay is finally here."
from page: http://www.hardocp.com/article.html?art=Nzg0LDQ=
720p is not 1024x768; it's 1280x720.
And 1080p is not 1600x1200; it's 1920x1080.
These are part of the HDTV definitions for widescreen.
http://en.wikipedia.org/wiki/720p
http://en.wikipedia.org/wiki/1080p
As a comparison, 720p has 921,600 pixels and 1080p has 2,073,600, while 1024x768 has 786,432 pixels and 1600x1200 has 1,920,000.
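The pixel-count comparison above is easy to verify; a minimal sketch (nothing assumed beyond the width x height arithmetic):

```python
# Pixel counts for the resolutions discussed above: width x height.
resolutions = {
    "720p (1280x720)": (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "1024x768": (1024, 768),
    "1600x1200": (1600, 1200),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 720p (1280x720): 921,600 pixels
# 1080p (1920x1080): 2,073,600 pixels
# 1024x768: 786,432 pixels
# 1600x1200: 1,920,000 pixels
```

So 1080p has roughly 8% more pixels than 1600x1200, which is why the review's equivalences are close but not exact.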
DOH!! Damn MSI for putting caps like that so close to the slots. I saw those on mine when building it. I did not have any issues, but this is yet another excuse for me to ditch this Neo4 for a DFI SLI-DR.
Stellar said:Note that your inability to afford the card does not make the card's price unreasonable.
Consider the performance benefit: anywhere from 60-100% greater than a single 6800 Ultra, and in most cases at least equivalent to two 6800 Ultras in SLI, without the complications that SLI creates. Then consider that the card costs only ~20% more than the 6800 Ultra, and ~40% LESS than two 6800 Ultras.
That makes the card a BARGAIN for high-end users.
I find it funny that the only comments made so far about the card NOT being impressive are from people with video cards and systems that are several generations old.
There's just no arguing with this kind of performance.
Jima13 said:
Devistater said:To:
Brent Justice and or Kyle Bennett
"so fast that at 720p (1024x768), 4x AA provides little or no degradation of performance."
"that true1600x1200 (1080p) with 4x AA gameplay is finally here."
from page: http://www.hardocp.com/article.html?art=Nzg0LDQ=
720p is not 1024x768; it's 1280x720.
And 1080p is not 1600x1200; it's 1920x1080.
These are part of the HDTV definitions for widescreen.
http://en.wikipedia.org/wiki/720p
http://en.wikipedia.org/wiki/1080p
As a comparison, 720p has 921,600 pixels and 1080p has 2,073,600, while 1024x768 has 786,432 pixels and 1600x1200 has 1,920,000.
Skrying said:Oh yes, with my several generations old X800XL and 6800nu.
Where do you get the 20% price difference? I can grab a 6800U for around only $350; guess what, the 7800 GTX will cost me about $200 more. Yeah buddy, that's more than 20%.
$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.
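The percentage claims being argued here are easy to check. A quick sketch using the posters' rough figures ($350 for a 6800 Ultra and about $550 for a 7800 GTX are assumptions taken from the posts, not verified street prices):

```python
# Rough price figures taken from the posts above (assumptions, not verified).
price_6800u = 350.0    # quoted street price for a 6800 Ultra
price_7800gtx = 550.0  # roughly $200 more, per Skrying's estimate

# Premium of one 7800 GTX over one 6800 Ultra
premium = (price_7800gtx - price_6800u) / price_6800u
print(f"Premium over a single 6800 Ultra: {premium:.0%}")  # 57%

# Comparison against two 6800 Ultras in SLI
sli_cost = 2 * price_6800u
saving = (sli_cost - price_7800gtx) / sli_cost
print(f"Saving vs. 2x 6800 Ultra SLI: {saving:.0%}")  # 21%
```

At these assumed prices the card is well over 20% dearer than a single Ultra, but still roughly 20% cheaper than an SLI pair, so both posters' numbers can hold depending on which baseline you pick.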
Wow, talk about some narrow-minded thinking. I can give you one great example of why I think the card is a "bargain:"
Skrying said:Oh yes, with my several generations old X800XL and 6800nu.
Where do you get the 20% price difference? I can grab a 6800U for around only $350; guess what, the 7800 GTX will cost me about $200 more. Yeah buddy, that's more than 20%.
$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.
Devistater said:Yeah, much better lol. As long as you guys know that those resolutions mentioned in the review are probably the closest equivalent and not the exact same thing, I'm happy.
Brent_Justice said:We were measuring the pixel fillrate like we have done in the past. The 7800 GTX can shade 24 pixels per clock, so the pixel fillrate is correct: MHz x pixels-per-clock. It's just an easy way to compare a card's ability to fill pixels.
trinibwoy said:I can see you trying to make it simpler to understand, but maybe it would be better to educate people on the differences between the shader core and the ROPs instead of dumbing it down for them. And your pixels-per-clock there is 16, not 24. No matter what the shader core does, the card can't output more than 16 pixels per cycle. All the extra shader power is doing is helping you get closer to peak ROP output, which is your true fillrate.
Maybe it's easier for people to understand it your way; I just think that, with the move to unified cores coming up, people are going to have to understand what "fillrate" really means eventually.
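Brent's shorthand and trinibwoy's correction can both be put into numbers. A minimal sketch, assuming the 7800 GTX's reference 430 MHz core clock (the pipe and ROP counts are from the posts above):

```python
# Theoretical per-clock pixel throughput, counted two ways.
core_clock_mhz = 430  # assumed reference 7800 GTX core clock
pixel_pipes = 24      # pixels shaded per clock (Brent's figure)
rops = 16             # pixels actually written per clock (trinibwoy's point)

shader_throughput = core_clock_mhz * pixel_pipes / 1000.0  # Gpixels/s shaded
fill_rate = core_clock_mhz * rops / 1000.0                 # Gpixels/s written

print(f"Shading throughput: {shader_throughput:.2f} Gpixels/s")  # 10.32
print(f"ROP-limited fillrate: {fill_rate:.2f} Gpixels/s")        # 6.88
```

The gap between the two numbers is exactly trinibwoy's point: the extra shader pipes help real workloads get closer to the 16-ROP ceiling, they do not raise the ceiling itself.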
R1ckCa1n said:From what I see, it is what the NV47 should have been: tweak the core and add some fixes. I too thought this was a "TRUE" 24-pipe card, when in reality it is a tweaked 16-pipe card. Instead NV called it
btw: mine comes on Friday! Hello BF2 @ 1600 x 1200 4xAA 16xAF
Brent_Justice said:I pointed out very clearly in the review that it has 24 pixel pipelines and 16 ROPs.
Skrying said:It is the NV47. I'm betting the G70 name was just a sneaky marketing gimmick, you know, to get everyone like "oh noes, new code name must = kickass 1337 card from hell!!!!!!" Either way, it worked, lol.
Does this mean a card with 24 pixel pipelines and 24 ROPs would perform better? From my understanding (which is limited in this super techie stuff), aren't the 24 pixel pipelines really just there to maximize the 16 ROPs? So a 24-ROP card would still have times where it could hit a theoretically higher performance, with everything else equal?
Iratus said:If you want exact breakdowns of every pipelined action that gets that pixel onto your screen, other reviews will do it, or there will be a separate editorial if the technology is new and spangly. If you want an informed decision on what is best for playing your games, then that's what you'll usually get here. And we thankfully get it free of most of the irrelevant padding, copied from a tech briefing, that some reviewers feel the need to include to pad their 'review' out.