[H]ardOCP 7800 GTX Preview!!!!

The fan actually runs silently during operation. It does seem to have at least two different RPM modes, however. At system startup, the fan did not start spinning until a few seconds into the boot process; this scared us at first, but it did eventually kick in starting at a high RPM and dropping to a slower RPM once in Windows. While operating at the slower RPM mode, the fan could be considered a silent solution as it was very quiet. Even under full load, we never heard the fan spin up to full RPM.

This section of the review caught my eye, as I am used to hearing about how screaming loud modern high-end video cards are. Could the reviewers expand on this a bit? Is this truly a quiet solution, or are we speaking in relative terms? I generally strive for a fairly quiet computer: big HSF, low-speed big fans, quiet PSU... I do not overclock, so I am wondering just how quiet this card will be in that context.

Thanks much!
 
I'm glad to hear that Nvidia has addressed the issues with widescreen SLI gaming. With the increase in widescreen laptop and desktop gaming, it's good to see Nvidia step up to the plate and address this. Two big thumbs up to Nvidia for this! Hopefully some more info/benchies on this are forthcoming.

I myself am in the process of a very slow upgrade from Socket 754 to Socket 939, buying one component here and there over the course of ~2 months or so. I went with a SLI motherboard for the fact that I can choose my video card upgrade path: start with one card and add another down the line if I so choose. Since I have no PCI-e cards right now, this 7800GTX is looking like a VERY sweet buy. It would be cheaper than dual 6800 Ultras and provide me the path to dual 7800GTX's. Given the performance of 2 x 6800 Ultras vs. 1 x 7800GTX, it seems almost a no-brainer that the 7800GTX would be a smarter investment, in my case. Those with one or two 6800GT or Ultras, that would be a different thing entirely.

Oh yeah, great preview Brent/Kyle, top notch as usual! But I didn't expect any less. *wipes the brown off his nose hehe* :D
 
The image quality improvements are the only thing that stands out. The chain link, grass, and trees in some of those screenshots... wowzers!! Gotta love that non-jagged smoothness. Other than that, I wasn't too impressed with the performance jump. Nice card(s) though.
 
I agree with the comment up above that some of us with high end cards won't need to upgrade this time around, but do it every other cycle. That's why I bought the best card at the time, to future proof my investment. My 6800 Ultra is great, and plays everything with no problems. And like I said, the FS/FT forum is a good place to pick up a used BFG 6800 Ultra ($400) or GT ($300), especially with people selling to upgrade. Everyone is going to win in this deal, as the high end buyers upgrade and sell their used equipment.
 
Killer69 said:
Guessing? Or do you know something that we don't? :confused:

Release 80 ForceWare is supposed to alleviate the restrictions on mixing different branded video cards, meaning the BIOSes do not have to match. There is also something about greater multi-monitor options with SLI.
 
Brent_Justice said:
Release 80 ForceWare is supposed to alleviate the restrictions on mixing different branded video cards, meaning the BIOSes do not have to match. There is also something about greater multi-monitor options with SLI.


Serious?

So, essentially, if I wanted to (lol) I could run a 7800GTX and a 6800 Ultra?

Or do you mean, for example, a BFG 7800 GTX and a Leadtek 7800 GTX?

Thx!
 
Pip said:
This section of the review caught my eye, as I am used to hearing about how screaming loud modern high-end video cards are. Could the reviewers expand on this a bit? Is this truly a quiet solution, or are we speaking in relative terms? I generally strive for a fairly quiet computer: big HSF, low-speed big fans, quiet PSU... I do not overclock, so I am wondering just how quiet this card will be in that context.

Thanks much!

I run water cooling on the CPU and the computer is in an open environment, so there are literally no other fans but the video card fan. Compared to the 6800 Ultra and X850 XT-PE, the fan on the 7800 GTX is very quiet; I could barely hear it since it runs at a low RPM. If this were inside a case there would be no audible noise at all.
 
Arklight said:
Serious?

So, essentially, if I wanted to (lol) I could run a 7800GTX and a 6800 Ultra?

Or do you mean, for example, a BFG 7800 GTX and a Leadtek 7800 GTX?

Thx!

Do you actually think core mixing would be possible? Come on now. :rolleyes:

He means a BFG with a Leadtek like you said.
 
Arklight said:
Serious?

So, essentially, if I wanted to (lol) I could run a 7800GTX and a 6800 Ultra?

Or do you mean, for example, a BFG 7800 GTX and a Leadtek 7800 GTX?

Thx!

Different brands; I didn't say different generations.
 
I wonder if they are going to make a board with 2 GPUs on it, like that one Gigabyte card, or those MSI SLI-on-one-card designs?

Sure, it will be $1,000, but if you have the money to spend, who cares! :)
 
USMC2Hard4U said:
I wonder if they are going to make a board with 2 GPUs on it, like that one Gigabyte card, or those MSI SLI-on-one-card designs?

Sure, it will be $1,000, but if you have the money to spend, who cares! :)

I doubt it'd work too well, mainly because you probably wouldn't be able to fit it in a normal case. I know I couldn't fit the 7800GTX in my LAN case already; I can't imagine anything longer.
 
Damn you Brent Justice damn you.

And you can have a quick damning Kyle for giving him a forum to disseminate his evil and corrupting words.


Just paid my credit card off... but yet, here we go again.

On a side note, it's not the dual-core processor, or the SLI 7800s, or even my new RAID array where I find the price particularly offensive. It's the fricking power supply.

Also 1920*1200 benchies next time please, I'm gonna keep saying it :D

Excellent review though, and good work by Nvidia in not doing a paper launch; the Canucks will certainly be feeling the pressure this morning.
 
To:
Brent Justice and or Kyle Bennett

"so fast that at 720p (1024x768), 4x AA provides little or no degradation of performance."
"that true1600x1200 (1080p) with 4x AA gameplay is finally here."
from page: http://www.hardocp.com/article.html?art=Nzg0LDQ=

720p is not 1024x768. It's 1280x720.

And 1080p is not 1600x1200. It's 1920x1080.

These are part of the HDTV definitions for widescreen.
http://en.wikipedia.org/wiki/720p
http://en.wikipedia.org/wiki/1080p

As a comparison, 720p has 921,600 pixels and 1080p has 2,073,600 pixels,
while 1024x768 has 786,432 pixels and 1600x1200 has 1,920,000 pixels.
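The pixel counts above are easy to verify; a quick sketch (just width times height for the four modes being compared):

```python
# Pixel counts for the HDTV modes vs. the PC resolutions cited in the review.
resolutions = {
    "720p (1280x720)":   (1280, 720),
    "1024x768":          (1024, 768),
    "1080p (1920x1080)": (1920, 1080),
    "1600x1200":         (1600, 1200),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")
```

So 1024x768 is a reasonable stand-in for 720p (within ~15% of the pixel count), while 1600x1200 falls noticeably short of 1080p.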
 
Note that your inability to afford the card does not make the card's price unreasonable.

Consider the performance benefit: anywhere from 60-100% greater than a single 6800 Ultra, and in most cases at least equivalent to two 6800 Ultras in SLI, without the complications SLI creates. Then consider that the card costs only ~20% more than a 6800 Ultra, and ~40% LESS than 2x 6800 Ultras.

That makes the card a BARGAIN for high-end users.

I find it funny that the only comments made so far about the card NOT being impressive, are from people with video cards and systems that are several generations old.

There's just no arguing with this kind of performance -

http://images.anandtech.com/graphs/nvidia geforce 7800 gtx_06220580647/7761.png

Kyle - Please do not hard link others' pictures, but please rather link the page that contains the document.
 
Devistater said:
To:
Brent Justice and or Kyle Bennett

"so fast that at 720p (1024x768), 4x AA provides little or no degradation of performance."
"that true1600x1200 (1080p) with 4x AA gameplay is finally here."
from page: http://www.hardocp.com/article.html?art=Nzg0LDQ=

720p is not 1024x768. It's 1280x720.

And 1080p is not 1600x1200. It's 1920x1080.

These are part of the HDTV definitions for widescreen.
http://en.wikipedia.org/wiki/720p
http://en.wikipedia.org/wiki/1080p

As a comparison, 720p has 921,600 pixels and 1080p has 2,073,600 pixels,
while 1024x768 has 786,432 pixels and 1600x1200 has 1,920,000 pixels.
Yeah I caught that too.
 
Stellar said:
Note that your inability to afford the card does not make the card's price unreasonable.

Consider the performance benefit: anywhere from 60-100% greater than a single 6800 Ultra, and in most cases at least equivalent to two 6800 Ultras in SLI, without the complications SLI creates. Then consider that the card costs only ~20% more than a 6800 Ultra, and ~40% LESS than 2x 6800 Ultras.

That makes the card a BARGAIN for high-end users.

FYI, 2 x 6800Us does not yield twice the performance of a single card. Plus, the fact that a 7800 is just 20% more expensive is not a sign that the 7800 is reasonably priced... it's a sign that 6800Us are unreasonably overpriced...

If someone sold you 2 small pizzas for 100 bucks and then someone offered one extra large (with pepperoni) for 60, it doesn't mean the extra large is "reasonably" priced... they are both overpriced... but of course you would get the extra large, because you would eat more and with extra toppings...
 
trudude said:
Yeah I caught that too.

Speaking of widescreen HDTV, there was 10 minutes of Batman Begins footage broadcast on the WB channel before the movie was released. That 10 min of footage is 1.5 gigs at 1920x1080 res. And it's dang sweet too :)

DVD resolution is a pitiful 720x480 (NTSC) or 720x576 (PAL), roughly 1/3 of a million pixels, compared with 2 million in 1080p hehhehe.

What's ironic is that you can't buy this HDTV footage; you have to capture it from an HDTV-capable setup. While some may consider 20-30 gigs excessive for a LOTR EE movie, I love the uber-high res and detail that you can see.
 
Stellar said:
Note that your inability to afford the card does not make the card's price unreasonable.

Consider the performance benefit: anywhere from 60-100% greater than a single 6800 Ultra, and in most cases at least equivalent to two 6800 Ultras in SLI, without the complications SLI creates. Then consider that the card costs only ~20% more than a 6800 Ultra, and ~40% LESS than 2x 6800 Ultras.

That makes the card a BARGAIN for high-end users.

I find it funny that the only comments made so far about the card NOT being impressive, are from people with video cards and systems that are several generations old.

There's just no arguing with this kind of performance -

Oh yes, with my several generations old X800XL and 6800nu. :rolleyes:

Where do you get the 20% price difference? I can grab a 6800U for only around $350; guess what, the 7800GTX will cost me about $200 more. Yeah buddy, that's more than 20%.

$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.
 
Devistater said:
To:
Brent Justice and or Kyle Bennett

"so fast that at 720p (1024x768), 4x AA provides little or no degradation of performance."
"that true1600x1200 (1080p) with 4x AA gameplay is finally here."
from page: http://www.hardocp.com/article.html?art=Nzg0LDQ=

720p is not 1024x768. It's 1280x720.

And 1080p is not 1600x1200. It's 1920x1080.

These are part of the HDTV definitions for widescreen.
http://en.wikipedia.org/wiki/720p
http://en.wikipedia.org/wiki/1080p

As a comparison, 720p has 921,600 pixels and 1080p has 2,073,600 pixels,
while 1024x768 has 786,432 pixels and 1600x1200 has 1,920,000 pixels.

Actually here is what is probably a better FAQ. :)
 
Skrying said:
Oh yes, with my several generations old X800XL and 6800nu. :rolleyes:

Where do you get the 20% price difference? I can grab a 6800U for only around $350; guess what, the 7800GTX will cost me about $200 more. Yeah buddy, that's more than 20%.

$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.


In all fairness, a $350 Ultra is probably not the same as a $599 GTX.

If you can find a $350 BFG Ultra OC in PCI-E new in the retail box I want to see it. lol

If you are going to compare products, then the comparison should have to have somewhat equal factors.


On NewEgg:

BFG 6800 Ultra PCI-E : $515.00
BFG 7800 GTX PCI-E : $619.00

The Ultra is roughly 83% the cost of the GTX. The GTX yields a 20%-30%, sometimes even greater, increase in performance. Good deal? Yes. Worth the extra money? That's up to the individual. ;)

The 7800 GTX is a better deal and a WAY better comparison with regard to the example above. Both are retail. Both are from the same company (and therefore have the same warranty and whatnot). One is the evolution of the other. In this respect, getting a 6800 Ultra would be completely asinine.
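For what it's worth, those ratios check out; a quick sketch of the arithmetic (Newegg prices as quoted above, and the 20-30% performance gain is the rough range from the post, not a measured figure):

```python
ultra_price = 515.00  # BFG 6800 Ultra PCI-E, Newegg price quoted above
gtx_price = 619.00    # BFG 7800 GTX PCI-E, Newegg price quoted above

cost_ratio = ultra_price / gtx_price
print(f"Ultra costs {cost_ratio:.0%} of the GTX")

# Effective price per "Ultra's worth" of performance, assuming the GTX
# delivers a 20% or 30% gain over a single Ultra (the range cited above).
for gain in (0.20, 0.30):
    print(f"GTX at +{gain:.0%}: ${gtx_price / (1 + gain):.2f} per Ultra-equivalent")
```

At a 20% gain the GTX works out to roughly the Ultra's own price per unit of performance; at 30% it is actually the cheaper card per frame.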
 
Skrying said:
Oh yes, with my several generations old X800XL and 6800nu. :rolleyes:

Where do you get the 20% price difference? I can grab a 6800U for only around $350; guess what, the 7800GTX will cost me about $200 more. Yeah buddy, that's more than 20%.

$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.
Wow, talk about some narrow-minded thinking. I can give you one great example of why I think the card is a "bargain:"

I am moving to a Socket 939 platform with a SLI motherboard. I was using a 6800GT AGP card and I want to move to that level of performance at the minimum. I do want to go SLI in the (near) future, so I have a few options that I looked at:

(Assumes that prices won't drop through the floor in the next month or so, but they might drop some with this new launch)
* 1 x 6800GT now, 1 x 6800GT in the (near) future, cost: ~$600+ for SLI
* 1 x 6800 Ultra now, 1 x 6800 Ultra in the (near) future, cost: ~$700+ for SLI (based on your mention of $350 Ultras)
* 1 x 7800GTX now, 1 x 7800GTX in the (near) future, cost: ~$599 single, $1200 SLI

Buying the single 7800GTX now will get me the performance near/above dual 6800 Ultras in SLI and cost me ~$100 less, PLUS allow me the option to slap another in at a later time. The 7800GTX would be the same price roughly of the dual 6800GTs in SLI and beat the pants off of it. Even with the prices of 6800GT/Ultra cards hopefully going down in the very near future, I think the 7800GTX is the ideal card for me to get in the next month or so. I'm looking into the future more than the present with this new system.
 
Skrying said:
Oh yes, with my several generations old X800XL and 6800nu. :rolleyes:

Where do you get the 20% price difference? I can grab a 6800U for only around $350; guess what, the 7800GTX will cost me about $200 more. Yeah buddy, that's more than 20%.

$200 does not equal a 30% performance increase to me. If you see it differently, then good, but the way you argue this is completely pointless. You cannot consider $600 for this card a bargain no matter how you cut it.

First of all, where the hell can you "grab a 6800U for $350"? Not an AGP version, either.

I'm comparing MSRP to MSRP because when you take into account all your discounts or insider deals or whatever deals you may be privy to, the playing field is no longer level and the comparison becomes invalid.

Second of all, why did you pull a 30% performance increase out of your ass all of a sudden? That is nowhere near the truth. Moreover, the fact is that at this point we can't even see what this card is capable of, because it is so severely bottlenecked in most systems.
 
Devistater said:
Yeah, much better lol. As long as you guys know that those resolutions mentioned in the review are probably the closest equivalent and not the exact same thing, I'm happy :)

It was corrected earlier today, then when the database took a dump, we lost some of the edits we had made. :( Sorry.
 
You guys can stop the bickering. I suggest you keep this on topic in an adult manner. This is your warning.
 
Brent_Justice said:
We were measuring the pixel fillrate like we have done in the past. The 7800 GTX can shade 24 pixels per clock, so the pixel fillrate is correct: MHz x pixels-per-clock. It's just an easy way to compare a card's ability to fill pixels.

I can see you trying to make it simpler to understand but maybe it would be better to educate people on the differences between the shader core and the ROPs instead of dumbing it down for them. And your pixels-per-clock there is 16, not 24. No matter what the shader core does, the card can't output more than 16 pixels per cycle. All the extra shader power is doing is helping you get closer to peak ROP output which is your true fillrate.

Maybe it's easier for people to understand it your way, I just think that with the move to unified cores coming up, people are going to have to understand what "fillrate" really means eventually.
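The gap between the two definitions is easy to put in numbers; a sketch, assuming the 430 MHz reference core clock for the 7800 GTX (the exact clock isn't quoted in this thread):

```python
# Two ways to quote "fillrate" for the 7800 GTX (core clock assumed 430 MHz).
core_clock_mhz = 430

pixel_pipes = 24  # shader pipelines: pixels the shader core can work on per clock
rops = 16         # raster operators: pixels actually written out per clock

# MHz x pixels-per-clock, converted to gigapixels/second.
shader_rate = core_clock_mhz * pixel_pipes / 1000  # the review's method
rop_rate = core_clock_mhz * rops / 1000            # ROP-limited peak output

print(f"Shader-based figure: {shader_rate:.2f} GP/s")
print(f"ROP-limited figure:  {rop_rate:.2f} GP/s")
```

The shader-based number is 50% higher, which is exactly the disagreement here: the extra shading power helps keep the 16 ROPs fed, but the card can never write more than the ROP-limited figure to the framebuffer.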
 
I won an AMD Athlon 64 4000+ (Clawhammer baby!) at GeForce LAN 2.0. After I sell my current OC'd 3500+ on eghay, I'm taking the proceeds and buying one. Playing on one of their setups at the party, the 7800GTX is smooth. I have a 6800 Ultra right now and play at 1280x1024 (my LCD demands it), and even with my settings at lower AA/AF levels I couldn't get close to a 7800GTX.

Nvidia has been getting it right the past year. At the first GeForce LAN, I think close to 60% of the LAN'rs/gamerz had ATI Radeon cards (mostly 9800s) - including myself.

This year, it's turned around; every self-respecting gamer had a 6800 video card. Over 70% had Nvidia cards (judging from walking around the BYOC area).

The 7800GTX is one heckofa card. In a year or so, when we are all playing UT2K7, it'll be standard issue for everyone.
 
trinibwoy said:
I can see you trying to make it simpler to understand but maybe it would be better to educate people on the differences between the shader core and the ROPs instead of dumbing it down for them. And your pixels-per-clock there is 16, not 24. No matter what the shader core does, the card can't output more than 16 pixels per cycle. All the extra shader power is doing is helping you get closer to peak ROP output which is your true fillrate.

Maybe it's easier for people to understand it your way, I just think that with the move to unified cores coming up, people are going to have to understand what "fillrate" really means eventually.

We are very comfortable with our approach to the subject matter. If readers such as yourself are inclined toward more in-depth information, there are certainly other resources for it. I think "dumbing it down" to 20 pages was sufficient for most of us who care more about the gameplay the card can provide than the mechanics of the silicon and software.
 
R1ckCa1n said:
From what I see, it is what the NV47 should have been: tweak the core and add some fixes. I too thought this was a "TRUE" 24-pipe card, when in all reality it is a tweaked 16-pipe card. Instead NV called it

btw: mine comes on Friday! Hello BF2 @ 1600 x 1200 4xAA 16xAF :D

It is the NV47. I'm betting the G70 name was just a sneaky marketing gimmick, you know, to get everyone like "oh noes, new code name must = kickass 1337 card from hell!!!!!!" Either way, it worked, lol.

Brent_Justice said:
I pointed out very clearly in the review that it has 24 pixel pipelines and 16 ROPs.

Does this mean a card with 24 pixel pipelines and 24 ROPs would perform better? From my understanding (which is limited in this super-techie stuff :p), aren't the 24 pixel pipelines really just there to maximize the 16 ROPs? So a 24-ROP card would still have times where it could hit a theoretical higher performance, everything else being equal?
 
No dumbing down, I want 108 pages, and I want 320*200 banners on all of them. And I want biased vitriol spewed from the mouths of blatant bois of the fan that are censored. Hang on I want... mumblemumblemumble, damn the libel police got me


Seriously though, I consider myself fairly geeky on the scale of things and have been in the scene since my little 300A whispered it wanted to go faster. But I, and 98.5% of the rest of the peeps, couldn't care less how they arrive at the gigapixel figure; it's sometimes interesting but mostly academic at best. Performance is all that matters. The purchasing decision of most people will be:
  1. Do I get more FPS in x game?
  2. OK, I'll take that one.

If you want exact breakdowns of every pipelined action that gets a pixel on your screen, other reviews will do it, or there will be a separate editorial if the technology is new and spangly. If you want an informed decision on what is best for playing your games, that's what you'll usually get here. And we thankfully get it free of most of the irrelevant padding, copied from a tech briefing, that some reviewers feel the need to include to pad their 'review' out.
 
Skrying said:
It is the NV47. I'm betting the G70 name was just a sneaky marketing gimmick, you know, to get everyone like "oh noes, new code name must = kickass 1337 card from hell!!!!!!" Either way, it worked, lol.



Does this mean a card with 24 pixel pipelines and 24 ROPs would perform better? From my understanding (which is limited in this super-techie stuff :p), aren't the 24 pixel pipelines really just there to maximize the 16 ROPs? So a 24-ROP card would still have times where it could hit a theoretical higher performance, everything else being equal?

Well, with fewer ROPs than pixel pipes, it will definitely always keep the ROPs busy, with lots of fragments behind them waiting to be rasterized.

Similarly, the 6600 GT has 4 ROPs and 8 pixel pipes, and the 4 ROPs do not hold it back.

Only in very rare cases, where a single texture or shader is being applied to a fragment, would the 7800 GTX deliver performance similar to NV40. But that just isn't the case with games; they all employ multitexturing.

I think the real-world gaming results in this review speak for themselves ;)
 
I hate to say it Brent but you, and [H]ardOCP, are the main reason I just dropped six bones on a video card! ;)
 
Iratus said:
If you want exact breakdowns of every pipelined action that gets a pixel on your screen, other reviews will do it, or there will be a separate editorial if the technology is new and spangly. If you want an informed decision on what is best for playing your games, that's what you'll usually get here. And we thankfully get it free of most of the irrelevant padding, copied from a tech briefing, that some reviewers feel the need to include to pad their 'review' out.

Yeah, maybe it's the wrong forum/site for that kind of thing. My point wasn't that they should go into minute detail - just that they shouldn't misrepresent what fillrate really means, which only adds to the confusion. Oh well, when in Rome... :)
 
Just a reader's 2-bit critique here, Mr. Bennett, but I would love to see 2 graphs per game: one showing the new "playable settings" approach, and one in the old, more common style of comparing cards at the same resolution/AA/aniso settings.

I totally see the need for reviews that really give you a good idea of how the card will "feel" in actual gameplay. But I also see the need for the antiquated "pissing contest" style, just so I can quickly read the article and keep some perspective on how much faster a card may be than the last generation(s).

Obviously, I could go find this data on some other web site. Honestly, I've always been a reader of the [H], and I'd just as soon read here before most other places. My hang-up is that I personally would not drop cash on a piece of hardware having read "playable settings" data alone.
 