GeForce GTX 780 Ti vs. Radeon R9 290X 4K Gaming @ [H]

Actually, users on OCN are reporting a lack of voltage control, and the thermal limitations of the reference cooler severely limit overclocks. I guess a custom BIOS could address any voltage limitations, but I haven't seen mention of one.

I don't know. Maybe I'm wrong - hopefully [H] has a max-overclock review in the works, a maxed-out 780 vs. a maxed-out 290X. So far the comparisons I've seen seem to indicate that the GK110 has far more headroom due to being the more efficient chip, but I could be wrong. Shrug.

I just really wish that AMD had matched NVIDIA with a more robust and efficient reference design. The 290X would be so much more desirable had they done so. Like I said, great chip, hobbled by a so-so reference design (unless you can put up with the added noise of Uber Mode).
 
No one said it is fucking quiet; people are saying that it isn't a fucking jet engine like you keep trying to say it is. Every fucking thread where the 290X is mentioned you go on your inane fucking rant about how the 290X either supposedly throttles or bursts eardrums. Give it a fucking break already.

Uhm, I have a right to my opinion just like anyone else. You just need to calm down and stop throwing childish tantrums about it. Just so you know, I've probably spent more on AMD GPUs over the past three years than you have. I've liked several AMD GPUs, but I don't like the issues introduced with the 290 series in terms of Quiet Mode throttling. Not a fan of that at all, and AMD could have prevented it with a more robust reference design. One thing is clear: the 290X performs exceptionally well at high resolutions, if you can put up with those design issues. Me? I don't know. I don't like it.

That's my opinion and I can express it freely. You're also free to disagree without going on an angry rampage. You can state that you love the 290X and I won't get angry about it; that's your opinion, have at it.
 
It's funny that some games maintain a 30 FPS average; that's the target of a lot of next-gen console titles, but at 1080p or lower. I'm still impressed, because this is 4K.
 
Getting there, much to cover :)

I also have to say, I have a personal interest in ShadowPlay myself. However, I have found that ShadowPlay under Windows 7 is limited to 4GB video files, about 15 minutes of recorded gameplay footage. You have to run Win8 for unlimited file sizes and recording time. I'm on Win7 on my main gaming machine, so to make ShadowPlay work for me as well as Fraps does I'd have to upgrade to Win8, something I'm not too happy about. I may do it though, because I want to experiment with ShadowPlay as a replacement for Fraps for recording game videos. I like the fact that I can run at my native 2560x1600 and it will scale the video to 1080p no matter what; with Fraps I have to run games at 1080p if I want to record and output at 1080p. I also like that ShadowPlay can use the GPU to transcode the video on the fly, and the format is perfect, saving my poor CPU what would otherwise take an hour of transcoding.
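
Incidentally, the 4GB/15-minute figure implies an average bitrate you can sanity-check yourself. A back-of-the-envelope sketch using only the numbers quoted above (I'm not asserting this is ShadowPlay's actual configured bitrate):

```python
# Sanity check of the figures above: a 4 GB file cap reached in
# roughly 15 minutes of recording implies this average bitrate.
file_cap_gb = 4    # the Windows 7 file-size limit mentioned above
minutes = 15       # approximate recording time reported

bits = file_cap_gb * 1024**3 * 8
seconds = minutes * 60
print(f"Implied average bitrate: {bits / seconds / 1e6:.1f} Mbps")  # ~38.2 Mbps
```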

I actually prefer Win8/8.1 to Win7, believe it or not...;) It doesn't seem to be well known, but you never have to look at the RT UI unless you want to--ugh...;) I never have to see it; otherwise I might not feel so charitable about W8. (For all of its millions of non-touchscreen customers, Microsoft really needs to publicize that use of the Win8 RT UI is purely optional.)

For me at home, it's running leaner and faster than Win7--but that's on a purely subjective level. I still have Win7 x64 on another partition--which I'll be deleting in a few days, as I simply no longer need it. Win8 has some nice features, like ISO auto-mounting (way better than Daemon Tools, which is a real kludge comparatively) and Storage Spaces--two capabilities I wouldn't have given a nickel for before I started using them, and that now I can't imagine being without. Once you get past the Win8 RT UI (and realize it's merely optional), Win8 has quite a lot under the hood to recommend it. Backwards compatibility with games is the equal of Win7's, if not actually a tad better. (8.1 not only has a Win7 compatibility mode, it also has a Win8 compat mode as well...lol...which I've had no use for at all.)

The only thing Win8 lacks in the Explorer.exe GUI department compared with Win7 is the Start Menu. I use Classic Shell 4.02--by far the best version yet--and I have to remind myself that it doesn't ship with Win8, else I'll forget! I suppose I had some extra motivation for Win8/8.1, too--I only paid $39.99 for the whole thing back in January...;) Anyway, just thought I'd pass on my own experience with 8 so far...
 
Getting there, much to cover :)

I also have to say, I have a personal interest in ShadowPlay myself. However, I have found that ShadowPlay under Windows 7 is limited to 4GB video files, about 15 minutes of recorded gameplay footage. You have to run Win8 for unlimited file sizes and recording time. I'm on Win7 on my main gaming machine, so to make ShadowPlay work for me as well as Fraps does I'd have to upgrade to Win8, something I'm not too happy about. I may do it though, because I want to experiment with ShadowPlay as a replacement for Fraps for recording game videos. I like the fact that I can run at my native 2560x1600 and it will scale the video to 1080p no matter what; with Fraps I have to run games at 1080p if I want to record and output at 1080p. I also like that ShadowPlay can use the GPU to transcode the video on the fly, and the format is perfect, saving my poor CPU what would otherwise take an hour of transcoding.
Please get some of the best cards out there, overclock the living shiznits out of them, and report the results. I really want to see a head-to-head of the following overclocked cards (preferably at 1080p as well, to see if we can hit 120/144 FPS with vsync).

780 Ti OC
R9 290X OC
R9 290 OC
780 OC
Titan OC

Then repeat with SLI and CFX configs. I would kill to see that article here at [H].

This will make my buying decision much easier :D.
 
If a quieter card is worth the $150 difference to a user, that $150 would probably cover the cost of watercooling the R9 290X or R9 290. Looking forward to seeing a watercooled R9 290 compete, to see whether there are additional gains beyond a much better sound profile.
 
Great work guys, though I must say that your 'apples to apples' comparisons with 4xMSAA were really not useful.

I'm most interested in your report of having no problems with video mode selection on the GeForce cards, as a quick gander at the NVIDIA forums indicates people are still having problems. I wonder if this is specific to the 780 Ti versus older models?

Now get cracking with the SLI / Crossfire comparison, or it will be the Comfy Chair for you! :D

I'm predicting a very spendy new year: a 4K display, 2x video cards, a new smaller Nofan cooler, and a semi-fanless PSU (I'm looking at the ZM 1250, as it's silent below 500W, but I can only find one review).

Which way I go will depend as much on noise as performance, so I await OEM coolers for both sides with interest.
 
Please get some of the best cards out there, overclock the living shiznits out of them, and report the results. I really want to see a head-to-head of the following overclocked cards (preferably at 1080p as well, to see if we can hit 120/144 FPS with vsync).

780 Ti OC
R9 290X OC
R9 290 OC
780 OC
Titan OC

Then repeat with SLI and CFX configs. I would kill to see that article here at [H].

This will make my buying decision much easier :D.

Agree with that, since that's what really matters to most people buying high-end cards. I do think all this hand-wringing over the noise is stupid, though; in about a month we should start seeing custom coolers, and we'll be able to judge the noise/performance profile once someone spends more than 50 cents on a heatsink.
 
It is ridiculous, especially since I have no problem setting both of my 7950s to 70% fan just so I can eke every last MHz out of the cards.
 
If a quieter card is worth the $150 difference to a user, that $150 would probably cover the cost of watercooling the R9 290X or R9 290. Looking forward to seeing a watercooled R9 290 compete, to see whether there are additional gains beyond a much better sound profile.

The EK full-cover blocks run $105-115 depending on nickel plating and other bling options; EK tends to cost slightly more for FC blocks. It's safe to say you could put a full-cover block on a 290 and still come in $100 less than the 780 Ti, and still vastly less than Titan.

The real question is whether the games you want to play are more favorable to NVIDIA, whether you are locked into their feature ecosystem for things like PhysX and (in the future) G-Sync, or whether you want a cheap GPU compute/Tesla card, which makes Titan the only choice. On the AMD side it depends on Mantle, or whether you need that 4GB of VRAM. Because even if you factor a water block into the equation, AMD is still cheaper.
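
To put rough numbers on the cost argument above, here's a quick sketch. The prices are assumptions based on approximate late-2013 launch MSRPs, so treat the exact deltas loosely:

```python
# Illustrative price comparison; all figures are assumed approximate
# late-2013 launch prices, not quotes.
r9_290     = 400   # R9 290 (approx. MSRP)
ek_block   = 110   # EK full-cover block, $105-115 per the post above
gtx_780_ti = 700   # GTX 780 Ti (approx. MSRP)
titan      = 1000  # GTX Titan (approx. MSRP)

watercooled_290 = r9_290 + ek_block
print(f"R9 290 + full-cover block: ${watercooled_290}")        # $510
print(f"Savings vs. 780 Ti: ${gtx_780_ti - watercooled_290}")  # $190
print(f"Savings vs. Titan:  ${titan - watercooled_290}")       # $490
```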
 
Well, NVIDIA's support of 4K MST monitors has been flaky. In some games, SLI is completely broken running at 4K with MST. You won't see this in huge-budget games like Crysis and Far Cry, but other games, such as FFXIV, that *need* that second card to maintain 60fps are where it falls flat on its face.

I'd be curious to know if the 780 Ti has the same BIOS issues as the previous cards, where the PC would refuse to POST on a cold boot with NVIDIA cards. Not to mention not having any display before Windows is loaded. Of course this is fixed in a firmware update for the Asus monitor, but it's not readily available on their site; you need to open a ticket with their support to get it.

AMD cards have none of these issues, and they don't rely on hacking Eyefinity to support two-monitor gaming. I am still running two GTX 780s, but if the OC potential is there, and CrossFire scaling is good/better @ 4K, I might make the switch. Still holding out for non-reference AMD cards, or maybe GHz editions that I can watercool.
 
Fixed. Writing about the value of these cards is pointless when using such overpriced displays to benchmark.

Some people can afford them. And they'll be cheaper in the future.
 
Some people can afford them. And they'll be cheaper in the future.

Hopefully sooner rather than later. $3K+ for a display is still at the insane level.

I'm still on 1080p until 1440p displays are affordable. $600+ for a good 1440p is nuts, and I refuse to buy those cheap B-grade panels with pixel defects.

Where is my 4K, <2ms, 144Hz display? (My god the GPU power required for this)
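
A display like that would be as much a cabling problem as a GPU problem. A rough lower-bound calculation at 24-bit color, ignoring blanking and encoding overhead:

```python
# Raw pixel bandwidth for a hypothetical 4K 144 Hz panel (24-bit color).
# This ignores blanking/encoding overhead, so it's a lower bound.
width, height, refresh, bits_per_pixel = 3840, 2160, 144, 24
gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"~{gbps:.1f} Gbps raw")  # ~28.7 Gbps; DP 1.2 tops out around 17.3 Gbps
```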
 
Love this review!

Now I'm sure we all want to see ...

2xR9 290X vs 2xGTX 780 Ti
3xR9 290X vs 3xGTX 780 Ti

:D

I would *love* to see 2xR9 290 and 3xR9 290 as it seems to bring even more value!
 
Fixed. Writing about the value of these cards is pointless when using such overpriced displays to benchmark. Save yourself some time.

I would have saved even more time by not reading and responding to your post. Just to put it into perspective.
 
I just wonder how much better the R9 290 series will become with custom cooling solutions.
 
I find it funny when people argue that the 290X was pushed to its hardware "limits" when it was clearly designed to throttle originally. Even when throttling, the 290 performed and competed well with its intended matchup (GTX 770).

PowerTune is clearly designed differently than Boost 2.0, yet some people on here don't want to see that; they repeat the same thing over and over and just expect AMD to follow.

If NVIDIA hadn't come out and tried to upstage AMD, we might not have these updated fan speeds. So thank you, NVIDIA, for making AMD better!

Any CrossFire/SLI 4K benchies incoming, Kyle?
 
It's hard to justify the sound profile as being worth a $150 price premium.

I don't think so, personally. It's hard to game when you can hardly hear the game, or have to turn the speakers up too loud. I know this from having a GTX 470 SLI setup a little while back. I was pleased with the noise reduction when I bought my GTX 680.

In addition, I have found that AMD/ATI do not have the quality drivers I get from Nvidia.

Less trouble, less noise. That justifies the extra $150 in my opinion.

Although, neither has the performance it would take to get me to move from my SLI GTX 680 setup. Looks like it will be at least one more generation before I change.
 
Uhm, I have a right to my opinion just like anyone else. You just need to calm down and stop throwing childish tantrums about it. Just so you know, I've probably spent more on AMD GPUs over the past three years than you have. I've liked several AMD GPUs, but I don't like the issues introduced with the 290 series in terms of Quiet Mode throttling. Not a fan of that at all, and AMD could have prevented it with a more robust reference design. One thing is clear: the 290X performs exceptionally well at high resolutions, if you can put up with those design issues. Me? I don't know. I don't like it.

That's my opinion and I can express it freely. You're also free to disagree without going on an angry rampage. You can state that you love the 290X and I won't get angry about it; that's your opinion, have at it.

Sure, you have a right to an opinion, but you stated the same damn thing two posts in a row on the SAME PAGE of the SAME THREAD. Shut up about it; some people don't give a flying shit, but clearly you do. So don't buy the goddamn card and move on.

(on topic)

I want a 290X now.
 
Nice writeup! Good to know there is still parity between the two brands. It will keep competition flowing and prices more reasonable.

4K is a ways off for me still, though. First, the price is just a bit too much for me right now. Second, I'm just not comfortable with the 16:9 aspect ratio on a monitor. 16:10 works MUCH better for desktop-type work. It seems like a silly small difference, but it is huge and noticeable to me.

When we start seeing decent 16:10 4K monitors in the $1000-$1500 price range, I will start considering them a little more seriously.
 
On that note of "can't use highest in-game settings"... have you found AA is still even beneficial at 4K on a sub-40" screen?

I'd love to see another angle on this: 1080p gaming with SSAA. I'd be interested to see if 4x the pixels, scaled down, equals the performance of 4x the pixels as-is (and whether it takes up as much VRAM). I think many more people would find this useful: I'm sure most high-end card buyers are still playing at 1080p, whether on a 24" monitor, a 50" TV, or a 100" projection screen. 1080p still seems to rule the roost, and probably will for quite some time... so now, with ever-growing performance reserves, SSAA becomes especially relevant again. (I've been playing some old games with SSAA lately, forced by RadeonPro, and the detail is amazing... I can see the wires on a tower kilometers away in BF2 with nary a flickering pixel.)
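
For what it's worth, the "4x the pixels" framing is exact, which is what makes the comparison interesting. A quick sanity check:

```python
# 4x SSAA at 1080p renders exactly as many samples as native 4K.
w1080, h1080 = 1920, 1080
w4k, h4k = 3840, 2160

ssaa_samples = w1080 * h1080 * 4   # 4x supersampling at 1080p
native_4k    = w4k * h4k
print(ssaa_samples, native_4k, ssaa_samples == native_4k)  # 8294400 8294400 True
```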
 
On that note of "can't use highest in-game settings"... have you found AA is still even beneficial at 4K on a sub-40" screen?

I'd love to see another angle on this: 1080p gaming with SSAA. I'd be interested to see if 4x the pixels, scaled down, equals the performance of 4x the pixels as-is (and whether it takes up as much VRAM). I think many more people would find this useful: I'm sure most high-end card buyers are still playing at 1080p, whether on a 24" monitor, a 50" TV, or a 100" projection screen. 1080p still seems to rule the roost, and probably will for quite some time... so now, with ever-growing performance reserves, SSAA becomes especially relevant again. (I've been playing some old games with SSAA lately, forced by RadeonPro, and the detail is amazing... I can see the wires on a tower kilometers away in BF2 with nary a flickering pixel.)

Until the things that matter to most consumers require more than 1080p, the de facto high res will still be 1080p. Which you can read as "until someone is selling movies at over 1080p and cable/streaming can be done at over 1080p, 1080p will remain standard".

I had CRT monitors that handled resolutions that were never really supported outside of CAD/CAM and engineering use for that entire era. They were always at an insane premium because there was no real use for them for most people. Games never really supported them either, which made logical sense. Most people won't buy a 4K monitor just for games; it's for bragging rights, or your work requires it and you happen to play games on a work computer.

Cheap 4K won't come until 4K movies are common, 4K streaming is common, and enough 1080p screens have burned out that they can convince the masses it's time to buy new TVs all over again. That's what it took to really drive the current gen of HD.

I don't think so, personally. It's hard to game when you can hardly hear the game, or have to turn the speakers up too loud. I know this from having a GTX 470 SLI setup a little while back. I was pleased with the noise reduction when I bought my GTX 680.

In addition, I have found that AMD/ATI do not have the quality drivers I get from Nvidia.

Less trouble, less noise. That justifies the extra $150 in my opinion.

AMD's drivers have been fine for a while now. Furthermore, that extra $150 covers a full-cover block with $50 to spare. I ran dual 470s and they are still kicking in a PC. They weren't that good at stock, but under water with some volts the temps were easy to contain. People taking them from the stock 607MHz up to 900MHz wasn't unheard of.

A lot of things are completely different animals under water.
 
That's very useful info. However, I'll note that many people have their PCs on their desks (why I do not know, but...) and others - including myself - have their PCs beside their desks. And the desk acts as a considerable noise baffle.

I used it extensively on an open test bench as well, two feet from my head, and while I will say they were "loud" there, they were still actually bearable. Not that I would WANT to do it that way. And yes, cases and desks very much act to put sound dampening between you and your hardware.
 
Some people can afford them. And they'll be cheaper in the future.

Affordable 4K monitors are likely at least two years away. I admit I cannot afford a 32" 4K monitor; however, even if I could, I would not buy one anyway, since the $330 Qnix has better sRGB color space coverage and color saturation, equally good preset color accuracy, better black levels, less lag, and can be overclocked, and the two Qnix models I've tested are PWM-free/have super-high dimming frequencies versus the Asus's insultingly low LED PWM dimming frequency. There are no in-depth Sharp 4K reviews.

The res argument is moot, since one can purchase multiple 1440p monitors, avoid the scaling issues (my Crossover 2755AMG supports 4K over HDMI @ 30Hz), and get far more real estate as well as less glow. These 4K monitors have absolutely zero value versus the 27" 1440p monitors aside from being bigger, and they also suffer from more pronounced glow. I cannot fathom 4K monitor owners giving a second thought to the value of current GPUs unless they have paid no attention to the display market in the last two years.
 
I have a question:
When the 780s were released, aftermarket models with great cooling systems came out almost simultaneously alongside the reference models.
In the case of the 290X, we are still waiting for decent aftermarket cooling systems. The only aftermarket models released so far are from Gigabyte, Sapphire, and XFX, and on all of them the cooling system is identical to the reference (or at least they look identical).
So the question is: what is the reason for this delay?

Supply. AMD built their first batch of chips into complete cards and sent them to resale partners. Not until they ship CHIPS, or chips on boards, will we see third-party coolers.
 
Good review, guys. But there's a lot of stuff nagging me about doing the whole review at 4K. On a competitive level you'll never see anyone run 4K; there's no reason for it. On a consumer level, 4K hasn't been widely adopted yet and won't be for some time. That said, tech that is easily within the grasp of consumers right now includes CrossFire, SLI, and multi-monitor gaming, but none of that gets touched when comparing the 290X and 780 Ti. Why not?

If you really want to touch on the subject of how loud the 290X is under load, maybe we should also include how NVIDIA's new G-Sync tech is going to make your gaming experience much better, assuming of course you have a monitor with the capability. Even if you could excuse the 290X for how much louder it is, I'd say the GTX 780 Ti's $150 premium is more than worth it if you want to nitpick.
 
Correct me if I am wrong, but isn't "Uber Mode" basically an overclocking switch?

If so, why not compare it to an overclocked 780 Ti? I mean, with the EVGA Precision utility, overclocking the Nvidia card is done with a mouse click.

You are wrong. Uber Mode increases the fan speed to 55%, from 40% in Quiet Mode. This stabilizes the clock speed at the 1GHz specified core clock cap for the 290X. That 1GHz clock speed is an "up to" cap: if the GPU gets too hot, it throttles below 1GHz, and keeping it cooler stabilizes the clock closer to the intended 1GHz. It is not overclocking. It is simply allowing the 290X to perform at its specified potential.
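
For anyone still finding the "up to" wording confusing, here's a toy model of the behavior described above. This is purely illustrative, not AMD's actual PowerTune algorithm; the 95C thermal target is AMD's published figure for the 290X, but the back-off step is made up:

```python
# Toy model of "up to 1 GHz": the clock holds the cap until the thermal
# target is exceeded, then backs off. More fan speed (Uber Mode) keeps the
# GPU under the target more of the time, so the clock stays near the cap.
CLOCK_CAP_MHZ = 1000
TEMP_TARGET_C = 95    # 290X thermal target per AMD

def sustained_clock(temp_c, step_mhz=50):
    """Sustained core clock for a given GPU temperature (illustrative only)."""
    if temp_c <= TEMP_TARGET_C:
        return CLOCK_CAP_MHZ
    overshoot = temp_c - TEMP_TARGET_C
    return max(CLOCK_CAP_MHZ - overshoot * step_mhz, 300)

print(sustained_clock(90))  # 1000 MHz: fan keeping temps under the target
print(sustained_clock(97))  # 900 MHz: throttling below the cap
```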
 
Great work guys, though I must say that your 'apples to apples' comparisons with 4xMSAA were really not useful.

The idea there was to push the VRAM and see if there was a difference or bottleneck between 3GB and 4GB at 4K, a hot topic people want to know about.
 
Good review, guys. But there's a lot of stuff nagging me about doing the whole review at 4K. On a competitive level you'll never see anyone run 4K; there's no reason for it. On a consumer level, 4K hasn't been widely adopted yet and won't be for some time. That said, tech that is easily within the grasp of consumers right now includes CrossFire, SLI, and multi-monitor gaming, but none of that gets touched when comparing the 290X and 780 Ti. Why not?

If you really want to touch on the subject of how loud the 290X is under load, maybe we should also include how NVIDIA's new G-Sync tech is going to make your gaming experience much better, assuming of course you have a monitor with the capability. Even if you could excuse the 290X for how much louder it is, I'd say the GTX 780 Ti's $150 premium is more than worth it if you want to nitpick.
If you are going to say that, then why not say the 290X should cost more since it will have Mantle? G-Sync is not even out yet, and the only plans right now are for it to be included in the Asus 144Hz screen. Many people here don't want to fool with a 1080p TN panel, and I don't blame them. So until G-Sync is available in high-quality, higher-resolution displays, it means very little. Plus, most people don't even realize that G-Sync will not work with every single game.

Bottom line: most people agree the 780 Ti needs a price cut. $600 would be fair at this time, as you do get that nice game bundle. Once that's gone, it's not realistic to say its better cooler is worth $150 more, especially since AMD will have non-reference coolers out around that time too.
 
If you are going to say that, then why not say the 290X should cost more since it will have Mantle? G-Sync is not even out yet, and the only plans right now are for it to be included in the Asus 144Hz screen. Many people here don't want to fool with a 1080p TN panel, and I don't blame them. So until G-Sync is available in high-quality, higher-resolution displays, it means very little. Plus, most people don't even realize that G-Sync will not work with every single game.

Bottom line: most people agree the 780 Ti needs a price cut. $600 would be fair at this time, as you do get that nice game bundle. Once that's gone, it's not realistic to say its better cooler is worth $150 more, especially since AMD will have non-reference coolers out around that time too.

The majority of people have TN monitors, 1080p or otherwise... G-Sync will be adopted fast. I'm sure AMD will have their own flavor soon enough, but that's speculation. G-Sync does work with EVERY game because it doesn't use the game to do its magic; it uses the GPU.

Again, the 780 Ti is $150 more for something that's quieter, cooler, overclocks better, and runs at a slightly lower TDP. As someone who enjoys overclocking and benchmarks, it's totally worth the price point.

And once you put those third-party coolers on the cards, not much will change.
 
...many people here don't want to fool with a 1080p TN panel, and I don't blame them. So until G-Sync is available in high-quality, higher-resolution displays, it means very little. Plus, most people don't even realize that G-Sync will not work with every single game.


No, we don't want to mess with TN panels, but remember that Nvidia announced a 'module' that could be added to existing monitors. None of us have any idea how that will work, but we can look forward to it in more than just the few monitors they announced.

And why do you think that G-Sync will not work with every single game?
 
The majority of people have TN monitors, 1080p or otherwise... G-Sync will be adopted fast. I'm sure AMD will have their own flavor soon enough, but that's speculation. G-Sync does work with EVERY game because it doesn't use the game to do its magic; it uses the GPU.

Again, the 780 Ti is $150 more for something that's quieter, cooler, overclocks better, and runs at a slightly lower TDP. As someone who enjoys overclocking and benchmarks, it's totally worth the price point.

And once you put those third-party coolers on the cards, not much will change.
The majority of gamers getting a 290X or 780 Ti most certainly do not want to be stuck with a TN 1080p panel. And NO, G-Sync does not work with every game, as Tom Petersen already mentioned.
 
No, we don't want to mess with TN panels, but remember that Nvidia announced a 'module' that could be added to existing monitors. None of us have any idea how that will work, but we can look forward to it in more than just the few monitors they announced.

And why do you think that G-Sync will not work with every single game?
Watch the video with Tom Petersen on PCPer. It will not work with every game, because some games were already made in a way that is not going to be compatible. You won't have to guess which games, as they will simply not be able to have a G-Sync profile. It would be nice if we knew which games right now, but he did not mention the exact ones.
 
I'll definitely move on to 4K once the price becomes more reasonable for a name brand, not Seiki. One big screen is always better than multiple small screens, IMO.
 
2 AMD R9 290s + $100 for the price of a single 780 Ti? Kind of a no-brainer. I'm switching back to AMD. To get a 60fps minimum in BF4 I have to run the game on low/medium settings on my GTX 670.

Nvidia should be ashamed of themselves for the prices they've been pushing the last six months. Top-of-the-line video cards should not cost more than $500. EVER. No excuses, no bullshit. If I need two of them to play games the way I want, then they should never EVER cost more than $500.

Hopefully two 290s with Mantle can play BF4 at 4K 60fps+. I'll buy a Westinghouse 4K with a DP 1.2 to HDMI 2.0 adapter the second they become available.
 
Watch the video with Tom Petersen on PCPer. It will not work with every game, because some games were already made in a way that is not going to be compatible. You won't have to guess which games, as they will simply not be able to have a G-Sync profile. It would be nice if we knew which games right now, but he did not mention the exact ones.

Can't watch videos here, but why on earth would it need profiles? And how is it possible to make a game that can't work with a technology that replaces V-Sync?
 
Nvidia should be ashamed of themselves for the prices they've been pushing the last six months. Top-of-the-line video cards should not cost more than $500. EVER. No excuses, no bullshit. If I need two of them to play games the way I want, then they should never EVER cost more than $500.

I hate high prices too. However, the fact that people bought Titans for SLI configurations to the tune of $2K means that the market will bear prices double what you or I think they should max out at. Consequently, prices will be what people are willing to pay. End of story. Apple's success at selling people outdated gear with phenomenal markups is evidence of that pricing strategy being appropriate and useful for profits. ;)
 
Apple's success at selling people outdated gear with phenomenal markups is evidence of that pricing strategy being appropriate and useful for profits. ;)

Oh nice, Apple being dragged into this discussion...lol. If you're going to go there, one could argue that NVIDIA brings the "premium" experience (a la Apple) and hence their pricing. Just saying. It has been an "eye opener" going back to AMD (I had CrossFire 6970s and 7970s in the past); it's a rough-and-tumble world. But it's fun. Not nearly as polished as NVIDIA.

Oh, and outdated hardware is hardly true. Apple brings cutting edge hardware at every new release. They're usually the first out of the gate with new Intel platforms, for example.
 
Oh nice, Apple being dragged into this discussion...lol. If you're going to go there, one could argue that NVIDIA brings the "premium" experience (a la Apple) and hence their pricing.

I agree completely. That was my point, exactly. "Premium" is a perception. People pay for it. I did when I bought my GTX780 and I even got it after the price drops. Damned thing was more expensive than it should be, but it was obviously less expensive than it needed to be for me not to buy it ;)
 