AMD Fury owners - Do you like your card?

FURY owners - Are you happy with your Fury card?


  • Total voters
    56

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
11,833
I see a lot of hate for AMD in the forums.

I was a dyed-in-the-wool Nvidia guy for 15 straight years until the last year or so. I joined the red team because I wanted to try a PLP monitor setup with a 20/30/20 and AMD was the only option for that.
I didn't know what to expect and had read lots of negative things about AMD cards over the years, but I've been really pleased with my AMD Fury X card - especially after I figured out that Global Power Saving was on, turned it off, and could stop using ClockBlocker for the random one-off game.

Just in the last couple weeks I've been playing with a couple of inexpensive FreeSync monitors, and frankly I'm not at all disappointed I bought this card. The Fury X is quiet, fast, and stable, and FreeSync monitors are far cheaper than G-Sync ones. The card runs just shy of GTX 1070 performance, which isn't bad considering it came out about a year before the $400 1070.

Everything I play still has tons of performance to spare at 1440p or 1600p with just a single card, and I've not run into a single game I can't max all settings on with the 4GB - despite all the hate you see against the lowly 4GB of HBM these cards shipped with. (I suppose you could count Doom as a game we can't run at max settings, since the Nightmare setting isn't an option at 4GB, but I'm playing tonight at Ultra settings in the 125 FPS range at 1440p and it just looks fantastic. Since the Nightmare option isn't available, I guess I don't know what I'm missing - and probably don't care, as it already looks great.)

I'm wondering how other Fury card owners feel after some time with these cards. Am I the odd man out with my enjoyment of this card, or the norm? Where does all the forum hate stem from? I can't say this card is any more frustrating than the Nvidia cards I've used over the years. Maybe a glitch or two here or there in an old title, nothing too serious.
 


Play ARMA 3 and get back with me....it kinda pisses me off with low FPS. I tried out the GTX 900 series and promptly returned it to the merchant. My two GTX 480s in SLI do just fine for me except on certain servers in ARMA 3. A $50 game is unplayable....that irritates me.
 
I don't think hate has anything to do with it.

I don't consider the Fury X a bad card, but I didn't consider buying one, for several reasons, some of which are subjective.

1. 4GB VRAM: yes, you mentioned that it isn't a limitation, but that is very game-dependent - you were locked out of Doom's Nightmare setting, for example, simply from the lack of VRAM. My distaste for it came at a time when the 970's VRAM debacle was still fresh (that whole 3.5GB/0.5GB thing), and [H]'s reviews pointed to VRAM size still mattering more than speed; the combination gave me the impression that the Fury X was relatively ill-equipped there. If people were making a fuss about the 3.5GB/0.5GB split affecting their gameplay, 4GB could not possibly be that far away (the same reason I just SMH'd at the time when people were returning their 970s for 980s).

2. Compulsory AIO: it's a good thing for a lot of people, but not for me. I don't have much time to mess around reinstalling fans in my case, nor was I willing to risk my entire system. My enthusiasm is for games (and I'm willing to buy the hardware necessary to run them), but I only have time for that one hobby - I don't have the time to make computer modding my second one - so I wasn't willing to go near the Fury X.

Again, I emphasise that, in hindsight, the Fury X wasn't, and still isn't, a bad card, but it's the little things that work against it, especially when compared to the 980 Ti.
As for the ARMA 3 complaint above: perhaps it's a game problem and not a problem with the GPU? And which 900-series card did you try?
 
I just bought a Fury X to go with my new 3440x1440 75 Hz FreeSync display, upgrading from my R9 290.

I'm honestly underwhelmed with its performance.

Not running any AA, Witcher 3 runs at 55 fps at Ultra. I can't max out GTA 5 due to VRAM limitations. I even got less than 30 fps in Metro at medium settings (???)

The GPU core won't overclock past 7% no matter what I do.

It's still the best card I could get for FreeSync, but I can clearly see that this card won't last me for years.

I don't have any issues with the card or the drivers, though; everything worked fine from day 1. But I should've waited for Vega and kept my R9 290. Not a worthwhile upgrade.
 
Well, I sold off the 980 Ti card I had and picked up a Sapphire R9 Fury Nitro+. :) For whatever reason, the AMD cards downscale properly on my Samsung 4K 28-inch monitor, where the Nvidia card looked like crap at every resolution except 4K. (For some reason, even at 4K, it looked dulled out compared to the AMD cards, even at the desktop.) Now, I have not really played many games lately but, so far, all the ones I have played run really well, even Batman: Arkham Knight. I am simply an AMD fan anyway and much prefer the Crimson Control Panel over the Nvidia one, which does not appear to have been updated in over a decade.

The only issue is, I cannot CrossFire because I only have an 850-watt Thermaltake M850W power supply. Most games do not show more than 400 watts of total usage with my FX 8300 at 4.5 GHz, but Crysis 3 boosted that to 580 watts or so, which tells me all my resources are being fully used in that game at 4K. Even an R9 Nano would most likely push me over the top of what my power supply can handle (rough numbers below).
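As a rough sanity check on why CrossFire is off the table, here's a back-of-the-envelope sketch. The 580 W figure is my own worst-case reading, and the 175 W Nano number is only my guess at its board power, not an official TDP:

# Rough PSU headroom check. All figures are estimates: 580 W is the
# worst-case system draw observed (Crysis 3 at 4K), and 175 W is a
# guess at an R9 Nano's board power, not an official TDP.
PSU_RATING_W    = 850   # Thermaltake M850W rating
MEASURED_PEAK_W = 580   # worst case observed on the current system
SECOND_CARD_W   = 175   # estimated extra draw from an added R9 Nano

projected = MEASURED_PEAK_W + SECOND_CARD_W
print(f"projected: {projected} W, headroom: {PSU_RATING_W - projected} W")
# ~755 W projected leaves under 100 W of margin - too tight for load
# spikes, and well past the 50-80% band where most PSUs run happiest.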
 
Hmmmm

I don't own Arma III, Witcher 3, GTA5, or the new Metro 2033 Redux.

But - What I have played works great.

I did a quick Google search for Fury benchmarks of all four of the games mentioned as troublesome, and none of the results seem surprisingly bad. Perhaps it's an unwanted power-saving setting defaulted in Crimson (this gets me every once in a while after a driver update), or some other system issue local to your machine, because those problems aren't reflected in the benchmark testing I found?

Arma III: [benchmark chart]

Witcher 3 at 1440p: [benchmark chart]

Metro at 4K: [benchmark chart]

Grand Theft Auto 5: [benchmark chart]
 
I just love polls and can't help myself.

When the 480 launched, I was a bit underwhelmed, and I tried to find a Fury X in my country as I thought it was the better buy in the long term. But for whatever reason it appears the Fury line is EOL in my part of the world.
 

In addition to the downsides listed above, the Fury X also had other issues:

It was the first card to ship without Dual-Link DVI, in an era when high-end screens using it had only just been displaced by DisplayPort models. You don't throw away your displays just because you upgrade your video card. The custom RX 480s worked around this brain-dead decision by AMD (every fucking one I can find has a DVI port), but AMD was obviously jumping the gun here.

It also didn't ship with HDMI 2.0, which meant expensive, clunky adapters for both needs. Not a good start for a card attempting to be "forward-looking" in its unconventional design.

The terrible overclocking headroom really put it to shame. Even in a game where the Fury X was far faster than a stock 980 Ti, an overclocked 980 Ti erased that difference entirely.
 
I was thinking of buying my parents a Vive for Christmas. They have a Fury X. After reading [H]'s VR reviews it's not even an option - there would be puke everywhere. So much for the "AMD does better over time" crap.

Too bad; they have the perfect room for it too. Maybe if the 1070 drops in price a bit... or, God willing, AMD gets their shit together, but they seem to be going in the opposite direction.
 
Yeah, mine doesn't overclock worth a hill of beans either. I tried a 75 MHz overclock (1125 MHz) today on a 3DMark run with a 33% power allowance; it crashed in Fire Strike. I put a 50 MHz overclock (1100 MHz) on it and it'll run Fire Strike, but it won't game for an hour straight in Battlefield 1. At stock clocks and voltages it NEVER crashes on anything - it just doesn't like the OC.

Since that lack of OC headroom seems to be pretty much universal to Fury X cards, it's really pretty amazing that AMD shipped them all with a 1050 MHz core and hasn't had complaints about failures at stock speeds. For 50 MHz, less than 5%, to be the difference between 100% long-term stability and crashing within an hour of playtime seems to be really edging the line on where they set the stock specs. I am a little peeved that AMD's intro to the card acted like the AIO water cooling on the Fury X could allow for amazing OC headroom, and then that was pretty much universally proved untrue. For shame. Oh well, OC has never been guaranteed on anything - so the complaint is more about the sly way AMD promoted the potential of the Fury X water cooling solution.

But I still really like the card. Just don't overclock it. It works great at stock: it runs cool (<55C at full load for hours), is silent, and exhausts heat outside the case. With the Vulkan drivers it's been given extra life too. I expect to keep this card until I move to a 4K OLED monitor, which is years down the road. I suspect it will work at 1440p for the next console generation of games, since the Fury is faster than what they're putting in the next Xbox and PS4 consoles - which means we should be pretty strong at 1440p for the next 3-5 years.

For reference, the next Xbox - Project Scorpio - will have ~6 TFLOPS of processing power. The Fury has ~8 TFLOPS (quick math below). So we've still got some headroom there.
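If you want to sanity-check TFLOP figures like these yourself, peak FP32 throughput is just shader count x clock x 2 (two operations per fused multiply-add, per shader, per clock). A minimal sketch - the clocks below are the stock reference values:

# Peak FP32 throughput = shaders * clock (Hz) * 2 ops (one FMA) per cycle.
def peak_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(peak_tflops(4096, 1050))  # Fury X at stock: ~8.6 TFLOPS
print(peak_tflops(3584, 1000))  # R9 Fury at stock: ~7.2 TFLOPS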
 
Nano here - great card, and the only real reason I bought it was the unique size; the performance/size ratio was off the charts. The 4GB limitation does show: in Rise of the Tomb Raider, High textures is the highest I can use before seeing some major stutter. Doom is another one. GTA V yet another. The RAM size limitations are real, but in most cases not catastrophic for high-IQ gameplay in the end. Still, there is a lot of potential in the Fiji chip with Vulkan and DX12 and its 4096 shaders. So besides the RAM size limitation, there is still much potential left to be exploited in the coming few years.
 
I am sorta happy. Happier than not, at least. Performance is there, but I notice a ton of tearing and judder in games - GTA V is one example. I could buy a FreeSync display to help mitigate it a bit, but I'm cash-strapped for wants right now.
 

On the Rise of the Tomb Raider stutter mentioned above: have you tried it with the new patch revision and drivers? I have no problems there, and it looks like you shouldn't either. I don't own GTA V to test.

 
Ultrawide 1440p performance can generally be estimated by splitting the difference between regular 1440p benchmarks and 4K (quick sketch below), as long as VRAM usage doesn't spike over 4GB in the transition. UW 1440p is quite a bit more demanding on the card than standard 1440p. Frankly, 55 fps on Ultra at 3440x1440 is pretty damn impressive for Witcher 3. My 390 @ 1170 can't push more than upper 30s at that resolution.
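To make that "split the difference" rule concrete, here's a minimal sketch that interpolates on pixel count. It assumes fps falls off roughly linearly with pixels pushed, which ignores CPU limits and any VRAM spill, so treat it as a ballpark only; the example fps numbers are made up:

# Estimate 3440x1440 fps by interpolating between 1440p and 4K results
# on pixel count. Crude: assumes fps scales linearly with pixels
# pushed, and ignores CPU limits and any VRAM spill past 4GB.
PIXELS_1440P = 2560 * 1440   # 3,686,400
PIXELS_UW    = 3440 * 1440   # 4,953,600
PIXELS_4K    = 3840 * 2160   # 8,294,400

def estimate_uw_fps(fps_1440p, fps_4k):
    t = (PIXELS_UW - PIXELS_1440P) / (PIXELS_4K - PIXELS_1440P)
    return fps_1440p + t * (fps_4k - fps_1440p)

# Made-up example: a card doing 70 fps at 1440p and 40 fps at 4K
print(round(estimate_uw_fps(70, 40), 1))  # ~61.8 fps at 3440x1440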
 
I'm very happy with my card. I play most games that I like at 4K resolution with max, or near-max, settings (excluding AA). I was worried that some of the more demanding AAA games would be too much, and to be fair, they are at 4K + max settings, but I am also surprised by something like Doom, which runs incredibly well both before and after the Vulkan patch. Given the time and use I've gotten out of it, it feels like it was a reasonable purchase. I also think it will continue to last until Vega without much issue. At that point, I'll see what the landscape looks like and upgrade from there. In the meantime, I've found that 1440p isn't too bad at all on a 4K monitor if you need the extra frames.
 
Yeah, Arma 3 has terrible fps online for some reason - I've even read GTX 1080 owners complaining. Because the engine sucks? Low fps makes the game not very enjoyable.
 