
AMD Radeon Fury X review thread

I doubt it. AMD has effectively become the "other" brand. In short, it's the alternative brand. I don't know why anyone dropping such serious coin on GPUs would entertain the idea of the Fury X.

I know people will damage control and say it isn't a bad card, and I agree. However, it does nothing to help AMD's current situation. They've developed such an awful reputation for piss-poor performance in day-one titles (Arkham Knight, AC: Unity, GTA V) that people go out of their way to avoid purchasing their cards.

This card was supposed to have a healthy performance lead over the competing Nvidia cards AND undercut them on price. Instead it achieves neither and positions itself as a boutique solution, effectively relegating it to niche-product status.

Godspeed AMD. Drop the price, put HDMI 2.0 in, and get your drivers together.

Yeah, I don't know if it's just effects missing until a patch, but Batman looks so much better with Nvidia effects on than off it's ridiculous.
 

Pagusas

Elden Member
You should not be using a 4K TV to play PC games on, as the input lag is huge. 4K monitors are a different thing.
Also, I think we should give AMD some time to sort its drivers for the new card. Unlike Nvidia, which already had a mature driver for the 980 Ti, AMD is having to deal with an all-new chipset, and that can take some time to optimise for.
All in all it's a very good card, and the lack of HDMI 2.0 isn't a big thing. DisplayPort is arguably better than it.

The hell are you talking about? Most of us are champing at the bit for 4K couch gaming; that's what a lot of these cards are being made for. There's a reason Steam made the couch mode.

It is a big deal to a lot of us, and you are way overstating input lag. We already have some decent 4K TVs that handle it well, and for many games, like The Witcher, it's nowhere near a big deal.
 
It isn't an angle, it is a way to... you know... look at the data presented by one of the best review sites out there.
[frametime graphs from ComputerBase]

Man, that 980 Ti frametime is gorgeous. The Fury is not bad, though.
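For anyone wondering what those frametime graphs actually measure: reviewers log per-frame render times and look at the spread, not just the average, since occasional long frames read as stutter even when the average FPS looks fine. A minimal sketch of that idea — the frame times below are made-up numbers for illustration, not data from any review:

```python
# Hypothetical per-frame render times in milliseconds (not real review data).
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 33.4, 16.7, 17.0, 16.6, 24.9, 16.8]

def percentile(samples, pct):
    """Nearest-rank percentile of a list of samples."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

avg = sum(frame_times_ms) / len(frame_times_ms)
p99 = percentile(frame_times_ms, 99)

print(f"average frametime: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
print(f"99th percentile:   {p99:.1f} ms")  # big gap vs the average = visible stutter
```

A flat frametime line (average and 99th percentile close together) is what makes the 980 Ti graphs look "gorgeous" here; a high 99th percentile is the spiky, inconsistent pattern the thread is criticizing.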
 

tarheel91

Member
Damn tough crowd in here. I'm gonna wait for voltage unlocked overclocking results before I decide if it's worth getting rid of my 295X2. Something doesn't add up with the way AMD has been bragging about overclocking given the results most reviewers have seen on stock voltage. If they could get up to 1250-1300MHz that'd be a solid bump.

Is this card not yet compatible with Mantle? I've noticed all tests used DX11 for BF4, and that seems silly given that most of the driver development was on the Mantle side.
 

Elman

Member
From the PC Perspective review:

Fiji's use of HBM is a totally new thing and the driver modifications needed to properly manage 4GB of memory on a wider, but slower, memory bus are still being perfected. AMD told me this week that the driver would have to be tuned "for each game". This means that AMD needs to dedicate itself to this cause if it wants Fury X and the Fury family to have a nice, long, successful lifespan.

[collar-pull gif]


Not a good sign when considering the current mindshare situation and AMD's stigma of poor and/or belated driver support. Even if that stigma is unfounded, AMD's marketing team is facing a hell of an uphill battle.
 

mrklaw

MrArseFace
Price:performance is roughly in line with the 980 Ti, though. So if the reference 980 Ti is out of your budget, the Fury X is a nice option. £509 in the UK will mean real prices hitting £499 soon enough, which is a reasonable option compared to £550 for a vanilla 980 Ti, or £600 for the non-reference coolers.
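That "roughly in line" claim is easy to sanity-check with back-of-the-envelope pounds-per-FPS arithmetic. A quick sketch using the UK prices from the post above; the FPS figures are hypothetical placeholders for illustration, not benchmark results:

```python
# £-per-FPS comparison. Prices are the UK figures quoted in the post;
# the FPS values are made-up placeholders, not real benchmark numbers.
cards = {
    "Fury X":           (509, 60.0),
    "980 Ti (vanilla)": (550, 63.0),
    "980 Ti (non-ref)": (600, 66.0),
}

for name, (price_gbp, fps) in cards.items():
    ratio = price_gbp / fps  # lower = better value
    print(f"{name:18s} £{price_gbp} / {fps:.0f} fps = £{ratio:.2f} per fps")
```

With numbers in that ballpark the ratios land within a few percent of each other, which is the sense in which the two cards are "roughly in line" on value even when one is outright faster.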
 
Yeah, I feared as much about drivers, given we already have games exceeding 4GB of VRAM, and 4K just makes it worse.

That's a disaster in the making, considering how hard Nvidia is gonna push for more memory usage, since that's their trump card on both the Ti and the Titan X.
 

AJLma

Member
Yikes.

I had a feeling the day I read about the Titan X's specs that AMD was going to have a hell of a time trying to keep up. The 900 series is too efficient per core by comparison to AMD's latest... my only concern at this point is whether Maxwell will age as poorly as Kepler.

4000 GCN cores and barely faster than a 980 in practice. AMD is going to be in a lot of trouble once 14nm rolls around and core counts get higher.
 
Price:performance is roughly in line with the 980 Ti, though. So if the reference 980 Ti is out of your budget, the Fury X is a nice option. £509 in the UK will mean real prices hitting £499 soon enough, which is a reasonable option compared to £550 for a vanilla 980 Ti, or £600 for the non-reference coolers.

The problem, though, is that they have already publicly mentioned they need to optimise the drivers per game, since the 4GB is a limitation.

I'd be way more confident that games work, and work at release, on Nvidia. As someone who only upgrades my PC after several years, I'd be too concerned about games becoming even more demanding of VRAM over time, and would choose the 980 Ti (I think the 970 is best price/performance-wise).
 

Randam

Member
Damn tough crowd in here. I'm gonna wait for voltage unlocked overclocking results before I decide if it's worth getting rid of my 295X2. Something doesn't add up with the way AMD has been bragging about overclocking given the results most reviewers have seen on stock voltage. If they could get up to 1250-1300MHz that'd be a solid bump.

Is this card not yet compatible with Mantle? I've noticed all tests used DX11 for BF4, and that seems silly given that most of the driver development was on the Mantle side.

why would you want to downgrade?

From the PC Perspective review:




Not a good sign when considering the current mindshare situation and AMD's stigma of poor and/or belated driver support. Even if that stigma is unfounded, AMD's marketing team is facing a hell of an uphill battle.

Oh boy, that will be interesting. Did they expand their driver division?
 
Not a good sign when considering the current mindshare situation and AMD's stigma of poor and/or belated driver support. Even if that stigma is unfounded, AMD's marketing team is facing a hell of an uphill battle.

Not a good sign considering AMD's current "whenever we get around to it" Crossfire profile support. Compared to Nvidia, who drop updated profiles through GeForce Experience trivially.
 

tuxfool

Banned
Yikes.

I had a feeling the day I read about the Titan X's specs that AMD was going to have a hell of a time trying to keep up. The 900 series is too efficient per core by comparison to AMD's latest... my only concern at this point is whether Maxwell will age as poorly as Kepler.

4000 GCN cores and barely faster than a 980 in practice. AMD is going to be in a lot of trouble once 14nm rolls around and core counts get higher.

You cannot simply compare core counts across architectures. Also, I should note that 14nm probably won't increase core counts initially; they will only get more efficient, with die sizes decreasing once again. Another thing is that these cards will be the last GCN 1.x cards; next year their GPUs will be based upon GCN 2.0 (meaning also that this generation will be Kepler'ed by AMD).
 

Irobot82

Member
If AMD indeed needs to dedicate time to optimizing the drivers for every game, it's doomed (with the current level of support that AMD provides).

I find this suspect; AMD did a great job fixing their drivers back in the first GCN days. They're still gaining performance improvements.
 
That £509 price is why Nvidia sacrificed their £800 flagship to panic-launch the 980 Ti.

It'll be interesting to see what the AIBs can do with the non-X (air-cooled) variant.

Missed that. Why such a big difference in price between that and the AMD-branded one?

Cunt retailers are fucking up the launch.
 

tarheel91

Member
why would you want to downgrade?



Oh boy, that will be interesting. Did they expand their driver division?

Multi-GPU brings its own drawbacks: more input lag, inconsistent support, Mantle is rarely supported, etc. If the Fury X is 85% of a 295X2, I'd consider swapping. I play at 3440x1440, so I don't need as much raw power as someone at, say, 4K would.
 

wachie

Member
Yikes.

I had a feeling the day I read about the Titan X's specs that AMD was going to have a hell of a time trying to keep up. The 900 series is too efficient per core by comparison to AMD's latest... my only concern at this point is whether Maxwell will age as poorly as Kepler.

4000 GCN cores and barely faster than a 980 in practice. AMD is going to be in a lot of trouble once 14nm rolls around and core counts get higher.
Holy hyperbole Arkham Knight.

To top this all off, as of writing, Canadian consumers have nowhere to even buy this card right now.
NCIX had it in stock this morning.
 

oxidax

Member
Everybody has a different benchmark. I'm going crazy here. TechPowerUp, Guru3D and Tom's lead me to the Fury X, while everybody else leads me to the 980 Ti.

fuck!
 

Sanjay

Member
Price:performance is roughly in line with the 980 Ti, though. So if the reference 980 Ti is out of your budget, the Fury X is a nice option. £509 in the UK will mean real prices hitting £499 soon enough, which is a reasonable option compared to £550 for a vanilla 980 Ti, or £600 for the non-reference coolers.

You can get a vanilla 980 Ti for £509.99.
You can get a non-reference 980 Ti for £529.99, and it comes with a free Batman game; it was priced at £499.99 two days ago.

Price Wars!

Fury X to be £449.99 soon.
 

Seanspeed

Banned
That £509 price is why Nvidia sacrificed their £800 flagship to panic-launch the 980 Ti.
Hardly a panic launch. It was 99% likely a deliberate strategy, and a smart one: get the Titan X 'best of the best' buyers in early, wait until a few weeks before AMD launch their cards, anticipate pricing, and then launch your more gaming-oriented flagship at an appropriate price. Stay confident that AMD isn't gonna outperform what you brought out, then revel in a job well done.

From the PC Perspective review:





Not a good sign when considering the current mindshare situation and AMD's stigma of poor and/or belated driver support. Even if that stigma is unfounded, AMD's marketing team is facing a hell of an uphill battle.
On the plus side, this means that games that they do specifically support might well get a nice performance increase over what a 'standard' result would be, giving it a strong lead in these games.

Potentially...
 

jfoul

Member
The Radeon R9 Fury X is very competitive with the GTX 980 Ti at 4K and 1440p, but the 1080p numbers are disappointing. I think the Fury X will put up better frame numbers and OC potential over time, but they should have been ready to go at release. Maybe AMD should have just released the Fury X, Fury and Nano at the same time in mid-July with more mature drivers. Right now, it's hard to recommend a competitive card with potential over a card that already has great drivers and huge OC headroom.
 

Smokey

Member
By all accounts they should know how the game is panning out. They made a big deal of the fact that Nvidia Engineers were working alongside implementing gameworks features in the game.

Maybe fewer GameWorks features and a better-performing engine instead?

http://arstechnica.com/gaming/2015/...is-seriously-broken-say-amd-and-nvidia-users/

NVIDIA did not develop the game/engine or port it to PC, dude... they are a partner, but they are not responsible for the quality of the product. Use some common sense.

With GameWorks features on or off, it doesn't matter; the PC port is garbage.
 

pulsemyne

Member
The hell are you talking about? Most of us are champing at the bit for 4K couch gaming; that's what a lot of these cards are being made for. There's a reason Steam made the couch mode.

It is a big deal to a lot of us, and you are way overstating input lag. We already have some decent 4K TVs that handle it well, and for many games, like The Witcher, it's nowhere near a big deal.
Meh, I suppose it's more a personal preference thing. Input lag annoys me, especially in FPS games.
I own a 34-inch 21:9 monitor, so it's worthless for me to buy a 4K TV. It's still a daft move on AMD's part not to make the card HDMI 2.0 compliant; the 30Hz issue over HDMI affects these monitors as well. Thankfully it also has DisplayPort.
 

Ty4on

Member
Sure, I agree, it's not a flop, but it is still quite disappointing. And AMD sort of needs an "Nvidia-killer" card, or they're going to keep losing market share and mindshare.
Yeah. While the 290X didn't blow away everything else like the Titan, it did match it, and the R9 290 was insane for $399 back then. This isn't really a much better deal or a better-performing card.
 
Interesting. I'm reading the Guru3D review first, as I tend to like their write-ups.

In GTA V, the Fury X loses to the 980 Ti at 2560x1440.


But it comes VERY close at 4K.


Is that the case in most reviews?

All in all, it's not the beast I expected it to be... and especially not at that price.
 

x3sphere

Member
The HardOCP review is very scathing, and I tend to trust their results more as they don't use canned benchmarks.

At 1440p, the 980 Ti is 15-20% faster from what I'm seeing in that review, and in BF4 it was a whopping 33% faster, though that seems to be an anomaly considering the rest of the results.

The lead diminishes a lot when you move to 4K, but the Ti still has a 4-8% lead there, and some games that use a lot of VRAM like Dying Light seem to favor the Ti by quite a bit.

This card should be priced closer to the 980; it doesn't compete with the Ti from what I'm seeing. Seriously disappointing, as the 290X gave the Titan a run for its money.
 

thelastword

Banned
Nobody is pushing an angle. Frametime tests show less consistent performance than the 980 Ti at 4K.

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/11/

It isn't an angle, it is a way to... you know... look at the data presented by one of the best review sites out there.
[frametime graphs from ComputerBase]
That may be so, but these tests and comparisons are a bit suspicious to me. Also, why quote a German site? I have no problem with their frametime graphs, but I see some inconsistencies: I noticed that the 8GB 390 wins quite a few tests at 4K, but they use the 4GB version of the 380 against the 980 Ti and the Fury X in their frametime tests.

If this is truly pointing out that the 4GB of HBM may not be enough at 4K, then I think AMD squandered an opportunity here, but it's still too early to make that call, given AMD's less-than-stellar launch drivers and the whole overclocking fiasco. Why aren't these watercooled GPUs overclocking better? That remains a mystery.

Also, why are some sites comparing SC 980 Tis to vanilla Furies? Don't they know the Fury has issues with overclocking at the moment...
 
That maybe so but these tests and comparisons are a bit suspicious to me, also why quote a german site?

There is no reason to doubt their methodologies or reviews; they are very good at what they do.

Why a German site? I live in Germany, speak German, and they happen to be one of the best.

Other than that, I usually read Guru3D or PCPer.
 

spicy cho

Member
Well, HardOCP is my go-to for reviews, and theirs was pretty damning. No regrets about my 980 Ti day-one purchase. Not that I ever had a choice, since the Fury X has no DVI port.
 
Man, they really did goof. It would've been fine if it had been just the 4GB limit, or just the delay, or just the lack of HDMI 2.0, or just the poor launch drivers/voltage limit, but all together? Damn, it's like they're rolling over for Nvidia at this point.

Really hope we can see some proper competition come 2016, but it seems like I finally have my answer as to whether or not I'm jumping ship to Team Green this fall.
 

AJLma

Member
"Barely faster than a 980" is as much hyperbole as saying the Titan X is barely faster than the Fury X at 1080p.

It's really not far off. It mostly hits the halfway point between a 980 and 980 Ti.

I'm also going by HardOCP's "highest playable settings" benchmarks, since in the past they've most closely matched real-world performance on my own machine, and they're actually a sensible measurement of performance instead of just maxing everything out and running numbers.
 