
AMD Radeon Fury X review thread

So is the high-speed memory they're using a bust or what?

This is my 980 Ti:

oV4Pu4K.jpg

It's playing GTA V at 1440p maxed out. Its memory controller load is at 50%.

I'm effectively pushing the card to the limit and the memory controller is still cruising. HBM is really a solution looking for a problem at this point.
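If anyone wants to sanity-check this on their own card, here's a rough Python sketch of how I'd log the same counters. It assumes an Nvidia GPU and the nvidia-ml-py (pynvml) bindings, which report the same memory-controller utilization that Afterburner graphs:

# Rough sketch: poll GPU core and memory-controller utilization once a second
# while the game runs. Assumes an Nvidia card and the nvidia-ml-py (pynvml)
# bindings installed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"core: {util.gpu:3d}%  memory controller: {util.memory:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()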
 
Yeah, as noted by Sir Abacus, this is due to lack of voltage. From what I understand, they have undervolted the card to the bleeding limit to keep power consumption down. With unlocked voltage, we should see some very different numbers.

The problem is the 980 Ti overclocks phenomenally at stock voltages and like crazy with voltage and BIOS modification, for the exact same price.

If AMD really undervolted this card, that was incredibly stupid on their part. The vast majority of people buying enthusiast GPUs are going to look at the raw performance and not care about the power draw.
 

Durante

Member
I just read the computerbase review (was hiking all day and really curious!).

Looks like a pretty good card, but not quite good enough (to e.g. seriously challenge NV or force them to reduce any prices).

FPS results at 4K are nice, and even at 2560x1440 it at least challenges the 980ti in FPS:
capturefuryv6szv.png


But the frametime result I was most curious about (since it's currently one of the most memory-hungry games) shows that 4 GB are still 4 GB:
capturefury251syi.png


And most crucially, it's the first time I can remember that a newly revealed AMD card does worse in price/performance than the NV alternative -- I think that's the killer:
capturefury3svstr.png


On the plus side, temperatures and noise levels are amazing, and power consumption isn't too bad. If they can fix frametime spikes in the driver and adjust the price down a bit it could be a solid product.

So is the high-speed memory they're using a bust or what?
No. The card certainly has enough memory bandwidth.
 

Irobot82

Member
Doesn't afterburner usually allow for voltage unlocking? Or do they have to get permission from the vendors first?
 
Doesn't that depend on whether the GPU is RAM-starved or CU/ROP-starved?

Yes, and all indications so far have been that current GPUs are not memory bandwidth starved. Like someone else said in here, HBM is a solution in search of a problem. It's great that AMD wanted to be innovative, but by the time HBM matters, Nvidia will also be on it.
 
It's a bit disappointing. I figured the rumors of it being a Titan X beater were way too optimistic, but this changes so little in the current GPU landscape.

Unlocked voltage might help, but I still have my doubts it would match up against an overclocked 980 Ti. Guess it will be a matter of waiting for drivers, but by then the damage will probably already have been done, even if they manage to improve it.
 

pj

Banned
I don't understand the compute benchmarks in that giant image.

Shouldn't the fury x destroy 980ti in compute? 8.6 tflops vs. 7
 
I don't understand the compute benchmarks in that giant image.

Shouldn't the fury x destroy 980ti in compute? 8.6 tflops vs. 7

Peak performance is not the be-all and end-all. At full occupancy, GCN is under far greater register pressure than Maxwell. Doing productive work vs. blindly issuing FMAs involves far more than peak performance.
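For anyone wondering where those paper numbers come from, it's just shader count x clock x 2 FLOPs per FMA per cycle, which only gives you the ceiling; whether the architecture can keep the ALUs fed is the separate question. A quick sketch (the 980 Ti clock here assumes a ~1.2 GHz real-world boost, which is an approximation, not the official spec):

# Back-of-the-envelope peak FP32 throughput: shaders x clock x 2 (an FMA counts
# as two FLOPs per cycle). Fury X uses its official 1050 MHz clock; the 980 Ti
# figure assumes a ~1.2 GHz real-world boost, which is only an approximation.
def peak_tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000.0

print(f"Fury X: {peak_tflops(4096, 1.05):.1f} TFLOPS")  # ~8.6
print(f"980 Ti: {peak_tflops(2816, 1.20):.1f} TFLOPS")  # ~6.8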
 

FLAguy954

Junior Member
I don't understand the compute benchmarks in that giant image.

Shouldn't the fury x destroy 980ti in compute? 8.6 tflops vs. 7

AMD flops are different from Nvidia flops. It's best to look at how cards perform in practice rather than simply looking at their on-paper flop counts.
 
AMD flops are different from Nvidia flops. It's best to look at how cards perform in practice rather than simply looking at their on-paper flop counts.

No, the FLOPS are the same. The differences are in scheduling, register pressure, occupancy, cache sizes, decoding, latency, and probably more factors I can't think of off the top of my head.
 

arevin01

Member
ATM there's really no reason to get the Fury X over the 980ti unless you prefer cooler temps. AMD dropped the ball IMO.
 

ApharmdX

Banned
Now I don't regret buying a reference 980 Ti at launch. These benchmarks are close, especially at 4K, but the Fury X is awkwardly positioned overall. It delivers most of the performance that the 980 Ti does, but not all of the performance, and the Fury X has only 4 GB of VRAM, so how future-proof is it? The Fury X needs to be $599. At that price, it makes sense. Custom 980 Tis are priced into the stratosphere, so at six hundred bucks the Fury X will sell for sure.

To the people talking about bad drivers on the AMD side: there's some truth to that, but as a 980 Ti owner, I get several TDRs daily with Chrome, and phantom Witcher 3 crashes. NVIDIA is a lot better about supporting new games on release day, but let's not pretend that either green or red delivers great drivers.
 

endtropy

Neo Member
I suspect Nvidia is going to see an uptick in 980ti sales. How many folks were sitting on the sidelines waiting for ATI to herald a new era of memory bandwidth and performance, only to be disappointed?

Glad to see them deliver a product, but it feels like so much engineering and effort went into HBM that the underlying GCN was left a little underdeveloped (compared to big Maxwell).

The interesting match-up will be Pascal vs. the next iteration of Fury. Nvidia waited to deploy a more mature version of HBM while ATI chose to push HBM to market first - perhaps too early?

Either way, as it stands today, I'm not sure why anyone would choose the ATI solution considering the cost is the same, the performance (on average) is lower, and you have to live with the ATI driver team (not looking to argue this, but plenty of triple-A releases have been problematic for team red). Maybe Nvidia has undue influence on developers, maybe there is some ethical issue there; all I know is I would not choose to be on the wrong side of it as the end consumer.
 

Tovarisc

Member
So is the high-speed memory they're using a bust or what?

As a tech, HBM is far from a bust. Very high bandwidth, it can be stacked easily for large amounts of VRAM, takes up a lot less space than GDDR5, and requires less juice to work. If HBM were a bust, Nvidia wouldn't be all over it for Pascal.
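For what it's worth, the raw bandwidth math (using the commonly quoted interface widths and data rates for each card) looks roughly like this:

# Rough memory bandwidth arithmetic: bus width (bits) x effective data rate per
# pin, divided by 8 for bytes. Figures are the commonly quoted specs.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(f"Fury X (HBM1, 4096-bit @ 1 Gbps/pin): {bandwidth_gbs(4096, 1.0):.0f} GB/s")  # 512
print(f"980 Ti (GDDR5, 384-bit @ 7 Gbps/pin): {bandwidth_gbs(384, 7.0):.0f} GB/s")   # ~336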

Edit:
There's barely any reason to get the Fury X over the 980.

I think you forgot to add /s there. Comparing the FuryX to rather irrelevant cards like the 980 and Titan X and then saying the FuryX isn't worth it is just very hyperbolic and stupid, imo. The 970 and 980Ti are where it's at in Nvidia's product line atm for anyone looking for high to very high performance in those price ranges. The FuryX is doing rather well against the 980Ti; not sure why people expected 20+% more performance from it.
 

mkenyon

Banned
As a tech, HBM is far from a bust. Very high bandwidth, it can be stacked easily for large amounts of VRAM, takes up a lot less space than GDDR5, and requires less juice to work. If HBM were a bust, Nvidia wouldn't be all over it for Pascal.

Edit:

I think you forgot to add /s there. Comparing the FuryX to rather irrelevant cards like the 980 and Titan X and then saying the FuryX isn't worth it is just very hyperbolic and stupid, imo. The 970 and 980Ti are where it's at in Nvidia's product line atm for anyone looking for high to very high performance in those price ranges. The FuryX is doing rather well against the 980Ti; not sure why people expected 20+% more performance from it.
No, I most certainly did not. From last page:

Important benches:

pcars-r9.gif


pcars-titan.gif


pcars-99th.gif


w3-r9.gif


w3-titan.gif


w3-99th.gif


bf4-r9.gif


bf4-titan.gif


bf4-99th.gif


More from TechReport here.

Speaking of which, if you dig deeper using our frame-time-focused performance metrics—or just flip over to the 99th-percentile scatter plot above—you'll find that the Fury X struggles to live up to its considerable potential. Unfortunate slowdowns in games like The Witcher 3 and Far Cry 4 drag the Fury X's overall score below that of the less expensive GeForce GTX 980. What's important to note in this context is that these scores aren't just numbers. They mean that you'll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X. Our seat-of-the-pants impressions while play-testing confirm it. The good news is that we've seen AMD fix problems like these in the past with driver updates, and I don't doubt that's a possibility in this case. There's much work to be done, though.
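For anyone not familiar with the metric: TechReport records every frame's render time and reports the value that 99% of frames come in under, which is exactly what averages hide. A rough sketch with made-up numbers, purely for illustration:

# Sketch of the 99th-percentile frame time metric: the time that 99% of frames
# come in under. The frame times below are made up purely for illustration.
def percentile_99(frame_times_ms):
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

frames = [16.7] * 95 + [22.0, 30.0, 45.0, 50.0, 60.0]  # mostly smooth, a few spikes
avg = sum(frames) / len(frames)
print(f"average: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")      # looks fine on paper
print(f"99th percentile: {percentile_99(frames):.1f} ms")    # exposes the spikes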
 
I'm curious about a couple things.

1. AMD saying this is an "overclockers dream" and that you can "overclock this like there's no tomorrow" must mean that when voltage to the GPU is unlocked, you can push the core clock pretty far, though I guess this will bring the power draw right up. It's baffling, then, that they gave the cards to reviewers with locked voltage. Surely no one expects that 5-10% is the maximum overclock for these cards given what AMD have said? Bit of a tits-up on their part though, as now the word is out that the Fury X doesn't overclock well at all.

2. I read somewhere that the benches and scores put the Fury X at only 28-29% faster than a 290X, but the shader and FLOPS numbers should put it at around 35-40% faster than their old card. Does this point to shit drivers at launch?

It sounds like clutching at straws (I am), but I'll need to find out more about the above two points to really judge the Fiji GPU. But at this point, non-reference 980 Ti's (especially that G1) are top of the pile. Why should anyone wait even longer for AMD to improve their drivers to bring this card up to speed?
 
I don't understand the compute benchmarks in that giant image.

Shouldn't the fury x destroy 980ti in compute? 8.6 tflops vs. 7

Which one are you talking about? One of those German ones?

Yeah, I had the exact same expectation (though I knew that Nvidia cores are not AMD cores and it's likely just a marketing term).

I thought the Fury X would win compute at least, even if that didn't translate to actual game performance.
 
Wow, those frame time benchmarks are pretty striking. Sure, they might mitigate it with driver updates, but even the 290x is all over the place in Witcher 3, and they don't have the driver excuse to fall back on there.
 

mephixto

Banned
I think you forgot to add /s there. Comparing the FuryX to rather irrelevant cards like the 980 and Titan X and then saying the FuryX isn't worth it is just very hyperbolic and stupid, imo. The 970 and 980Ti are where it's at in Nvidia's product line atm for anyone looking for high to very high performance in those price ranges. The FuryX is doing rather well against the 980Ti; not sure why people expected 20+% more performance from it.

Disagree. The Fury X benches put it in between the 980 and the 980ti, but the 980 is priced at $499 ($150 lower than the Fury X).
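Putting rough numbers on that argument (prices are the ones quoted in this thread; the performance index is made up purely to illustrate the math, so plug in figures from whichever review you trust):

# Rough price/performance comparison. Prices are the ones quoted in this thread;
# the relative performance index is illustrative only, not review data.
cards = {
    "GTX 980":    {"price": 499, "perf_index": 100},  # baseline (illustrative)
    "Fury X":     {"price": 649, "perf_index": 115},  # assumed ~15% over the 980
    "GTX 980 Ti": {"price": 649, "perf_index": 125},  # assumed ~25% over the 980
}

for name, card in cards.items():
    print(f"{name:10s}: {card['perf_index'] / card['price'] * 100:.1f} perf per $100")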
 

Crisium

Member
The margins on this thing must be horrible. It's just not a $650 card. It could have been a hit priced right, but would AMD be making profit if it were much lower?

At $550 it's a very fair deal I think. Better than the 980 overall, doesn't cost much more, and undercuts the 980Ti appropriately as it is indeed slower.

That is fair, and if it's $550 in a month I'll get one; otherwise it's a 980 Ti for me, as much as it pains me - I'm not paying equal money for worse performance. That's the whole reason I usually prefer AMD in the first place. If I'm dropping $650, I'm getting my money's worth, sorry AMD.
 
I will admit I was wrong. I was hyping up this card from AMD and it certainly is a disappointment. Clean win for NV, maybe next year.
 
Yeah, the FCAT frametime results from Tech Report and PCPer are pretty bad. Even in games where the Fury X is close to or slightly ahead of the 980 Ti, it will offer a subjectively worse experience due to worse frametimes.

AMD should get an ass kicking for the way they hyped this thing, delayed it so long, and then it just flopped. At $650 the 980 Ti is objectively the better card, hands down.
 

Tovarisc

Member
No, I most certainly did not. From last page:

Important benches:

pcars-r9.gif


pcars-titan.gif


pcars-99th.gif


w3-r9.gif


w3-titan.gif


w3-99th.gif


bf4-r9.gif


bf4-titan.gif


bf4-99th.gif


More from TechReport here.

Not sure which is more surprising: AMD's frame timings, or that some site is still actually doing frame timing tests. I thought frame time testing died a silent death back when it became apparent that SLI had quite bad frame timing issues and made CF look good :b

The GTX980 gives better frame timings than the FuryX at 4K, but what kind of performance hit do you take for that smoother frame timing? Even then I would compare to the 980Ti, but that is just me, because in my eyes the 980 is an irrelevant card now.
 

Van Owen

Banned
Man, those frame-time spikes. I'm just not sure I can continue to put up with those and keep buying AMD, even though I hate what nvidia is doing with the exclusive gameworks crap.
 
The FuryX is doing rather well against the 980Ti.

Are you high? Releasing a hyped up product that loses to its primary competitor at the exact same price months after the other one came out is "doing rather well?"

Look, I grew up with ATI and AMD. I was buying Rage cards and K6 processors and building machines. I defended them for a long time, but it's over. Let's just hope Intel and Nvidia don't price gouge the shit out of us when the inevitable AMD/ATI sell off or bankruptcy happens.
 

dr_rus

Member
So there it goes. Less performance and fewer features than the 980Ti for the same price. NV has really pushed them lower than they planned to go initially - I think the plan was to release at $700 and use the $300 gap to position it as a good alternative to the Titan X. All of that went down the drain the moment the 980Ti was launched.

HBM shows its worth at 4K, but 4K is exactly where Fiji's 4GB of RAM is the real problem. At lower resolutions we're mostly shading limited, and GCN, even 1.2, doesn't work well there against Maxwell 2. I believe AMD can do a lot with drivers to improve the overall picture, and the card is likely to end up on the same level as a stock 980Ti eventually, but then there are the OC issues, which basically put the 980Ti ahead again.

I'd say AMD made a mistake going with HBM, but then again I don't think pairing Fiji with GDDR5 via Hawaii's bus would help much. They need to seriously overhaul the GCN architecture - it's time for 2.0, not 1.something. I hope it comes with 16 or 14nm.
 

Durante

Member
Not sure which is more surprising: AMD's frame timings, or that some site is still actually doing frame timing tests. I thought frame time testing died a silent death back when it became apparent that SLI had quite bad frame timing issues and made CF look good :b
Sorry, what? All multi-card solutions always had worse frame pacing than single-card ones, but I don't remember a time when AMD was ahead in multi-card framepacing across the board like you seem to imply.

Anyway, Techreport had a major role in coming up with the frametime metric, and they have been using it ever since, in every single GPU review.
 

Seanspeed

Banned
No, I most certainly did not. From last page:

Important benches:
To be fair, we know that AMD cards have performance issues with Project Cars. It being weaker there is almost to be expected. It being weaker in BF4 probably isn't to be expected, but it's also almost as consistent, just a bit slower.

I've seen a few other frametime graphs that show it's not as good as 980Ti, but I'd still like to see more. I don't expect the result to change much, but I wouldn't be surprised to see that over a larger range of games, it's ultimately a bit closer than these few games would suggest.

Or maybe not, I dunno. I'd like to give it a chance, though.
 

joesiv

Member
I'm curious about a couple things.

1. AMD saying this is an "overclockers dream" and that you can "overclock this like there's no tomorrow" must mean that when voltage to the GPU is unlocked, you can push the core clock pretty far, though I guess this will bring the power draw right up. It's baffling, then, that they gave the cards to reviewers with locked voltage. Surely no one expects that 5-10% is the maximum overclock for these cards given what AMD have said? Bit of a tits-up on their part though, as now the word is out that the Fury X doesn't overclock well at all.

Indeed... I kind of expect a driver update to unlock the voltage and allow for full fan control too. Perhaps it was AMD's effort to overcompensate for the PR nightmare that was the 290x's launch (hot and loud), so they undervolted and limited the fan speed aggressively. The reviews will praise the power consumption and lack of noise...

Unfortunately, the resulting performance is lacklustre compared to the green team, so I think it backfired big time.

I recall AMD released an update pretty quickly after the 290x that gave new fan profiles to curb the reviewer/customer backlash over noise.

I guess they'll have to do the opposite this time, and allow for more control over clocks and fan control, just to make it perform as it should.

Something to note, however, is that water cooling cannot get as hot as air cooling (water has a hard limit to where it's effective, whereas air does not). Also, Tom's Hardware saw some interesting throttling when doing the shader stress test... hopefully that doesn't indicate a limit to its capabilities based on load or heat, even at these clocks/voltages.
 
Not sure which is more surprising: AMD's frame timings, or that some site is still actually doing frame timing tests. I thought frame time testing died a silent death back when it became apparent that SLI had quite bad frame timing issues and made CF look good :b

Are you remembering time backwards? I thought that was the opposite of how it was.
 

mkenyon

Banned
Yeah, it's most definitely backward. Some more data for folks:

fc4-99th.gif


ai-99th.gif


civbe-99th.gif


gtav-99th.gif


c3-99th.gif


I mean, read through PCPer's and TechReport's reviews. This card is having major issues right now. I'm typically known as an AMD defender too; I'm not trying to inject some form of bias here.
 

IMACOMPUTA

Member
As a tech, HBM is far from a bust. Very high bandwidth, it can be stacked easily for large amounts of VRAM, takes up a lot less space than GDDR5, and requires less juice to work. If HBM were a bust, Nvidia wouldn't be all over it for Pascal.

Edit:

I think you forgot to add /s there. Comparing the FuryX to rather irrelevant cards like the 980 and Titan X and then saying the FuryX isn't worth it is just very hyperbolic and stupid, imo. The 970 and 980Ti are where it's at in Nvidia's product line atm for anyone looking for high to very high performance in those price ranges. The FuryX is doing rather well against the 980Ti; not sure why people expected 20+% more performance from it.
You can blame AMD for making the 980 relevant again.
 

HeWhoWalks

Gold Member
Ultimately, as I suspected, the Titan X remains on the throne. That said, it looks like between the Ti and Fury X, the former is the way to go in just about everything.
 

FLAguy954

Junior Member
So there it goes. Less performance and fewer features than the 980Ti for the same price. NV has really pushed them lower than they planned to go initially - I think the plan was to release at $700 and use the $300 gap to position it as a good alternative to the Titan X. All of that went down the drain the moment the 980Ti was launched.

HBM shows its worth at 4K, but 4K is exactly where Fiji's 4GB of RAM is the real problem. At lower resolutions we're mostly shading limited, and GCN, even 1.2, doesn't work well there against Maxwell 2. I believe AMD can do a lot with drivers to improve the overall picture, and the card is likely to end up on the same level as a stock 980Ti eventually, but then there are the OC issues, which basically put the 980Ti ahead again.

I'd say AMD made a mistake going with HBM, but then again I don't think pairing Fiji with GDDR5 via Hawaii's bus would help much. They need to seriously overhaul the GCN architecture - it's time for 2.0, not 1.something. I hope it comes with 16 or 14nm.

I agree. They would have been much better off launching this against the 980, tbh. Also, all of these reviews tell me that GCN 1.x has hit a wall and the architecture needs a Maxwell-level kick in the ass. In regards to HBM, AMD needed it in the Fury cards or we would be seeing ridiculous power draw like on the 390X.
 
TechReport is saying it's absolutely due to crap drivers, which in a way is more damning than hardware that isn't up to snuff.

That's pretty typical for day 0 cards. Reviewers are dealing with a host of beta drivers that don't have all the features enabled.

So if we take into account that:

- The card is sitting at its absolute minimum voltage right now to reduce its power draw.

- The card is getting errors when trying to overclock at stock voltage.

- The temps at stock are incredibly low.

This card could unleash all of its Fury when the voltage gets unlocked?

abe-simpson.gif
 
Ultimately, as I suspected, the Titan X remains on the throne. That said, it looks like between the Ti and Fury X, the former is the way to go in just about everything.

No, the Gigabyte G1 Gaming 980 Ti and most of the other non-reference 980 Ti's have already rendered the Titan X obsolete for most gamers. The G1 sits at the top of the pile, surely, as it handily outperforms the Titan X and is significantly cheaper.
 

HeWhoWalks

Gold Member
Wow, HardOCP just savaged the Fury X. There was no holding back. Also, 4 GB of HBM is already limiting in current games; look at how badly VRAM-intensive games like GTA V and Dying Light did. Hold that L, AMD. You earned it after making people wait this long.

This part, in particular, was pretty brutal:

"Usually trying to decide between two video cards at the same price point is a wash, with very even and split performance. However, this is not the case this time with the AMD Radeon R9 Fury X and GeForce GTX 980 Ti. There is a definite pattern that leads to one video card being the best value for the money, and it is GeForce GTX 980 Ti, not the AMD Radeon R9 Fury X."

I was honestly expecting the Fury X to at least beat the Ti, while the Titan X would reign supreme at 4K.
 

HeWhoWalks

Gold Member
No, the Gigabyte G1 Gaming 980 Ti and most of the other non-reference 980 Ti's have already rendered the Titan X obsolete for most gamers. The G1 sits at the top of the pile, surely, as it handily outperforms the Titan X and is significantly cheaper.

That 12GB of VRAM will always make it a viable option when gaming at higher resolutions (even some of today's reviews point this out). By no means is the Titan X an obsolete card.
 
So if we take into account that:

- The card is sitting at its absolute minimum voltage right now to reduce its power draw.

- The card is getting errors when trying to overclock at stock voltage.

- The temps at stock are incredibly low.

This card could unleash all of its Fury when the voltage gets unlocked?

abe-simpson.gif

Let's say that's the case, you work at AMD, and the NDA on Fury X is going to be lifted in a month.

How in the holy fuck do you not unlock the voltage before the NDA lifts? They have to know the 980 Ti's performance, and they have to know they can't beat it. Their only possible saving grace is to out-overclock it, but they won't let reviewers do that at review time?

I will be shocked if this thing overclocks beyond a 980 Ti when the voltage is unlocked.
 