
AMD Polaris architecture to succeed Graphics Core Next

wachie

Member
A 4GB 480 at $199 and a 4GB 470 at $149 would do wonders for a significant portion (80%) of the market.

 
As someone who ran SLI 770s for two years, I can tell you synthetics don't matter one fucking iota when it comes to SLI/CF.

General statement met with a general statement as a counterargument. No one wins.

I guess what you're trying to say is that there aren't any games that take advantage of crossfire/sli, and therefore there would never be a real-world advantage to crossfire or sli? If that's the case then I agree, but I think that trend may change.
 
General statement met with a general statement as a counterargument. No one wins.

I guess what you're trying to say is that there aren't any games that take advantage of crossfire/sli, and therefore there would never be a real-world advantage to crossfire or sli? If that's the case then I agree, but I think that trend may change.

There are definitely some games that show huge gains in SLI or crossfire. However, they seem to be the exception.
 
General statement met with a general statement as a counterargument. No one wins.

If someone doesn't buy into the "look how well two cheap cards work!" bullshit, I'd say they do in fact win. But I guess if all someone plays is Ashes of the Singularity, they'll be perfectly fine with twin 480s.
 
If someone doesn't buy into the "look how well two cheap cards work!" bullshit, I'd say they do in fact win. But I guess if all someone plays is Ashes of the Singularity, they'll be perfectly fine with twin 480s.

So I'm not sure what argument you're trying to make here.

Is it that synthetic benchmarks aren't a good way for consumers to gauge GPU performance, or that the crossfire/sli benchmarks are bunk, or both? What if the benchmarks showed similar gains for a more expensive GPU? Would it still be bullshit?

Are you also implying that Ashes of the Singularity is the only game where crossfire cards will see any benefit?
 
I'm being facetious and you're being obtuse. The point is that SLI/CF sucks, and I say that as someone who was suckered into that hype. The side swipe is that everything looks good when AMD only uses Ashes of the Singularity to gauge some semblance of real-world performance.
 

kami_sama

Member
I'm being facetious and you're being obtuse. The point is that SLI/CF sucks, and I say that as someone who was suckered into that hype. The side swipe is that everything looks good when AMD only uses Ashes of the Singularity to gauge some semblance of real-world performance.

Yeah, micro-stuttering completely kills any reason I had to get CF.
 
I do think the 480 is a fantastic card, especially at $199. But everyone who thinks that it's going to give them a VR reacharound, and that two of them duct-taped together will be a 1080, is going to be sorely disappointed.
 
I do think the 480 is a fantastic card, especially at $199. But everyone who thinks that it's going to give them a VR reacharound, and that two of them duct-taped together will be a 1080, is going to be sorely disappointed.

Yeah, it seems reasonable to believe at this point that the RX 480 will perform between a GTX 970 and 980 at stock. With a good OC, you can probably push towards a 980. It'll be the solid entry-level VR card that AMD is promoting it to be. What it won't be is some overclocking wunderkind that chases a 50% OC and catches up to a stock GTX 1070, as some people who believed garbage WCCFTech rumors expected.

And please, if you got the idea of CF'ing two mid-range GPUs like these, stop. There are more games with either zero or mediocre CF scaling, where this setup will be inferior to buying a GTX 1070 for the same price, than games where CF scaling is so good that you catch up to a 1080. The highs don't make up for the lows, which include having a 200+ dollar brick in your PC because a game like DOOM doesn't support CF.
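To put rough numbers on the "highs don't make up for the lows" point, here's a toy sketch. The per-game scaling factors and the 1070's lead are made-up assumptions for illustration, not benchmarks:

```python
# Toy model: average CF performance across games with mixed scaling.
# All figures below are illustrative assumptions, not measurements.
single_rx480 = 100                        # normalize one RX 480 to 100
cf_scaling = [0.9, 0.8, 0.4, 0.0, 0.0]    # great, good, mediocre, none, none
gtx_1070 = 160                            # assumed lead over a single 480

cf_results = [single_rx480 * (1 + s) for s in cf_scaling]
print(cf_results)                         # [190.0, 180.0, 140.0, 100.0, 100.0]
print(sum(cf_results) / len(cf_results))  # 142.0 average vs a consistent 160
```

Even with two games scaling beautifully, the zero-scaling titles drag the average below the consistent single card, and the lows are what you actually feel in play.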
 

ethomaz

Banned
The power draw increase with clock speed is really bad for Polaris.

About the benchmarks, I don't know what AMD did, but GCN 1.4 looks to perform on par with or even below previous GCN... there is little to zero architectural performance increase.

Waiting for the 29th to change this and show better results... for now, a disappointment.
 
D

Deleted member 465307

Unconfirmed Member
The power draw increase with clock speed is really bad for Polaris.

About the benchmarks, I don't know what AMD did, but GCN 1.4 looks to perform on par with or even below previous GCN... there is little to zero architectural performance increase.

Waiting for the 29th to change this and show better results... for now, a disappointment.

I keep seeing people mention the drivers. Could that be why it seems to be the same or worse?
 

chaosblade

Unconfirmed Member
I keep seeing people mention the drivers. Could that be why it seems to be the same or worse?

Seems likely. Updated drivers were only provided to reviewers, so anyone else who got cards early doesn't have drivers that support the new architecture. If the Polaris (and Vega) cards are literally only improvements in power consumption with no architectural improvements (or even a regression), that would be pretty disappointing.

Supposedly some of the leaks have been from the new drivers, but at this point it's better to just not put much stock in anything until reviews hit.
 
Power consumption does in fact sit around 100-110 watts under load while gaming. The max power draw while benchmarking at 1300+ MHz, as seen in the links above, is 147 watts.

Is that real?

30-40 percent more power consumption with a 6 percent overclock?

Wtf
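For what it's worth, that's plausible: dynamic power scales roughly with frequency times voltage squared, so a small clock bump plus the extra voltage needed to hold it compounds quickly. A back-of-the-envelope sketch, where the clock and voltage figures are assumptions for illustration:

```python
# Rough dynamic-power estimate: P ~ f * V^2 (capacitance folded into a
# constant). Clocks and voltages below are assumed values, not measured ones.
base_clock, oc_clock = 1266, 1340   # MHz, roughly a 6 percent overclock
base_volt, oc_volt = 1.05, 1.15     # V, assumed bump needed for stability

scale = (oc_clock / base_clock) * (oc_volt / base_volt) ** 2
print(f"estimated power scaling: {scale:.2f}x")  # ~1.27x from f * V^2 alone

measured = 147 / 110                # the wattages quoted above
print(f"reported jump: {measured:.2f}x")         # ~1.34x, i.e. 30-40 percent
```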
 

FoxSpirit

Junior Member
I am glad I don't care about OC one bit. Squeeze out 20% more performance? Except in fringe cases, that 20% is nothing. In return, you get much more power draw on any card and more heat, which means more noise. I want the best performance/watt. My HD-6850 is already loud enough, even in the silenced case.

Keeping my fingers crossed.
 

DieH@rd

Banned
If that turns out to be true, then AMD is having issues with this chip on 14nm.

Gaming chips are designed to work in a certain performance/power range, and exceeding it can cause a non-linear power draw increase. But I don't think that's the issue here; rather, it's rare and quick power spikes that can cause instabilities.
 

LCGeek

formerly sane
I am glad I don't care about OC one bit. Squeeze out 20% more performance? Except in fringe cases, that 20% is nothing. In return, you get much more power draw on any card and more heat, which means more noise. I want the best performance/watt. My HD-6850 is already loud enough, even in the silenced case.

Keeping my fingers crossed.

This hasn't even been true of AMD as of late, and I'm not even being nice.

Sure, my 7950s drew a decent bit more power than the Nvidia equivalent, but the idea that they are pumping out heat means you've got case problems or improper cooling. Neither of them ever really pushed past 60-70C in most games, and that's with a moderate fan profile. The same goes for the performance concerns: in some games the card I mentioned could beat a stock 7970 once you were at 1100MHz or more.
 

Chiggs

Member
And please, if you got the idea of CF'ing two mid-range GPUs like these, stop. There are more games with either zero or mediocre CF scaling, where this setup will be inferior to buying a GTX 1070 for the same price, than games where CF scaling is so good that you catch up to a 1080. The highs don't make up for the lows, which include having a 200+ dollar brick in your PC because a game like DOOM doesn't support CF.

I've got two R9 390Xs in Crossfire and I've had a fantastic time with them. The brute force is incredible, and THERE ARE PLENTY of games that support both Crossfire and SLI.

Your comment/opinion that the highs don't make up for the lows is entirely subjective, and one that many on this forum would challenge.
 
I am glad I don't care about OC one bit. Squeeze out 20% more performance? Except in fringe cases, that 20% is nothing. In return, you get much more power draw on any card and more heat, which means more noise. I want the best performance/watt. My HD-6850 is already loud enough, even in the silenced case.

Keeping my fingers crossed.

On most GPUs these days a medium OC adds maybe 10-15 watts of power consumption, which is nothing.

Unlike 20 percent performance, which is a LOT.

That's enough to make frame drops to 50 fps (aka repeating frames with vsync -> microstutter) completely go away, as they'll now be 60 fps.
A different GPU model with 20 percent more performance can often cost 100 or 200 dollars more.

It's literally free performance.

The GTX 980 reference is about 20-25 percent faster than a GTX 970 reference, and every GTX 970 can overclock to be about 20 percent faster than reference clocks.
Person A is willing to overclock and gets near GTX 980 performance (obviously if he had a 980 and OC'd it he'd get even more performance).

You're not willing to OC, so you'd do what? Pay 200 dollars more for a 980?

Currently the RX 480 seems to be less than 20 percent faster than the RX 470, but it costs 50 percent more. Since 20 percent is nothing, why buy the 480 and not just get the 470? It's a lot cheaper.

Same with CPUs, btw: go from a 3.9GHz boost clock to a 4.4GHz OC and it's a free 10-15 percent performance. You won't hear me complain about that.
Hell, the old Sandy Bridge CPUs had a very low stock clock, and overclocking those gave you over 30 percent extra performance.

Having good overclocking headroom on hardware is awesome and adds a ton of value to it.
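As a rough sketch of that value argument, here are perf-per-dollar numbers under this post's own assumptions (launch-ish prices and the percentages above; none of these are benchmarks):

```python
# Perf per dollar under the assumptions argued above: a reference GTX 970 as
# the 100-point baseline, a 980 ~22% faster for ~$200 more, a ~20% free OC.
cards = {
    "GTX 970 stock": (330, 100),  # (assumed price in dollars, relative perf)
    "GTX 980 stock": (530, 122),
    "GTX 970 OC'd":  (330, 120),
}
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")
# GTX 970 stock: 30.3 / GTX 980 stock: 23.0 / GTX 970 OC'd: 36.4
```

The OC'd 970 lands within a few points of the stock 980 while costing 200 dollars less, which is the whole point.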
 
D

Deleted member 59090

Unconfirmed Member
Currently the RX 480 seems to be less than 20 percent faster than the RX 470, but it costs 50 percent more. Since 20 percent is nothing, why buy the 480 and not just get the 470? It's a lot cheaper.

200 is not 50% more than 150.
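At the announced prices it's $199 vs. $149, and 199 / 149 ≈ 1.34, so the 480 is about 34 percent more expensive, not 50.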
 

AmyS

Member
Posted?

AMD’s Raja Koduri Celebrates Vega 10 GPU Development Milestone – Next-Gen, HBM2 Powered Chips For Radeon 2017 Family

According to a tweet by Raja Koduri (SVP and Chief Architect of Radeon Technologies Group), AMD has achieved a development milestone for their upcoming Vega 10 GPU series. The tweet by Raja confirms that AMD is on track with the development of their next-generation graphics products, which are expected to launch next year in the Radeon family.



https://twitter.com/GFXChipTweeter/
http://wccftech.com/amd-vega-10-gpu-milestone/
 

FoxSpirit

Junior Member
On most GPUs these days a medium OC adds maybe 10-15 watts of power consumption, which is nothing.

Unlike 20 percent performance, which is a LOT.

That's enough to make frame drops to 50 fps (aka repeating frames with vsync -> microstutter) completely go away, as they'll now be 60 fps.
A different GPU model with 20 percent more performance can often cost 100 or 200 dollars more.

It's literally free performance.

The GTX 980 reference is about 20-25 percent faster than a GTX 970 reference, and every GTX 970 can overclock to be about 20 percent faster than reference clocks.
Person A is willing to overclock and gets near GTX 980 performance (obviously if he had a 980 and OC'd it he'd get even more performance).

You're not willing to OC, so you'd do what? Pay 200 dollars more for a 980?

Currently the RX 480 seems to be less than 20 percent faster than the RX 470, but it costs 50 percent more. Since 20 percent is nothing, why buy the 480 and not just get the 470? It's a lot cheaper.

Same with CPUs, btw: go from a 3.9GHz boost clock to a 4.4GHz OC and it's a free 10-15 percent performance. You won't hear me complain about that.
Hell, the old Sandy Bridge CPUs had a very low stock clock, and overclocking those gave you over 30 percent extra performance.

Having good overclocking headroom on hardware is awesome and adds a ton of value to it.

Hmm, good point for the 470. Might get that one. I mean, I can run Overwatch pretty well with my old HD-6850: stable 50 fps, with the biggest caveats being 92% render scale and low textures. I could go to a stable 70, but lighting set to Ultra makes too much of a difference for me in visual clarity. Still, a good idea. Then again, I heard the new Nvidia 1060 could also seriously deliver on price and performance/watt. Waiting a bit more.
 

Kayant

Member
Via - http://videocardz.com/62250/amd-vega10-and-vega11-gpus-spotted-in-opencl-driver

The following list can be found in the OpenCL driver that is present in the latest Crimson software.

  • SI: TAHITI
  • CI / GFX7: MILOS, KRYPTOS, HAWAII, NEVIS, PENNAR, BONAIRE, Kabini
  • VI / GFX8: ICELAND, TONGA, CARRIZO, BERMUDA, racerx, FIJI
  • GFX81: AMUR, STONEY, ELLESMERE, DERECHO
  • GFX9: GREENLAND, RAVEN1X, VEGA10, VEGA11

The SI, CI, VI and GFX prefixes stand for GPU generations. The latest, as-yet-unreleased architecture is GFX9, which includes Greenland, Raven1X, Vega10 and Vega11. For quite some time Greenland was rumored to be just another codename for Vega10, but since it's listed separately, we should assume that Greenland is something else, probably an integrated graphics chip.
 

Locuza

Member
It's very interesting to see Greenland and Vega10 listed separately.
There is Ellesmere (Polaris 10), but no Polaris 10 naming directly.

Greenland is a 4096-ALU GPU with a 2048-bit HBM2 interface, a 1:2 DP:SP ratio and 4 GMI links.
I thought that Vega10 was probably the new name for Greenland, like Polaris 10 was for Ellesmere or Polaris 11 for Baffin.
It would surprise me if AMD has the resources to pull off two separate chips, one for HPC clients and one for "normal" consumers, like Nvidia will do this round with GP100 and GP102.

But GPUs aside, it seems like Raven1X is the integrated GPU for Raven Ridge, which would be really nice, especially if the Vega IP finally hits DX12 FL12.1.
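Boiling the thread's codename chatter down to a quick reference (a sketch; the Greenland and Raven1X entries are speculation from this discussion, not anything AMD has confirmed):

```python
# Codename -> product mapping as discussed in this thread. Entries marked
# "speculation" are guesses from the discussion, not confirmed by AMD.
codename_to_product = {
    "ELLESMERE": "Polaris 10 (RX 480/470)",
    "BAFFIN":    "Polaris 11",
    "VEGA10":    "Vega, 2017 Radeon family",
    "VEGA11":    "Vega, 2017 Radeon family",
    "GREENLAND": "speculation: separate HPC chip, or old Vega10 codename",
    "RAVEN1X":   "speculation: Raven Ridge integrated GPU",
}
for codename, product in codename_to_product.items():
    print(f"{codename}: {product}")
```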
 