
AMD R9 390X benchmarks leak

x3sphere

Member
4GB on HBM is more than enough, and asking for more for games is a little crazy.

4GB is fine for the step down 390, but the flagship should really have 8GB if they want to compete with NV.

I guarantee if it ends up being 4GB at a similar price point to the 980 Ti, Nvidia will outsell them handily. People generally aren't looking for "just good enough" in a flagship card.
 
I just need it to play games. My monitor is 1080p and uses HDMI. I would like it if there were two HDMI outputs on it, if possible.
Most cards do not have 2x HDMI on them, but rather a mishmash of HDMI, DVI, and DisplayPort. You would need to look for an OEM variant for that.
What kind of games do you play / want to play?
Right, so everyone on this board better toss their 980 SLI, 970 SLI, 980, and 970 rigs in the garbage.

No, I am just stating that 4GB isn't exactly the best idea for a new top-end card coming out in June of 2015. Especially if it has all this shading power and bandwidth that cannot be put to good use because it runs out of frame buffer.
 

mkenyon

Banned
I'm going to buy another GTX 970 to do SLI and have better performance than the 390X.
Assuming that the game has SLI profiles. And that those SLI profiles have solid frame delivery. You do have to deal with that 1 frame of input lag though.

:p

Have you done this before?
 

LCGeek

formerly sane
If it comes out at around $600-$700 and beats out the Titan X, then it will sell like hotcakes. Beating out the 980, 980 Ti, and Titan X at a more affordable price will be damn nice. However, I'll probably just add another GTX 970 and beat out all of those single cards :).

The beauty of new cards is the price drops that will be incoming if the 390X is worth it; they have me salivating. Last time this happened, I got two midrange cards, a 680 and a 7950, at disgustingly cheap prices compared to what they cost just weeks before the drops.

GTA5 has sealed my single-GPU desires; until something midrange comes out that does it at a locked (or near locked) 1080p, I'm not bothering with a new GPU or system. If The Witcher is the same, my hunch will be justified until the end of this generation.
 

Vuze

Member
What's the best graphics card to capture footage of your desktop at the highest framerate?

Like 60fps 1080p with several video overlays playing? Possible?

I heard Shadowplay has limitations and the games have to be fullscreen.
You can record your whole desktop with Shadowplay. I had several games where the game-only hook didn't work; they were captured just fine with the desktop hook.

As for limitations, I dunno. It can capture up to 4K @ 30 or 60 fps regardless of resolution; not sure if higher fps is possible, but I only have a 60Hz monitor, so maybe that's the issue. I really like Shadowplay; it's a great feature.
 

mkenyon

Banned
Exactly. Like Lambos, these are "high end" products.



What's the best graphics card to capture footage of your desktop at the highest framerate?

Like 60fps 1080p with several video overlays playing? Possible?

I heard Shadowplay has limitations and the games have to be fullscreen.
A dedicated capture card. Any other answer is wrong.

Both NVIDIA and AMD use an onboard H.264 encoder to capture footage.
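
For what it's worth, you can drive that same onboard encoder yourself. A minimal sketch, assuming Windows, ffmpeg on the PATH with NVENC support, and an NVIDIA card (swap h264_nvenc for h264_amf on AMD); this is not what Shadowplay does internally, just the same hardware block:

```python
# Capture the whole desktop (overlays included) using the GPU's H.264 encoder.
# Assumptions: Windows, ffmpeg installed with NVENC support.
import subprocess

cmd = [
    "ffmpeg",
    "-f", "gdigrab",        # screen-grab input device on Windows
    "-framerate", "60",     # ask for 60 fps capture
    "-i", "desktop",        # grab the entire desktop, not a single window
    "-c:v", "h264_nvenc",   # hardware H.264 encode; keeps CPU load low
    "-b:v", "20M",          # example bitrate, tune to taste
    "capture.mp4",
]
subprocess.run(cmd, check=True)
```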
 
Most cards do not have 2x HDMI on them, but rather a mishmash of HDMI, DVI, and DisplayPort. You would need to look for an OEM variant for that.
What kind of games do you play / want to play?


No, I am just stating that 4GB isn't exactly the best idea for a new top-end card coming out in June of 2015. Especially if it has all this shading power and bandwidth that cannot be put to good use because it runs out of frame buffer.

You said Star Citizen at 1080p uses over 4GB of RAM. What does that mean for 970s, 980s, and their SLI rigs then? If a 4GB 980 will play the game fine, so will a 390X.
 
Assuming that the game has SLI profiles. And that those SLI profiles have solid frame delivery. You do have to deal with that 1 frame of input lag though.

:p

Have you done this before?

That's why I don't care for SLI... all that money for annoyance? PCs have enough of that already. :)
 

mkenyon

Banned
You said Star Citizen at 1080p uses over 4GB of RAM. What does that mean for 970s, 980s, and their SLI rigs then? If a 4GB 980 will play the game fine, so will a 390X.
They probably mean with some ludicrously high AA type settings.

SC runs fine on my 780 Ti KPE with 3GB of memory at 1440p.
 

JJKillaNOLE

Neo Member
Very few titles actually use anywhere near 4GB of memory. Smart games will use ALL of the VRAM you have in order to cache assets. Give them 12GB of VRAM and they may very well fill it.

Monitoring GPU memory usage while playing many titles doesn't give you a real indication of how much VRAM is required. If you're loading Skyrim texture packs, it's useful because you have a baseline number and know that the engine won't cache assets up to your VRAM max normally.

I can remember not too long ago, summer of last year, reading threads on how 2GB of VRAM was more than enough for 1080p. This was when I had my 770 and was thinking about upgrading. 4GB is not enough. And the people who think otherwise are fine with turning down settings like textures and AA, something most avid PC gamers who spend top dollar on a video card don't want to do.
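
On the monitoring point above: utilities only show what the driver has allocated, not what a game strictly needs, since caching engines will fill whatever is there. A minimal polling sketch, assuming an NVIDIA card and the pynvml bindings (the 1-second interval is just illustrative):

```python
# Poll allocated VRAM once per second. Allocated != required: engines that
# cache aggressively will happily fill whatever memory is available.
import time
import pynvml  # NVIDIA's NVML Python bindings (assumed installed)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```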
 

mnannola

Member
Anyone thinking this is going to be less than $700 is crazy. This thing is beating a $1,000 card if these benchmarks are to be believed.
 
They probably mean with some ludicrously high AA type settings.

SC runs fine on my 780 Ti KPE with 3GB of memory at 1440p.

You are most likely right. He threw that fact out there like it was some deterrent to buying a 390X 4GB. My money is on the card having 4GB and 8GB versions (yes, it's possible with HBM).

Also, if there were a Titan X at $700 with 4GB of RAM, this board would be ALL over it.
 

Genio88

Member
Is this new HBM going to be really important gaming-wise, in your opinion? I mean, for example, overclocking a GPU's memory clock speed gives almost no fps boost in games, unlike the GPU core clock. Plus, I don't think developers will take advantage of it, given that consoles still have DDR3 and GDDR5 memory, and we have seen with the recent Witcher 3 how consoles influence the PC version's development.
 
A dedicated capture card. Any other answer is wrong.

Both NVIDIA and AMD use an onboard H.264 encoder to capture footage.

Can you loop footage back into a dedicated capture card from an HDMI graphics card residing in the same PC? It's so I don't need to buy a separate PC to capture stuff.

I'll settle for the built-in recording options of a graphics card as long as they can just capture desktop footage without capture limitations (like games having to be fullscreen, several video overlays running at once, etc.).

Edit: OK, just read about Shadowplay being able to do this. Thanks.
 
Anyone thinking this is going to be less than $700 is crazy. This thing is beating a $1,000 card if these benchmarks are to be believed.

AMD will be the ones being crazy if they price it above 700 bucks. The Titan X holds so little of the market share. Unless they want their rebrands/refreshes to go up against the 900s, which the "new" 360/370/380 will fail to beat.
 

mkenyon

Banned
SLI'd? Nope. Never had the money for it.
If you stick to big budget AAA type games, you'll generally be okay on the SLI profile side of things. If you play anything outside of that, it's basically a dice throw on whether or not SLI will be working.

Even in some cases of AAA, you won't have profiles on Day 1 of release, perhaps not even week 1. Sometimes never.

In both cases, you can generally find workarounds to get it working, but that takes 5-30 mins or more of getting the right custom profiles. When those get loaded, you might get uneven frame pacing, which is seen on your end as stuttering, or often labeled as "micro stuttering".

If you like tinkering, none of that is really a hassle, and part of the fun of having a high performance PC. If you want things to just work though, it can give you headaches.
Can you loop footage back into a dedicated capture card from an HDMI graphics card residing in the same PC? It's so I don't need to buy a separate PC to capture stuff.

I'll settle for the built-in recording options of a graphics card as long as they can just capture desktop footage without capture limitations (like games having to be fullscreen, several video overlays running at once, etc.).

Edit: OK, just read about Shadowplay being able to do this. Thanks.
Yes, you can loop it back into the same PC. It's also going to give you less of a performance hit than Shadowplay/Game DVR. Game DVR is AMD's version of Shadowplay, and works totally the same.

Both Game DVR and Shadowplay will probably work just fine if you're not after super high frame rates (100 FPS+).
 

tuxfool

Banned
How the hell AMD still can't get their power consumption under control after all these years is frustrating. It may not matter to some, but for me it is a big issue, as electricity costs are out of control here and my computer is nearly always on.

You are aware that cards draw less power at idle?

Unless your game is running 24/7 or you're constantly running simulations using compute, the fact that your computer is on all the time isn't all that relevant.
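
Rough arithmetic backs that up; the wattages, hours, and electricity price below are assumed round numbers, not measurements:

```python
# Back-of-the-envelope daily/yearly cost for a GPU that idles most of the day.
# All figures are assumptions for illustration only.
IDLE_W, LOAD_W = 15, 300          # assumed idle vs. gaming draw in watts
GAMING_HOURS = 3                  # assumed gaming hours per day
PRICE_PER_KWH = 0.25              # assumed electricity price

idle_hours = 24 - GAMING_HOURS
kwh_per_day = (idle_hours * IDLE_W + GAMING_HOURS * LOAD_W) / 1000
idle_share = (idle_hours * IDLE_W) / (kwh_per_day * 1000)

print(f"{kwh_per_day:.2f} kWh/day, ~{kwh_per_day * 365 * PRICE_PER_KWH:.0f} per year")
print(f"Idle is {idle_hours}/24 hours but only {idle_share:.0%} of the GPU's energy")
```

Even with the machine on 24/7, the idle hours end up being a small slice of the card's energy use under these assumptions; the draw during actual gaming dominates.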
 

bj00rn_

Banned
Right, so everyone on this board better toss their 980 SLI, 970 SLI, 980, and 970 rigs in the garbage.

That's exactly what some of us will do around the time the game is finally released. Most of our rigs will be upgraded for VR well before then anyway :)
 
You are aware that cards draw less power at idle?

It's a losing battle. Nvidia has got everyone thinking that perf/watt is the be-all and end-all of metrics now. The whole green "efficiency" thing.

That's why Nvidia users are buying midrange cards at $600 and top-end cards at $1k.

The 980's (GM204) lineage is that of a midrange card; not one person can deny that.
 

tuxfool

Banned
Is this new HBM going to be really important gaming-wise, in your opinion? I mean, for example, overclocking a GPU's memory clock speed gives almost no fps boost in games, unlike the GPU core clock. Plus, I don't think developers will take advantage of it, given that consoles still have DDR3 and GDDR5 memory, and we have seen with the recent Witcher 3 how consoles influence the PC version's development.

It will certainly provide advantages at higher resolutions.
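
To put rough numbers on it: peak bandwidth is just bus width times effective data rate. A quick sketch with assumed, round example configurations (not the leaked card's actual specs):

```python
# Theoretical peak memory bandwidth = (bus width in bits / 8) * data rate (Gbps).
# The configurations below are rough, assumed examples for comparison only.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "256-bit GDDR5 @ 7 Gbps (980-class)":  (256, 7.0),
    "512-bit GDDR5 @ 5 Gbps (290X-class)": (512, 5.0),
    "4096-bit HBM1 @ 1 Gbps":              (4096, 1.0),
}
for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")  # 224 / 320 / 512
```

The much wider HBM interface roughly doubles the GDDR5 numbers in this example, and higher resolutions are exactly where that extra headroom gets used.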
 
Can anyone explain how faster and bigger memory bandwidth can have tangible effects on games?

Especially with recently released games like The Witcher, and older games where HBM isn't really the standard.
 
You said Star Citizen at 1080p uses over 4GB of RAM. What does that mean for 970s, 980s, and their SLI rigs then? If a 4GB 980 will play the game fine, so will a 390X.

They probably mean with some ludicrously high AA type settings.

SC runs fine on my 780 Ti KPE with 3GB of memory at 1440p.

No, just normal 1080p with only SMAA1x. This is without any enemy ships even.
[Star Citizen screenshot]
Having played Star Citizen on a 970 before, I can say the game stutters at 1080p due to VRAM usage going above the limit.
 

Irobot82

Member
You're fooling yourselves if you think any card will run SC at 60fps fully finished. At least not until 14nm, and then probably another arch revision after that.
 
If you stick to big budget AAA type games, you'll generally be okay on the SLI profile side of things. If you play anything outside of that, it's basically a dice throw on whether or not SLI will be working.

Even in some cases of AAA, you won't have profiles on Day 1 of release, perhaps not even week 1. Sometimes never.

In both cases, you can generally find workarounds to get it working, but that takes 5-30 mins or more of getting the right custom profiles. When those get loaded, you might get uneven frame pacing, which is seen on your end as stuttering, or often labeled as "micro stuttering".

If you like tinkering, none of that is really a hassle, and part of the fun of having a high performance PC. If you want things to just work though, it can give you headaches.

Yes, you can loop it back into the same PC. It's also going to give you less of a performance hit than Shadowplay/Game DVR. Game DVR is AMD's version of Shadowplay, and works totally the same.

Both Game DVR and Shadowplay will probably work just fine if you're not after super high frame rates (100 FPS+).

I mean, which (graphically) high-end games out right now don't have SLI profiles? Also, is it really never straightforward to get it working?
 

mkenyon

Banned
I mean, which (graphically) high-end games out right now don't have SLI profiles? Also, is it really never straightforward to get it working?
As long as you keep your drivers up to date, most biggish-budget new releases will have profiles in the first week. Otherwise, no, it's not straightforward. Tribes: Ascend, for example, never had SLI profiles. CoD: Black Ops didn't have working profiles for at least the first three weeks or so. For more recent titles, Elder Scrolls Online, Shadows of Mordor, Titanfall, and GAF's favorite Dark Souls/Dark Souls II didn't have SLI profiles for months. I'm not sure if all of them even have SLI profiles currently.

Most of the games I play (competitive and multiplayer games - which means a decent chunk of F2P games like Planetside 2 or H1Z1), never get SLI profiles.

In most of these cases you can generally find workarounds through custom profiles, though they're never quite as smooth.
That is exactly why people spend 700 USD on a GPU, to turn down their texture resolution settings...
I spent $1200 on mine between the card and waterblock. $750 on my display (RoG Swift). I don't really care about textures in comparison to high frame rates. So, yes, that is what some people do :p
 
As long as you keep your drivers up to date, most biggish-budget new releases will have profiles in the first week. Otherwise, no, it's not straightforward. Tribes: Ascend, for example, never had SLI profiles. CoD: Black Ops didn't have working profiles for at least the first three weeks or so. For more recent titles, Elder Scrolls Online, Shadows of Mordor, Titanfall, and GAF's favorite Dark Souls/Dark Souls II didn't have SLI profiles for months. I'm not sure if all of them even have SLI profiles currently.

Most of the games I play (competitive and multiplayer games - which means a decent chunk of F2P games like Planetside 2 or H1Z1), never get SLI profiles.

In most of these cases you can generally find workarounds through custom profiles, though they're never quite as smooth.

I spent $1200 on mine between the card and waterblock. $750 on my display (RoG Swift). I don't really care about textures in comparison to high frame rates. So, yes, that is what some people do :p
I also put HFR ahead of IQ. But turning down texture quality is like a punch to the gut due to how most games handle texture quality :/
 

mkenyon

Banned
I also put HFR ahead of IQ. But turning down texture quality is like a punch to the gut due to how most games handle texture quality :/
Subjective.

In my gaming circle, the first thing everyone does is crank down all the settings until the frame rate doesn't drop below 120+. In anything competitive (which SC would qualify as for us), we'll turn down every setting just to make the core gameplay clearer and easier to see; things like shadows and post-processing are almost always the first to go.

Not everyone drops tons of money on high end PC gear to see high IQ.
This has to be fake. 4GB of VRAM on a flagship card? Nahhh

But if so

lol
There will undoubtedly be 8GB variants through the AIBs. There are with the 290X even.
 

pestul

Member
These benchmarks are two months old and I don't trust them for a second. I also think the power consumption will be lower than the R29x series'.
 

Kambing

Member
I'm going to feel pretty shit if this outperforms the Titan X, while costing less... At least competition is a good thing?
 

golem

Member
I mean, which high end (graphically) games out right now that don't have SLI profiles? Also, is it really never straight forward on getting it to work?
GTAV stutters like crazy for me in SLI. Ryse has flashing decals which are super annoying. Divinity Original Sin is unplayable in SLI mode. Far Cry 4 shadows flicker in full screen mode.

ACU on the other hand works and looks great (although performance is still subpar overall)
 
As long as you keep your drivers up to date, then most bigish budget new releases will have profiles in the first week. Otherwise, no, it's not straight forward. Tribes: Ascend, for example, never had SLI profiles. CoD: Black Ops didn't have working profiles for at least the first threeish weeks. For more recent titles, Elder Scrolls Online, Shadows of Mordor, Titanfall, and GAF's favorite Dark Souls/Dark Souls II didn't have SLI profiles for months. I'm not sure if all of them even do have SLI profiles currently.

Most of the games I play (competitive and multiplayer games - which means a decent chunk of F2P games like Planetside 2 or H1Z1), never get SLI profiles.

In most of these cases, you can generally find workarounds through custom profiles though. They're never quite as smooth though.

I spent $1200 on mine between the card and waterblock. $750 on my display (RoG Swift). I don't really care about textures in comparison to high frame rates. So, yes, that is what some people do :p

GTAV stutters like crazy for me in SLI. Ryse has flashing decals which are super annoying. Divinity Original Sin is unplayable in SLI mode. Far Cry 4 shadows flicker in full screen mode.

ACU on the other hand works and looks great (although performance is still subpar overall)

The problem is, I don't exactly want to move away from Nvidia; it just doesn't help that the 980 and Titan X are super fucking expensive. I'm sure the 980 Ti will be pretty pricey too.
 