
DF: [Budget 4k] Destiny 2 vs GTX 970/ GTX 1060: 30fps Is Easy, But What About 60?

MTC100

Banned
Things seem to have slowed down because AMD isn't competing, and Nvidia is delaying products. So the progressive loss of optimization is being slowed by the fact that drivers aren't moving on as fast as before.

It happened to CPUs before, now it's happening to GPUs too.

I actually read an article about this phenomenon a few months ago; it came to the conclusion that it indeed isn't as easy as it was ages ago to increase the processing power of GPUs and CPUs. It may look like Intel or Nvidia are milking their customers when in fact we hit a technological plateau a while ago.

All we can do now is wait for smaller manufacturing processes so Nvidia, AMD and Intel can cram more transistors into their GPUs and CPUs. I also wouldn't expect too much of a PS5 in the near future; I would be surprised if it had more horsepower than a non-Ti GTX 1080 (perhaps even a 1070). That's fine though, since the bottleneck with consoles is the CPU; let's hope AMD somehow figures out a way to build an APU with a powerful CPU.
 

Polygonal_Sprite

Gold Member
1) I made a general case that applies to titles that are badly optimized and rushed. Picking what appears to be one of the best-optimized titles of the year is just cherry-picking.

As someone who paid £850 for an i5 4690k @ 4.5GHz / 970 equipped PC, looking to move over to the platform as my main place to play the latest AAA games at 60fps instead of 30fps on console, I absolutely agree.

Destiny 2 being a fantastic port doesn't make up for the shit show that was Assassin's Creed Unity, Ryse, Arkham Knight, Mortal Kombat X, Pro Evolution Soccer, No Man's Sky, Forza Horizon 3 and Dishonored 2, all of which fell short of 60fps on my PC even when dramatically lowering settings like shadow and AA quality. Sure, those games may work fine now, but at launch and for months after they were awful unless you had crazy hardware to just brute-force 60fps performance. The truth of the matter is that most AAA console ports to PC are cheap and extremely low effort, in the same ballpark as the effort put into Wii U ports.

If you complain online about not getting 1080/60 performance out of a 970, you always get the same answer from PC players: "I don't play those sorts of games on PC." Which is fine and all, but when most of the people saying that also claim PC gaming is better than consoles in every way, even on extremely low-end hardware, it's not much help.

Sure, a 970 can easily match the standard PS4's 30fps performance, but in most cases it falls way short of the same game with the same visual settings at a locked 60fps (the reason I went with that hardware), and even further short of the PS4 Pro's 1600p/1800p performance.

Goalposts are moved all the time in PC gaming with the advent of new GPUs, so I think it's unfair to discount the PS4 Pro from these discussions, especially because at almost a third of the price I paid for my PC I can play the latest AAA games looking great, very close to the image quality of native 4K.

I'm glad Destiny 2 is one of the few console ports that is actually optimised. Enjoy!
 

rtcn63

Member
Sure, a 970 can easily match the standard PS4's 30fps performance, but in most cases it falls way short of the same game with the same visual settings at a locked 60fps (the reason I went with that hardware), and even further short of the PS4 Pro's 1600p/1800p performance.

I was playing games at 1080p/60fps and at visual settings comparable to, if not better than, a base PS4... with an i5 + GTX 670 (most of the time, at least). And IIRC, the GTX 750 Ti was recommended by most of the major sites for budget PC users looking for PS4/XB1-ish settings and performance.

There are a lot of PC ports that aren't total trash. I'd even dare say that ones like Dishonored 2 and Arkham Knight are in the minority. Some, like Just Cause 3, are terrible on consoles as well.
 
As someone who paid £850 for an i5 4690k @ 4.5GHz / 970 equipped PC, looking to move over to the platform as my main place to play the latest AAA games at 60fps instead of 30fps on console, I absolutely agree.

Destiny 2 being a fantastic port doesn't make up for the shit show that was Assassin's Creed Unity, Ryse, Arkham Knight, Mortal Kombat X, Pro Evolution Soccer, No Man's Sky, Forza Horizon 3 and Dishonored 2, all of which fell short of 60fps on my PC even when dramatically lowering settings like shadow and AA quality. Sure, those games may work fine now, but at launch and for months after they were awful unless you had crazy hardware to just brute-force 60fps performance. The truth of the matter is that most AAA console ports to PC are cheap and extremely low effort, in the same ballpark as the effort put into Wii U ports.

This is not true; I don't know how you came to that conclusion.

If you complain online about not getting 1080/60 performance out of a 970, you always get the same answer from PC players: "I don't play those sorts of games on PC." Which is fine and all, but when most of the people saying that also claim PC gaming is better than consoles in every way, even on extremely low-end hardware, it's not much help.

Sure, a 970 can easily match the standard PS4's 30fps performance, but in most cases it falls way short of the same game with the same visual settings at a locked 60fps (the reason I went with that hardware), and even further short of the PS4 Pro's 1600p/1800p performance.

Goalposts are moved all the time in PC gaming with the advent of new GPUs, so I think it's unfair to discount the PS4 Pro from these discussions, especially because at almost a third of the price I paid for my PC I can play the latest AAA games looking great, very close to the image quality of native 4K.

I'm glad Destiny 2 is one of the few console ports that is actually optimised. Enjoy!

What games are you talking about? I've had a GTX 970 since November 2014, and playing at 30 fps hasn't even been an option for me. I've always been able to run games at greater than (or, in the worst case, equal to) PS4 graphical settings at 1080p 60 fps. I've even had the option of playing at higher resolutions, such as 1440p and above, at the PS4's usual 30 fps target; 1440p alone is a 77.7% increase in pixel count over 1080p. But I prioritize gameplay, so I always go for 60 fps over 30. I don't even recall the worst-case scenarios; I'm not sure there have been any.

CPU: i7 4790k 4.7GHz
RAM: Kingston HyperX Savage 16GB (2x8GB) 2400MHz DDR3
GPU: Gigabyte G1 GTX 970 (Core - 1530MHz, Memory - 7516MHz)
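
The pixel math behind that resolution claim is easy to verify; a minimal sketch in Python (plain arithmetic, no thread-specific assumptions):

```python
# Resolution cost scales with pixel count (area), not with the
# "1080 vs 1440" line counts alone.
def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 px
p1440 = pixels(2560, 1440)  # 3,686,400 px

increase = (p1440 / p1080 - 1) * 100
print(f"1440p vs 1080p: +{increase:.1f}% pixels")  # +77.8%, i.e. the ~77.7% cited above
```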
 

Philtastic

Member
Well, I'd like to see comparisons then. Sadly it seems no one is doing them properly, which means establishing console settings first and then running the games side by side.



Now you're exaggerating. I've never said the 970 is bested by a PS4, just that on PC it's moving from "max settings" to "average".

It's not like we're prophets looking at the future. I simply noticed that almost all modern titles, from The Division onward, ran like crap on a 770, the same 770 that at PS4 launch was able to double the framerate at console settings. So in the early days the 770 was 2x a PS4; now a 770 runs worse than a PS4 in many cases.

You've made two contradictory statements here: 1) no one knows what the console-equivalent graphical settings are, but 2) the 770 is performing worse over time at console-equivalent settings (which, per statement #1, we don't know). A lot of people argue that console hardware somehow runs better over time, typically because developers become more familiar with programming for the hardware, while PC hardware somehow runs worse over time, typically because more advanced techniques are being implemented that are optimized for consoles and not PCs. While we could subscribe to this conspiracy-esque theory, the more likely scenario is that games on PC just have more advanced effects that can be enabled, which naturally require more power. Tom Clancy's The Division, as named above, has improved shadows, normal maps, parallax occlusion mapping, ambient occlusion, volumetric lighting, depth of field, reflections, and other things on PC, as mentioned in the following Digital Foundry comparison:
http://www.eurogamer.net/articles/digitalfoundry-2016-the-division-face-off

We need to account for these additional features and higher resolution/higher accuracy effects that become the new baseline in PC games as time marches on, which is a very good potential reason for why GPUs that could handle ultra settings at a console launch might now be reduced to medium or high settings. As you alluded to, however, we often don't know what the equivalent PC settings are to match consoles, and maybe those settings don't even exist due to changes in how shadow accuracy/resolution, reflections, etc. are handled. More demanding effects being implemented on PC is a much more rational explanation for why older GPUs can't handle ultra-level effects rather than developers optimizing better for consoles (which use PC hardware these days) while being lazy on PC optimization.
 

Magwik

Banned
I actually read an article about this phenomenon a few months ago; it came to the conclusion that it indeed isn't as easy as it was ages ago to increase the processing power of GPUs and CPUs. It may look like Intel or Nvidia are milking their customers when in fact we hit a technological plateau a while ago.

All we can do now is wait for smaller manufacturing processes so Nvidia, AMD and Intel can cram more transistors into their GPUs and CPUs. I also wouldn't expect too much of a PS5 in the near future; I would be surprised if it had more horsepower than a non-Ti GTX 1080 (perhaps even a 1070). That's fine though, since the bottleneck with consoles is the CPU; let's hope AMD somehow figures out a way to build an APU with a powerful CPU.
The boost to next consoles will almost certainly come from the CPUs.
 

Zeneric

Member
Yeah, the 970 easily handles Destiny 2 at 1080/60, no problem. I was able to get 90-120 fps out of the card in the Destiny 2 PC beta on High settings, with other settings set to on (except DoF and Motion Blur, which were turned off; Texture Quality was set to Extra High). The 970 can still handle the latest AAA games on High/Ultra settings at 60 FPS in 1080p. It's still excellent for 1080/60 today and will be for a long while (until next-generation consoles come out that push graphics beyond the current generation).
 
Shit's embarrassing.

PS4 will not beat the 970 GTX.

The 970 GTX is comparable to the Pro.

Anyone with an ounce of common sense knows that the obsolete PC hardware myth is just that.

I had a thread a couple of years ago showing how the 8800GT downclocked by 50% was still matching the Xbox 360 in 2013.

Please stop.
 
While we could subscribe to this conspiracy-esque theory

It's not a "conspiracy theory", it's just plain logic and common sense. Not long ago even Linustech arrived to the same conclusion.

When new games arrive, Nvidia assigns a number of engineers to optimize drivers and effects for that game. Good engineers are a limited resource, so of course you don't make them "waste" their time optimizing a new game for a 7xx card. Something similar was clearly shown with The Witcher 3: at release there was a wide gap between 7xx and 9xx, and the complaints were so loud and widespread at the time that Nvidia assigned some engineers to backport the optimizations and increase performance on the older cards.

By the time a new generation arrives, dev hours will be moved onto the new architecture. Nvidia optimizes to sell hardware; game developers optimize for the most common hardware. Right now the most common still happens to be what I anticipated would be on the way to obsolescence. Add to this the failure of DX12 to deliver a tangible difference and you can see how this effect has slowed down.
 
By the time a new generation arrives, dev hours will be moved onto the new architecture. Nvidia optimizes to sell hardware; game developers optimize for the most common hardware. Right now the most common still happens to be what I anticipated would be on the way to obsolescence. Add to this the failure of DX12 to deliver a tangible difference and you can see how this effect has slowed down.

There is a channel on YouTube called JERM Gaming. It is the home of the Potato Masher, a low-cost used PC based on a GTX 760, a 7xx-generation card. The channel has done a lot of comparisons between the Potato Masher and a PS4.

The Potato Masher: https://www.youtube.com/playlist?list=PLQbCPWtOQp0FoY_-7GwWSErWP2j7--Hh5

This PC has been consistently outperforming PS4 since the series began. Only the most horrendous PC ports like Dishonored 2 and Mafia 3 proved too much of a challenge. What this means to me is that older cards actually have a lot of longevity when you have reasonable expectations.

PC hardware evolves and PC games evolve with it. If you want to keep running everything at max settings then you'll have to upgrade frequently, but if you just want to get the most out of a card, you can go for many years before needing an upgrade.
 
There is a channel on YouTube called JERM Gaming. It is the home of the Potato Masher, a low-cost used PC based on a GTX 760, a 7xx-generation card. The channel has done a lot of comparisons between the Potato Masher and a PS4.

The Potato Masher: https://www.youtube.com/playlist?list=PLQbCPWtOQp0FoY_-7GwWSErWP2j7--Hh5

This PC has been consistently outperforming PS4 since the series began. Only the most horrendous PC ports like Dishonored 2 and Mafia 3 proved too much of a challenge. What this means to me is that older cards actually have a lot of longevity when you have reasonable expectations.

PC hardware evolves and PC games evolve with it. If you want to keep running everything at max settings then you'll have to upgrade frequently, but if you just want to get the most out of a card, you can go for many years before needing an upgrade.

I don't agree with the guy you're replying to, but JERM Gaming's videos are mostly horrible and not a good source for objective comparisons.
 

Snubbers

Member
I found that highly annoying; having a GTX 970, 1440p/locked 60 on mostly highest settings was the perfect sweet spot for me, far nicer than 4K/60ish with less visual glitz. 1080p/60 at highest was arguably not bad either.

I've noticed with GeForce Experience that it always wants you to go 4K on lowish settings rather than 1440p at high-ish settings, which looks night-and-day better to me on a 40" monitor sat 2 feet away...

No wonder we have this resolution-is-everything vibe from developers, where framerates and nice visuals come second...
 
Oh not at all, I've watched every video and I completely disagree.

I've seen videos where he points out that the PC version runs smoother despite the footage clearly showing otherwise, fails to notice the complete lack of anti-aliasing in the PC version, and glosses over huge texture quality differences due to VRAM, which he usually doesn't even mention or calls extremely minor, etc.
 
I've seen videos where he points out that the PC version runs smoother despite the footage clearly showing otherwise, fails to notice the complete lack of anti-aliasing in the PC version, and glosses over huge texture quality differences due to VRAM, which he usually doesn't even mention or calls extremely minor, etc.

Could I trouble you for a link?
 
Can a GTX 960 or 1050 Ti do 1080p 60fps maxed or near-maxed?

Not possible on a 1050 Ti, but surely possible on High.
[attached image: Destiny 2 GPU benchmark chart]
 

Philtastic

Member
It's not a "conspiracy theory", it's just plain logic and common sense. Not long ago even Linustech arrived to the same conclusion.

When new games arrive, Nvidia assigns a number of engineers to optimize drivers and effects for that game. Good engineers are a limited resource, so of course you don't make them "waste" their time optimizing a new game for a 7xx card. Something similar was clearly shown with The Witcher 3: at release there was a wide gap between 7xx and 9xx, and the complaints were so loud and widespread at the time that Nvidia assigned some engineers to backport the optimizations and increase performance on the older cards.

By the time a new generation arrives, dev hours will be moved onto the new architecture. Nvidia optimizes to sell hardware; game developers optimize for the most common hardware. Right now the most common still happens to be what I anticipated would be on the way to obsolescence. Add to this the failure of DX12 to deliver a tangible difference and you can see how this effect has slowed down.
Way to cherry-pick one game where they screwed up their drivers and then fixed them. What you said also isn't relevant to our conversation: even if the 7xx series of cards (or any other series) was not performing as well as it should compared to other PC GPUs, to stay on your point you would need to demonstrate that they were not performing as well as they should compared to consoles. You would also be ignoring how AMD GPUs seem to perform better over time, even when using APIs that don't directly benefit from console optimizations, which is directly the opposite of your claim that PC GPUs in general start lagging behind consoles as the generation wears on.

Case in point: going back to your example of The Division seemingly not performing as well on a 770 compared to the PS4, benchmarks indicate that a 770 matches the 30 fps minimum frame rate of the PS4 at 1080p on the "High" preset and increases its minimum frame rate by ~50% on "Medium":
http://www.pcgamer.com/tom-clancys-the-division-benchmarks-and-optimization-guide/2/

What are the console-equivalent settings? Given all of the rendering quality improvements that I mentioned in my previous post, do console-equivalent settings even exist on PC for The Division?
 

drek_max

Neo Member
The PS4's GPU is a modified AMD 7870.
The PS4 Pro's GPU is a pumped-up AMD 470 (or a pumped-down 480).
In real gaming scenarios, a GTX 970 is almost 2x as powerful as an AMD 7870, and equal to (and sometimes better than) a 480.
Game performance on PC obviously depends on optimization.
A GTX 970 (especially overclocked) in a good PC is capable of delivering 2x the performance of the original PS4, and equal to or better than the PS4 Pro. Whether the devs utilize that horsepower or not is up to them (aka optimization).
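
For reference, the raw-spec side of this comparison can be sketched from the commonly cited shader counts and clocks (the boost clocks here are assumptions for illustration; real-game gaps differ from FLOP ratios because of architecture, bandwidth and, as noted above, optimization):

```python
# Theoretical single-precision throughput: shaders x clock x 2 ops per FMA.
# These are rough upper bounds, not real-game rankings.
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

chips = {
    "PS4 (GCN)":       (1152,  800),
    "HD 7870":         (1280, 1000),
    "GTX 970 (boost)": (1664, 1178),  # assumed typical boost clock
    "PS4 Pro":         (2304,  911),
    "RX 470 (boost)":  (2048, 1206),
}
for name, (shaders, mhz) in chips.items():
    print(f"{name:16s} {tflops(shaders, mhz):.2f} TFLOPS")
# PS4 ~1.84, 7870 ~2.56, 970 ~3.92, Pro ~4.20, 470 ~4.94
```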
 

Durante

Member
Sure, a 970 can easily match the standard PS4's 30fps performance
So basically you completely disagree with the point that was being criticized, which contradicted this (and was stupid and wrong, so good on you for disagreeing).

but in most cases it falls way short of the same game with the same visual settings at a locked 60fps
Well yes, it's not exactly twice as fast as a PS4, so it won't get locked 60 FPS in some games (particularly if they don't run at a locked 30 on PS4 in the first place).

All of this is really rather simple.
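
A quick frame-time sketch of why "not exactly twice as fast" is the whole story for a locked 60 (the 1.7x speedup is an illustrative assumption, not a measurement):

```python
# A locked framerate means the WORST frame fits the budget, not the average.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(f"{frame_budget_ms(30):.1f} ms per frame at 30 fps")  # 33.3 ms
print(f"{frame_budget_ms(60):.1f} ms per frame at 60 fps")  # 16.7 ms

# A GPU ~1.7x faster than a PS4 turns a 33.3 ms console frame into ~19.6 ms:
# well past 30 fps, but still over the 16.7 ms vsync window, so not a locked 60.
speedup = 1.7  # illustrative assumption
print(f"{frame_budget_ms(30) / speedup:.1f} ms")  # ~19.6 ms
```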
 
Shockingly, performance between two chips with similar specs on one architecture is also comparable.

If you go look at performance summaries you can clearly see this isn't the case: a 470 is 30% faster than a 280X. Polaris compared to Tahiti also showed a 20% average perf improvement from the architecture alone. The closed-box nature of a console will further improve on this, and the PS4 Pro has some Polaris+ tweaks.
 

JWiLL

Banned
The 970 is definitely holding up better than I expected, but I still believe the gap is narrowing, and I'd like to see a comparison of Battlefront or COD or Far Cry 5 to see how they run on a 970 at *console settings* (and not just "guesswork").

What about last year's Battlefield 1 and COD? I do remember COD ran very poorly on PC.

You seem to have a very clear and very strange agenda of trying to act like consoles can somehow close the gap with mid-to-high-range PC hardware (which is what the 970 was at launch), but you also have terrible info.

Battlefield 1 ran great on PC basically from launch, and I played the Infinite Warfare campaign at 1440p at 120-144fps consistently on my 1080, all highest settings save for medium shadows and 1x TXAA. The only time it had drops was during the elevator sections on your home ship, which were clearly doubling as loading screens, so it was understandable.

Also, part of the beauty of owning a PC is that you never have to run on "console settings". Though in this case that would mean lowering settings or omitting them entirely, so I don't see how that even helps your argument. It would only improve performance for the 970.
 

dr_rus

Member
If you go look at performance summaries you can clearly see this isn't the case: a 470 is 30% faster than a 280X. Polaris compared to Tahiti also showed a 20% average perf improvement from the architecture alone. The closed-box nature of a console will further improve on this, and the PS4 Pro has some Polaris+ tweaks.

What does the 470 have to do with the 280X or the PS4 Pro? The 470 is almost a 5 TFLOPS card running a different shader core config to the PS4 Pro.

Polaris compared to Tahiti did introduce some architectural changes, but in the performance area they mostly come down to the introduction of DCC, which in turn allows the PS4 Pro's 217GB/s of bandwidth to directly compete with the 280X's 288GB/s. The rest of the architectural changes between GCN1 and GCN4 have a rather minor effect on performance.

The closed-box nature can't help in optimizing GPU shader throughput, since 99% of such optimizations will result in performance gains on PC GPUs of the same architecture too.

So nothing "absolutely" here; the PS4 Pro's GPU is definitely comparable in performance to a 280X.
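
The bandwidth argument above can be put in rough numbers. A minimal sketch, assuming DCC saves some fraction of memory traffic (the savings figures are illustrative assumptions, and treating all traffic as compressible overstates the gain):

```python
# Effective bandwidth if compression reduces the bytes actually moved:
# effective = raw / (1 - savings). Simplification: DCC mainly compresses
# render-target traffic, so real savings apply to only part of the total.
def effective_bw(raw_gb_s, savings):
    return raw_gb_s / (1.0 - savings)

ps4_pro_raw = 217.6  # GB/s, commonly cited
r280x_raw = 288.0    # GB/s, GCN1, no DCC

for savings in (0.15, 0.20, 0.25):
    print(f"{savings:.0%} savings -> {effective_bw(ps4_pro_raw, savings):.0f} GB/s effective")
# 15% -> 256, 20% -> 272, 25% -> 290: at ~25% the Pro's effective bandwidth
# lands right around the 280X's raw 288 GB/s, the "directly compete" claim.
```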
 
What does the 470 have to do with the 280X or the PS4 Pro? The 470 is almost a 5 TFLOPS card running a different shader core config to the PS4 Pro.

Polaris compared to Tahiti did introduce some architectural changes, but in the performance area they mostly come down to the introduction of DCC, which in turn allows the PS4 Pro's 217GB/s of bandwidth to directly compete with the 280X's 288GB/s. The rest of the architectural changes between GCN1 and GCN4 have a rather minor effect on performance.

The closed-box nature can't help in optimizing GPU shader throughput, since 99% of such optimizations will result in performance gains on PC GPUs of the same architecture too.

So nothing "absolutely" here; the PS4 Pro's GPU is definitely comparable in performance to a 280X.

no

https://www.computerbase.de/2016-08...erformance/2/#diagramm-the-division-1920-1080
 

thelastword

Banned
Gets better performance in non-VRAM-limited scenarios.
How are we getting to this....

The Pro is capped at a locked 30fps, but we are getting better performance on a 970? How can we prove this....

All I'm seeing here is lots of guesswork, no proper testing methodology, and what is quite an illogical and unfair conclusion to boot... When the PS4 Pro version is unlocked, then we can talk.


Even then, with a better CPU paired with the 970 than the Pro has, it still seems to struggle to hit 60fps at 1800p with medium settings... It must also be noted that this game performs best on Nvidia GPUs atm, so that's a plus for it... There also seems to be a huge issue with some settings at the highest graphical level for AMD GPUs. The fact that AMD GPUs get an uplift of 60+% by changing from highest settings to high speaks volumes.

I'm sure there will be an AMD driver to fix some of these issues on the PC side, but Bungie should also do some further optimization for these AMD-based consoles... Looking at some of the CPU+GPU combos that are doing 1080p 60fps in this game, I think the mid-gen consoles could have done 1080p 60fps at high settings, and the XB1S and PS4 Slim 720p/60 and 900p/60 respectively...
 
As someone who paid £850 for an i5 4690k @ 4.5GHz / 970 equipped PC, looking to move over to the platform as my main place to play the latest AAA games at 60fps instead of 30fps on console, I absolutely agree.

Destiny 2 being a fantastic port doesn't make up for the shit show that was Assassin's Creed Unity, Ryse, Arkham Knight, Mortal Kombat X, Pro Evolution Soccer, No Man's Sky, Forza Horizon 3 and Dishonored 2, all of which fell short of 60fps on my PC even when dramatically lowering settings like shadow and AA quality. Sure, those games may work fine now, but at launch and for months after they were awful unless you had crazy hardware to just brute-force 60fps performance. The truth of the matter is that most AAA console ports to PC are cheap and extremely low effort, in the same ballpark as the effort put into Wii U ports.

If you complain online about not getting 1080/60 performance out of a 970, you always get the same answer from PC players: "I don't play those sorts of games on PC." Which is fine and all, but when most of the people saying that also claim PC gaming is better than consoles in every way, even on extremely low-end hardware, it's not much help.

Sure, a 970 can easily match the standard PS4's 30fps performance, but in most cases it falls way short of the same game with the same visual settings at a locked 60fps (the reason I went with that hardware), and even further short of the PS4 Pro's 1600p/1800p performance.

Goalposts are moved all the time in PC gaming with the advent of new GPUs, so I think it's unfair to discount the PS4 Pro from these discussions, especially because at almost a third of the price I paid for my PC I can play the latest AAA games looking great, very close to the image quality of native 4K.

I'm glad Destiny 2 is one of the few console ports that is actually optimised. Enjoy!

You do know that even with those so-called bad ports (and not all the ones you mentioned were that bad), they still would have looked and performed better on your PC than they did on consoles, right?
 
How are we getting to this....

The Pro is capped at a locked 30fps, but we are getting better performance on a 970? How can we prove this....

Pretty easily. The 970 runs the game at higher resolution, framerate and visual settings than the PS4 Pro; therefore it performs better.
 

thelastword

Banned
Capped at 30, but checkerboarded to reach 4K...
Yes, a better CPU plus the fact that this is an Nvidia-sponsored title means that, at this point, it performs better than a similar class of AMD GPUs. I don't think we should use that as an indicator of the strength of the Pro's GPU.....

Perhaps DF should have used a GTX 1080 with the Pentium CPU in their other Destiny video, but a 1080 Ti holds a solid 60fps at 1080p with that CPU, whilst the RX Vega 64 can't hold 60fps at the highest settings at 1080p... Something is murdering AMD GPUs at the highest settings in this NV-sponsored title, and it has nothing to do with the CPU or DX11... So there's no doubt NV GPUs can hold better resolution and better settings than their AMD counterparts, at least for now... Heh, even now this game is not properly optimized for Ryzen CPUs either...
 
I wouldn't necessarily assume poor AMD performance on the PC side translates over to consoles. That being said, this obviously isn't a top-tier console game technically. Basically, there's no way for us to have any idea.
 
"No" what? Nothing in this link is at odds with what I've said. GCN4's main performance gain is from a lot better memory bandwidth utilization thanks to DCC but 280X has sufficiently higher bandwidth than PS4Pro which completely negates this gain.

So you think DCC accounts for up to a 30% total perf improvement?
 

feel

Member
Damn, I miss having the beta online on PC. Playing at high settings with all those effects popping fluidly at 120fps was just stunning.
 

dr_rus

Member
So you think DCC accounts for up to a 30% total perf improvement?

What's "up to"? Look at your own link for a change, it's averages at +15%. This is easily what DCC can account for and DCC is the most likely contributor. I'd even say that it's on a lower side of what I'd expect from couple of generations of DCC advancement.

This "up to" 30% is most likely related to a tessellation improvement between GCN1 and GCN4 but tessellation is something which can easily be tuned on a fixed h/w platform to a point where it's not the bottleneck. Remember that OG PS4 has to run the same games as Pro and I don't think that we had any examples of Pro having better tessellation so far.
 
It was also cool of DF to show that the aging 750 Ti was still kinda holding up at console settings as well.

Of course it does.

You basically have a PS4/XBONE when you buy an Alienware Alpha.

There are a few rare cases where the performance is worse, but you have a ton of optimization capability on the Alpha that can make every game look better and run faster.
 
I bought a GTX 970 2.5 years ago with Witcher 3 and Batman AK codes. (Glad Batman AK was broken on PC, sorry to everyone who wanted to play it day one, because they gave away all the Batman games as an apology. :p They also did an awesome optimization job with the later updates.)

Still rocking it, and I'll probably be OK for 2-3 more years. Awesome card indeed.
 