
Digital Foundry - PlayStation 5 Pro specs analysis, also new information

Not with the Zen 2 CPU



(video timestamp 22:48)

A 3060 Ti + Ryzen 3600 is an extreme bottleneck: 30-36 fps with ray tracing. The GPU is just sleeping, waiting for the CPU to do its job.

SI4eM3l.png


half the gpu is sitting idle, which means that with a 2x faster CPU the 3060 Ti could have pushed 60 fps in this scene. but it can't, because the CPU is an extreme bottleneck
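Napkin math on why that follows (a simplified model where CPU and GPU work don't overlap and the slower one sets the frame rate - real engines pipeline the two, so treat the numbers as rough estimates):

```python
# Simplified bottleneck model: the slower of CPU and GPU work per frame
# sets the frame rate. Real engines overlap the two, so this is a rough
# estimate, not a profiler.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 33.3  # assumed: ~30 fps observed, CPU-bound with RT on
gpu_ms = 16.6  # assumed: ~50% GPU utilization -> GPU needs about half the frame

print(round(fps(cpu_ms, gpu_ms)))      # ~30 fps, CPU limited
print(round(fps(cpu_ms / 2, gpu_ms)))  # ~60 fps once the CPU is 2x faster
```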

I wonder how they tested this. It's common knowledge on PC that you set your anisotropic filtering to ON in the GPU's control panel within Windows and turn it off in the game itself.

50% of games with this option enabled use the CPU for this for some strange reason if the GPU is not manually set to handle it. That could be the case here.
 

Bojji

Member
this part is funny when you consider that in the other thread, people would have you believe the rx 6700 outperformed the ps5 at low resolutions because the console was CPU limited

now we have that in reverse here, lol

in the end though, it is and was a complicated topic. there will be situations where the ps5 gpu will indeed be gpu limited at 720p, and also situations where it will be CPU limited at higher resolutions. both can be true at the same time in different contexts

That's what happens when you don't like results, hahaha. It was CPU limited in most games a few weeks ago and now it isn't...

Let's see how the past year went: Alan Wake, Pandora, Rebirth, FF XVI, Spider-Man 2, Lords of the Fallen... Daaaamn... that PS4 sure punches above its weight...

Out of those games, Alan Wake runs like crap on PS5, same with LotF, and FF XVI is really, really bad in performance mode.
 
We don't know how efficient it is, we don't have any solid info about that. GPU tests done by DF show that the PS5 trades blows with the 6700 - an equivalent PC GPU - which shows that there aren't any magic optimizations done on consoles.



Those tests were GPU limited, so the CPU wasn't important as long as it wasn't bottlenecking the GPU.



You can see where the 3600 is better than the PS5/4700S:

PS5:

u0XnUeK.jpg


3600:

AikLDBv.jpg
Huh, only 8MB of CACHE!? I didn't know this part.
Still, I think the current CPU will handle the games just fine.

This limitation was always the case with consoles further into their respective gens. It's not new, but that 8MB is barebones, to say the least.
Guess it also comes down to how it's addressed, how the RAM handles instructions, etc....
 

Mr.Phoenix

Member
and you're doing reverse cherry picking. here's how
I am sorry, but I disagree.
- due to the console jump being smaller, plus the pandemic, cross-gen lasted longer than it was supposed to. even games like hogwarts legacy eventually landed on PS4. returnal could be ported to PS4 and the only limitation would be the HDD (it runs very well on stuff like a gtx 1050 ti). this is one of the biggest reasons most games still kept having 60 FPS modes
- the majority of 60 fps games are cross-gen titles
- the actual "nextgen" games have just begun, and most of them exhibit CPU-boundness problems. trying to paint a good picture by showcasing games like god of war ragnarok, spiderman, etc. is reverse cherry picking because they're not representative
At no point when I listed out games did I mention any cross-gen game. I always used examples like Alan Wake, Pandora and Baldur's Gate 3. Doesn't get any more next-gen than those, does it? But even if you want us to dig deeper, we can talk about SM2, which supports RT in both fidelity and performance modes and is not on the PS4. Or Ratchet.

And you are massaging the facts here. I do not and should not care if the majority are cross-gen or not. That's not my problem. The fact of the matter, and this is fact, is that the majority of the games released on the platform thus far... as recently as this year... have a 60fps mode. We cannot then use the excuse of cross-gen for those too. How is me stating the facts cherry-picking?
technically the ps4 can run the entire ps2 and ps3 library at 60 fps, heck maybe 120 FPS. does that make the ps4 a 60/120 fps console? the ps5 being able to brute force ps4 or ps4-gen-worthy games at 60 fps is not that impressive to me, and I cannot take it as proof that "this console's cpu is capable of 60 fps".

reality is, jedi survivor (ray tracing) and dragon's dogma 2 (probably ray tracing) are the real representatives of how third party games going forward will run on console CPUs with ray tracing involved. at least for me. they may not be for you. in that case, we would have to agree to disagree
No. No and no. That is literal cherry-picking. You cannot just conveniently pick the games that seem to have issues and ignore the ones that didn't. You say Jedi Survivor? This is my issue with these arguments.

How about this: take Guardians of the Galaxy instead. It has a native 2160p fidelity mode at 30fps, a 60fps 1080p performance mode, and a DRS 1620p-2160p RT mode at 30fps. That game covers the entire spectrum of every possible PS5 graphics mode. It looks and performs significantly better than Jedi Survivor, and oh... it's not cross-gen. As far as I am concerned, Jedi Survivor is an unoptimized mess of a game that even had issues on PC; this is like trying to use Elden Ring to measure performance when we all know how FS are...

You couldn't possibly be more disingenuous if you tried, suggesting that "hey, the reason why this CPU can handle 60fps is that so far it has only been running PS4 games". Come on, if you have to sink that low to make a point, then you know you are doing something wrong.

Anyways, yeah... let's agree to disagree.
 

DragonNCM

Member
Yeah... I was right... same shit as PS4 Pro vs PS4, CPU bottleneck again.
Not paying this time for the same shit. Fool me once, shame on you; fool me twice, shame on me.
 

Mr Moose

Member
I am sorry, but I disagree.

At no point when I listed out games did I mention any cross-gen game. I always used examples like Alan Wake, Pandora and Baldur's Gate 3. Doesn't get any more next-gen than those, does it? But even if you want us to dig deeper, we can talk about SM2, which supports RT in both fidelity and performance modes and is not on the PS4. Or Ratchet.

And you are massaging the facts here. I do not and should not care if the majority are cross-gen or not. That's not my problem. The fact of the matter, and this is fact, is that the majority of the games released on the platform thus far... as recently as this year... have a 60fps mode. We cannot then use the excuse of cross-gen for those too. How is me stating the facts cherry-picking?

No. No and no. That is literal cherry-picking. You cannot just conveniently pick the games that seem to have issues and ignore the ones that didn't. You say Jedi Survivor? This is my issue with these arguments.

How about this: take Guardians of the Galaxy instead. It has a native 2160p fidelity mode at 30fps, a 60fps 1080p performance mode, and a DRS 1620p-2160p RT mode at 30fps. That game covers the entire spectrum of every possible PS5 graphics mode. It looks and performs significantly better than Jedi Survivor, and oh... it's not cross-gen. As far as I am concerned, Jedi Survivor is an unoptimized mess of a game that even had issues on PC; this is like trying to use Elden Ring to measure performance when we all know how FS are...

You couldn't possibly be more disingenuous if you tried, suggesting that "hey, the reason why this CPU can handle 60fps is that so far it has only been running PS4 games". Come on, if you have to sink that low to make a point, then you know you are doing something wrong.

Anyways, yeah... let's agree to disagree.
GotG is cross-gen.
 

yamaci17

Member
I am sorry, but I disagree.

At no point when I listed out games did I mention any cross-gen game. I always used examples like Alan Wake, Pandora and Baldur's Gate 3. Doesn't get any more next-gen than those, does it? But even if you want us to dig deeper, we can talk about SM2, which supports RT in both fidelity and performance modes and is not on the PS4. Or Ratchet.

And you are massaging the facts here. I do not and should not care if the majority are cross-gen or not. That's not my problem. The fact of the matter, and this is fact, is that the majority of the games released on the platform thus far... as recently as this year... have a 60fps mode. We cannot then use the excuse of cross-gen for those too. How is me stating the facts cherry-picking?

No. No and no. That is literal cherry-picking. You cannot just conveniently pick the games that seem to have issues and ignore the ones that didn't. You say Jedi Survivor? This is my issue with these arguments.

How about this: take Guardians of the Galaxy instead. It has a native 2160p fidelity mode at 30fps, a 60fps 1080p performance mode, and a DRS 1620p-2160p RT mode at 30fps. That game covers the entire spectrum of every possible PS5 graphics mode. It looks and performs significantly better than Jedi Survivor, and oh... it's not cross-gen. As far as I am concerned, Jedi Survivor is an unoptimized mess of a game that even had issues on PC; this is like trying to use Elden Ring to measure performance when we all know how FS are...

You couldn't possibly be more disingenuous if you tried, suggesting that "hey, the reason why this CPU can handle 60fps is that so far it has only been running PS4 games". Come on, if you have to sink that low to make a point, then you know you are doing something wrong.

Anyways, yeah... let's agree to disagree.

1. guardians of the galaxy is cross-gen (it is on ps4)
2. spiderman 2 is a glorified cross-gen game and no one can convince me otherwise. it shares the same core functions and similar drawcalls as the 1st spiderman on PS4. and that was already optimized to hit 30 FPS on jaguar cores; in return, it was able to hit 120 FPS on zen 2, and with ray tracing they still had "headroom" to hit a similar 60 FPS. and I also specifically said that 1st party games are much less CPU bound with ray tracing. but no one buys consoles for first party games alone, so spiderman 2 being able to hit 60 fps with ray tracing is not really a strong case for the console. ratchet and clank has similar drawcalls to spiderman, and the only reason it is not on PS4 is the SSD-based tech and all that jump stuff. they could've ported it to PS4, making you wait in front of portals like hogwarts legacy makes people wait at castle doors. it is just my honest opinion. you may not agree.
3. baldur's gate 3 consistently hits 30 fps in act 3, which is problematic, and it is not like act 3 is a small portion of the game
4. pandora and alan wake 2 are legit examples, and I appreciate their developers for making the effort. but they're still exceptions to the norm, for me. I would like to see how their future games will run on PS5 in terms of CPU boundness.

what you don't understand is that I have my own perspective on the problem, from a 2700x owner's perspective. I can get 60 fps locked in spiderman with ray tracing, but I cannot in jedi survivor, starfield, baldur's gate 3 or hogwarts legacy. what should I think now? that my cpu is not the problem? for me, games like spiderman with ray tracing and alan wake 2 are exceptions where the 2700x/ps5 performs well. other games stick out more for me because I have a feeling they will be more commonplace as we move into the gen, which is why I don't like the ps5 pro keeping the same CPU.

and most of the time, the games where I can comfortably get 60 fps with my 2700x are the ones that happened to run on PS4 too: atomic heart, resident evil 4, diablo 4, ac mirage, ac valhalla, elden ring etc. when you look at the games that run at 60 FPS on PS5, these games will come out on top, and people will think the ps5 is good to go for 60 fps because it can hit 60 fps in such a wide range of titles. yet I do not get such comfort about being able to hit 60 fps with my cpu based on these titles. so props to you people having this much belief in the zen 2 cpu on consoles
 

Bojji

Member
Huh, only 8MB of CACHE!? I didn't know this part.
Still, I think the current CPU will handle the games just fine.

This limitation was always the case with consoles further into their respective gens. It's not new, but that 8MB is barebones, to say the least.
Guess it also comes down to how it's addressed, how the RAM handles instructions, etc....

There are also some other things cut down vs the desktop version. This CPU is obviously SO MUCH better than Jaguar, but it's not as powerful as some people might have thought.

Developers need to have realistic targets. If everyone was targeting 60FPS for everything CPU-related on the Zen 2 found in consoles, then we would have these variations:

- low-res SS version with low-res textures - 60FPS, or average-res version - 30FPS
- average-res PS5/SX version with high-res textures - 60FPS, or high-res version - 30FPS
- high-res PS5 Pro version with high-res textures - no need for a high-res version?

But when games are CPU limited to 30-something FPS, then every version will be locked to 30FPS (or variable, like DD2); there's no way to avoid this.
 

mansoor1980

Gold Member
I wonder how they tested this. It's common knowledge on PC that you set your anisotropic filtering to ON in the GPU's control panel within Windows and turn it off in the game itself.

50% of games with this option enabled use the CPU for this for some strange reason if the GPU is not manually set to handle it. That could be the case here.
what? explain plz
 
We don't know how efficient it is, we don't have any solid info about that. GPU tests done by DF show that the PS5 trades blows with the 6700 - an equivalent PC GPU - which shows that there aren't any magic optimizations done on consoles.



Those tests were GPU limited, so the CPU wasn't important as long as it wasn't bottlenecking the GPU.



You can see where the 3600 is better than the PS5/4700S:

PS5:

u0XnUeK.jpg


3600:

AikLDBv.jpg


Ryzen 3600 is also 4.2GHz.

Cache alone can make a massive difference on Zen processors:



Callisto is running without any RT in performance mode on consoles, and RT makes it this CPU limited on PC.



I don't like bullshit, that's the thing. There are many people here that don't have any idea how games utilize hardware, yet act like experts and spread bullshit.

You can see my post history; I'm mostly on the PS side and I was a console player most of my life - even a PS fanboy back in the PS3 days (dark times...) - but I won't stay silent when Sony does some stupid things (I'm not talking about the Pro in this example). Some people here are aggressive towards anyone that doesn't praise Sony and god Cerny to the heavens; anyone that points out any shortcomings of the PS5 and Pro is treated like an enemy.

I learned on Gaf that PS5 Pro will:

- render native games at 8k
- will be 3x more powerful than PS5
- will scale down 8k games to 4k because why not
- will blow Nvidia DLSS out of the water (their first try at AI upscaling vs Nvidia with 6 years of experience)
- No CPU upgrade doesn't matter because modern games don't need fast CPUs at all


Agreed, cache can make a difference.

Callisto is a single game; COD Black Ops Cold War runs at 60FPS with RT shadows.


I learned on Gaf that PS5 Pro will:

- render native games at 8k < Not sure who's expecting this.
- will be 3x more powerful than PS5 < dual-issue tflops? Sure. But not 3 times the graphical power.
- will scale down 8k games to 4k because why not < Not sure about this one. The Touryst did downscale from 8K to 4K; not sure how practical that is in other games though.
- will blow Nvidia DLSS out of the water (their first try at AI upscaling vs Nvidia with 6 years of experience) < Def not, but it will be better than FSR; having dedicated tensor cores for this will undoubtedly yield fantastic results.
- No CPU upgrade doesn't matter because modern games don't need fast CPUs at all < Agreed, but then what's the goal of a CPU upgrade if we're still targeting 30/60/120?
 

yamaci17

Member
The norm is 60fps mode. You are literally doing the Freezer "I'll ignore that" meme.
not necessarily. I literally have a 2700x (overclocked and paired with 3466 MHz CL14 RAM) and my opinion is formed around my experiences with it. that is why I said "for me" - I'm in a similar situation as the PS5 / Pro (a stronk gpu paired with a whack CPU)

here, me being able to get 64 fps locked in spiderman with ray tracing


same cpu dropping below 40 fps in starfield


look at how performant gotg is on the 2700x!


whoops


but hey I can get 90+ fps cpu bound in fh5, cpu must be great


oh no, rats drop frames below 32




Understand that for every game that runs great on my CPU (after 2020), there's another title that runs horribly. The same applies to PS5, but some people here try to paint a good picture instead.

I can find maybe hundreds of games where I can hit 60-100 FPS on my CPU. No problem. But as you can see, I've easily listed 3 next-gen-only games that destroy CPU-bound performance (a plague tale, starfield and jedi survivor). The number and frequency of such titles is increasing every year, and there's a clear pattern. Their numbers will increase. If they don't, all good for me. I will be as happy as you PS5 folks are. It would mean I can keep my CPU forever. But something tells me that won't be the case. If you don't see the pattern and think the bad performers are the outliers, I can just say: I hope you're right.

forza horizon, spiderman and guardians of the galaxy being able to run at 60 fps is cool, but does that help with starfield, jedi and a plague tale? no? I need an upgrade if I want to enjoy 60 FPS in starfield, a plague tale requiem and jedi survivor. it is that simple.

and FOR me, these 3 games pushing my CPU to its limits irks me, and it will drive me to upgrade if they become more frequent. it just seems inevitable. a 150-buck ryzen 5600 upgrade will easily get me a 1.6x improvement, for reference.

cerny should at least make an effort to jump to Zen 3. or else it is just a waste of good GPU power.

it also irks me that i can get 60 fps in most places in a plague tale but 30 fps with rats on the screen. same for starfield: it jumps between 40 fps and 60 fps depending on where you are. then comes baldur's gate 3, with most of the game running at 60 fps, but act 3 will get you back to the 30 FPS range. it is just not a fun experience. it is a subpar experience that I torment myself with. I don't see why "genius" Cerny is okay with such a subpar experience in that case. all the power to him though.
 

SlimySnake

Flashless at the Golden Globes
This should show just how CPU hungry RT can be. You have a 3600, which we know from above is roughly 50-70% more powerful than the PS5 Pro CPU, completely bottlenecking a 3080. Only 60% GPU utilization, so effectively it's working as a base PS5 GPU. And that's the point: if you don't improve the CPU, upgrading the GPU to Nvidia-quality RT is pointless if you want to run RT games at 60 fps on the Pro. The Pro will run games with more RT effects at 30 fps with no problems. But if your goal was to play these games at 60 fps with RT on, you are going to be disappointed. I'm sure there will be exceptions like Avatar, Spider-Man 2 and Metro, but they are already 60 fps on the base PS5.

cgll9Q6.jpg



To be fair, even I am CPU bottlenecked on my 3080 when turning on RT, and I have a way better CPU than the 3600 above. I have had to turn off RT in games like Gotham Knights, Hogwarts, RE4 and Alan Wake 2 because of either CPU- or VRAM-related bottlenecks. Even Avatar, which is very well optimized on both the CPU and GPU, causes these massive frametime spikes, and their CPU benchmark is a fucking mess frametime-wise on my CPU. But if I were upgrading and spending $600 on new hardware, I'd want to ensure I remove those bottlenecks. I'd go and get a 7800X3D.

And that's why this meager CPU upgrade is so disappointing. They had Zen 4 just sitting there and chose not to take it. If it costs an extra $50, do it. Release a $649 console. At least that won't be bottlenecked.
 
what? explain plz


One example: Immortals clearly uses the CPU for this instead of the GPU. Watch the CPU usage plummet as soon as he turns it off. Some cores are even idling after this. He is rocking a 4090, which can run 16x AF without breaking a sweat, yet it's the CPU which is handling it. And the GPU values don't change when he turns it off, which means the game asks the CPU to do it.

This can be a massive hit on systems with an older CPU.

If you're rocking a CPU from the last 3 years or so, the impact will be smaller, but it's still there. A waste of resources.

This wouldn't be the case if the GPU were handling it. ;)

Best is you try it yourself. I had it with Assetto Corsa and BF2042. Had AF on in-game until a friend of mine who develops games mentioned this.
It gave me a 10 fps boost in those games and my CPU ran cooler.

Not all games do this. 50% was a bit too high, I think, but since then I set it to 16x in the Nvidia control panel and always turn it off in games, just to be sure.
 

SoloCamo

Member
Now we don't have a Jaguar CPU, but they complain anyway.... :D

The difference is the Jaguar CPUs were complete trash when they were new. The current PS5 CPU was "ok" when it came out, but a lot of time has passed since. They now have better (and cheaper) options in the same power envelope than they did, so it's pretty sad to see they aren't improving much at all when there are already plenty of CPU-bound moments with the current setup.
 

Mr.Phoenix

Member
not necessarily. I literally have a 2700x (overclocked and paired with 3466 MHz CL14 RAM) and my opinion is formed around my experiences with it. that is why I said "for me" - I'm in a similar situation as the PS5 / Pro (a stronk gpu paired with a whack CPU)

here, me being able to get 64 fps locked in spiderman with ray tracing

This is crazy to me. Do you agree that fast loading/streaming and RT are next-gen features? SM2 does both, and does both very well. But you dismiss it because it doesn't cripple your PC? I would think what we should be doing instead is applauding them for making a well-optimized game.
same cpu dropping below 40 fps in starfield

Don't get me started here... that was an "artistic choice"

The argument I will always make with this piss-poor game is: just use the eye test. SM2 is a bigger world, has more going on, and looks better than this game in every single way. The fact that it performs far better should tell you that using this game as some sort of benchmark is ill-advised.
Understand that for every game that runs great on my CPU, there's another title that runs horribly. The same applies to PS5, but some people here try to paint a good picture instead.
I don't get what you are saying; what good picture are people trying to paint? We are all saying the same thing: some games run better than others. The issue is this narrative that some are pushing, as if ALL games run badly because of the CPU. And some people go as far as singling out the worst-performing games and using them to define the entire platform.
forza horizon, spiderman and guardians of the galaxy being able to run at 60 fps is cool, but does that help with starfield, jedi and a plague tale? no? I need an upgrade if I want to enjoy 60 FPS in starfield, a plague tale requiem and jedi survivor. it is that simple.
I am truly sorry that this is your chosen mindset. I have a PS5 and a PC with a 3080. And I know a poorly optimized game when I see one. I hope one day you can also learn the difference, and not be there expecting a console to use the brute-force approach to solve problems for devs that can't take the time to optimize their games properly. Lol, Starfield...
 

Radical_3d

Member
not necessarily. I literally have a 2700x (overclocked and paired with 3466 MHz CL14 RAM) and my opinion is formed around my experiences with it. that is why I said "for me" - I'm in a similar situation as the PS5 / Pro (a stronk gpu paired with a whack CPU)

here, me being able to get 64 fps locked in spiderman with ray tracing


same cpu dropping below 40 fps in starfield


look at how performant gotg is on the 2700x!


whoops


but hey I can get 90+ fps cpu bound in fh5, cpu must be great


oh no, rats drop frames below 32




Understand that for every game that runs great on my CPU (after 2020), there's another title that runs horribly. The same applies to PS5, but some people here try to paint a good picture instead.

I can find maybe hundreds of games where I can hit 60-100 FPS on my CPU. No problem. But as you can see, I've easily listed 3 next-gen-only games that destroy CPU-bound performance (a plague tale, starfield and jedi survivor). The number and frequency of such titles is increasing every year, and there's a clear pattern. Their numbers will increase. If they don't, all good for me. I will be as happy as you PS5 folks are. It would mean I can keep my CPU forever. But something tells me that won't be the case. If you don't see the pattern and think the bad performers are the outliers, I can just say: I hope you're right.

forza horizon, spiderman and guardians of the galaxy being able to run at 60 fps is cool, but does that help with starfield, jedi and a plague tale? no? I need an upgrade if I want to enjoy 60 FPS in starfield, a plague tale requiem and jedi survivor. it is that simple.

and FOR me, these 3 games pushing my CPU to its limits irks me, and it will drive me to upgrade if they become more frequent. it just seems inevitable. a 150-buck ryzen 5600 upgrade will easily get me a 1.6x improvement, for reference.

cerny should at least make an effort to jump to Zen 3. or else it is just a waste of good GPU power.

it also irks me that i can get 60 fps in most places in a plague tale but 30 fps with rats on the screen. same for starfield: it jumps between 40 fps and 60 fps depending on where you are. then comes baldur's gate 3, with most of the game running at 60 fps, but act 3 will get you back to the 30 FPS range. it is just not a fun experience. it is a subpar experience that I torment myself with. I don't see why "genius" Cerny is okay with such a subpar experience in that case. all the power to him though.

This is a very good point. And although I like to exaggerate for the laughs, I'm not denying that there are scenarios that are CPU bound by design. But of the 3 games you have problems with, 2 have 60fps modes on PS5, and Jedi's comes with especially horrible image quality. So while on PC it makes sense to have a more balanced machine (couldn't be me: the day I update my PC I'll just need a GPU, due to the type of games I play), for a mid-gen update that has little margin in cost and price over an already expensive machine, it makes no sense to expect anything other than "nicer 60fps modes with AI". It's the same as with the PS4 again. We have our nice 4K displays and our current console isn't getting there, so we need slightly less crappy image quality (Alan Wake 2 at 1200p, here I go!).
 

SlimySnake

Flashless at the Golden Globes
Call of Duty Black Ops had 60FPS and RT shadows. Just to be clear, what's the benchmark we're aiming for? 60FPS and RTGI? 60FPS and RT Reflections at full res? Half res?
Exceptions don't make the rule. CoD was 60 fps even on the Cell and the Jaguar CPUs. They are really basic 360-era games that don't do much with the CPU, since they just need to run at 60 fps.

Modern games are hitting CPUs much harder. RT shadows btw have a very low performance cost - something like 3-5% on my 3080 when enabling them on top of RT reflections.

The benchmarks are simple. You have all these PS5 games I mentioned above running at 30 fps in their RT modes. Sony said RT speeds are 2-4x. Technically, they should be able to run all these games at 60 fps without removing RT like they do today. That's it, it's not rocket science. If we get that, then great; I will bump this thread and say I was wrong.
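Napkin math on that, though: a 2-4x RT speedup only turns 30 fps into 60 if RT is the bulk of the GPU frame (and the CPU keeps up). The frame-time splits below are made up just to show the shape of it:

```python
# Amdahl-style estimate: only the RT slice of the frame gets faster.
# A 33.3 ms (30 fps) GPU frame with an assumed RT portion:

def new_fps(frame_ms: float, rt_ms: float, rt_speedup: float) -> float:
    return 1000.0 / ((frame_ms - rt_ms) + rt_ms / rt_speedup)

print(round(new_fps(33.3, rt_ms=20.0, rt_speedup=3.0)))  # ~50 fps, RT-heavy frame
print(round(new_fps(33.3, rt_ms=8.0, rt_speedup=3.0)))   # ~36 fps, RT-light frame
```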

With more and more games using RTGI, like Star Wars Jedi Survivor, Star Wars Outlaws, GTA6 and Death Stranding 2, you would need a more powerful CPU to run them at 60 fps. I don't think it's enough, based on what I've seen in the PC space with vastly better CPUs and GPUs. You clearly do. We shall see.
 

Bojji

Member
Agreed, cache can make a difference.

Callisto is a single game; COD Black Ops Cold War runs at 60FPS with RT shadows.


I learned on Gaf that PS5 Pro will:

- render native games at 8k < Not sure who's expecting this.
- will be 3x more powerful than PS5 < dual-issue tflops? Sure. But not 3 times the graphical power.
- will scale down 8k games to 4k because why not < Not sure about this one. The Touryst did downscale from 8K to 4K; not sure how practical that is in other games though.
- will blow Nvidia DLSS out of the water (their first try at AI upscaling vs Nvidia with 6 years of experience) < Def not, but it will be better than FSR; having dedicated tensor cores for this will undoubtedly yield fantastic results.
- No CPU upgrade doesn't matter because modern games don't need fast CPUs at all < Agreed, but then what's the goal of a CPU upgrade if we're still targeting 30/60/120?

Most of this gold is from this thread:


I love it!

PSSR only needs to be close to DLSS to make a massive difference; this is the best thing about this console, alongside the RT upgrade.

Changing the subject: I think the Pro will be able to deliver path-traced games at 30FPS IF its RT performance is close to what Nvidia is doing.

I have a 3080 Ti, and this GPU is ~20% better than a 4070, which is the GPU rumored to be around the power of the PS5 Pro.

I can do vanilla path tracing at 3200x1800 with DLSS Performance (~900p internal) and it runs surprisingly well:

ylFclBC.jpeg
2djluN1.jpeg


I think the Pro could easily run this at 30FPS if it has 4070-level performance.
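For anyone wondering where the ~900p comes from: DLSS Performance renders at half the output resolution per axis. Quick sketch (scale factors are the commonly cited ones; Balanced is approximate):

```python
# Commonly cited DLSS per-axis render scales (Balanced is approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    scale = DLSS_SCALE[preset]
    return round(width * scale), round(height * scale)

print(internal_res(3200, 1800, "Performance"))  # (1600, 900) -> the ~900p internal
```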

But there is also this little mod that changes the parameters of PT: instead of 2 rays and 2 bounces we get 2 rays and 1 bounce. The game looks very close but runs much better:

B43loZ5.jpeg


fN2mW0H.jpeg


The Pro could run this at 40FPS no problem ^
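Back-of-the-envelope on why the mod helps so much - to a first order, tracing cost scales with ray segments per pixel (rays x bounces), while shading and denoising don't shrink, so real gains are smaller than this suggests:

```python
# First-order path tracing cost: ray segments traced per pixel.
# Ignores shading, denoising and upscaling overhead.

def pt_segments(rays_per_pixel: int, bounces: int) -> int:
    return rays_per_pixel * bounces

vanilla = pt_segments(2, 2)  # 4 segments per pixel (2 rays, 2 bounces)
modded = pt_segments(2, 1)   # 2 segments per pixel (2 rays, 1 bounce)
print(modded / vanilla)      # 0.5 -> roughly half the tracing work
```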
 

yamaci17

Member
This is crazy to me. Do you agree that fast loading/streaming and RT are next-gen features? SM2 does both, and does both very well. But you dismiss it because it doesn't cripple your PC? I would think what we should be doing instead is applauding them for making a well-optimized game.

Don't get me started here... that was an "artistic choice"

The argument I will always make with this piss-poor game is: just use the eye test. SM2 is a bigger world, has more going on, and looks better than this game in every single way. The fact that it performs far better should tell you that using this game as some sort of benchmark is ill-advised.

I don't get what you are saying; what good picture are people trying to paint? We are all saying the same thing: some games run better than others. The issue is this narrative that some are pushing, as if ALL games run badly because of the CPU. And some people go as far as singling out the worst-performing games and using them to define the entire platform.

I am truly sorry that this is your chosen mindset. I have a PS5 and a PC with a 3080. And I know a poorly optimized game when I see one. I hope one day you can also learn the difference, and not be there expecting a console to use the brute-force approach to solve problems for devs that can't take the time to optimize their games properly. Lol, Starfield...
I'm not dismissing what spiderman 2 is doing. I'm just saying that it is like the last of us part 2 on PS4: an anomaly. can you tell me that the last of us part 2 is the norm for PS4 games? more than 99% of games won't come close to looking like the last of us part 2, and if other devs tried it with their abilities, they would probably hit 10 FPS at 720p on that hardware. it is why we have unreal engine 5 indie developers who waste 10 tflops while barely making their games look anywhere near the last of us part 2. but at least they can get there now while using more resources. in the end, that is what the hardware is for.

I appreciate what insomniac pulled off there. but the streaming / fast travel part is mostly down to the SSD / i/o processors etc., and it is clear that 3rd party developers have no interest in that. I'm not saying the console is fully to blame here anyway; I openly say it is a problem of 3rd party developers. but I also acknowledge that they have the responsibility of optimizing game code that has to run on countless pc specs, xbox and ps5 all at the same time. this is why they cannot use specialized ps5-based hardware. if they could, we wouldn't even need a PS5 pro.

do you think, the way spiderman 2 runs, there is a need for a ps5 pro? it looks good even in its performance mode. but ffxvi instead falls down to comical resolutions. if we're going to take spiderman 2 as a good example of how optimization should be, then why should there be a ps5 pro to begin with? it is clear that insomniac and 1st party devs can make magic with the already existing PS5. sure, they will create even further magic with the ps5 pro. but I thought the point of the ps5 pro was to fix the problems of the ps5. the problems that are non-existent for insomniac. which is why bringing spiderman 2 into this specific discussion of CPU bottlenecks is not meaningful. because the ps5 pro is not really that meaningful for insomniac. they can already make magic with the PS5 as is.

this is why examples like spiderman 2 are an oxymoron. they prove that the ps5 pro is not even needed. and if the ps5 pro is not going to be a magical fix for the outlier (per you) jedi survivor, what is the point of it?
 

shamoomoo

Member
DF already got the CPU in the PS5 and XSX and ran in-game benchmarks. The CPU runs at 4.0 GHz, so slightly higher than the PS5 Pro's, and still gets bitch slapped by the 3600 by 50-70%+. The Zen 4 CPU is over 2.5x better with 2 fewer cores and 4 fewer threads.

92pX8CS.jpg

Dd2iR16.jpg
Are those GPU bound games?
 

SlimySnake

Flashless at the Golden Globes
lol Dragon's Dogma is severely CPU bound on consoles and on PC, and they decided to add RTGI on consoles as well. Geniuses at work, everyone.

This is why I laugh whenever people here try to dismiss the opinions of PC gamers who have been there and done that, just because devs and engineers SHOULD know better. They should, but they typically don't.

It took 6 months for Respawn engineers to figure out that RT was the bottleneck for 60 fps performance on consoles.
 

yamaci17

Member
This is a very good point. And although I like to exaggerate for the laughs, I'm not denying that there are scenarios that are CPU bound by design. But of the 3 games you have problems with, 2 have 60fps modes on PS5, and Jedi's comes with especially horrible image quality. So while on PC it makes sense to have a more balanced machine (couldn't be me: the day I update my PC I'll just need a GPU, due to the type of games I play), for a mid-gen update that has little margin in cost and price over an already expensive machine, it makes no sense to expect anything other than "nicer 60fps modes with AI". It's the same as with the PS4 again. We have our nice 4K displays and our current console isn't getting there, so we need slightly less crappy image quality (Alan Wake 2 at 1200p, here I go!).
for jedi survivor, they had to disable ray tracing. I can also get 60 fps without ray tracing

for a plague tale requiem, they literally reduced everything; even NPCs move like they're at 15 FPS now. and they reduced vegetation density etc. they practically turned the game into a lastgen game. I could also get 60 fps with such settings now (and they ported those settings to PC). I mean, I was already getting 60 fps just before the rats. but at least those rats were refreshing at the framerate of the game. now there is an option to make them low-FPS GIFs, which defeats the purpose of the game being "nextgen" with its rats. I would personally still play the game at those settings and lock it to 30 FPS. the whole game is about rats lmao, nerfing them defeats the whole purpose of targeting 30 fps on zen 2 to begin with.
 

recursive

Member
This totally gives the vibe that it's an upgrade to get more out of already-launched games and eventually GTA 6.

But I don't get the worry about GTA 6 to be honest, that game alone will still be around when the PS7 launches 🤷‍♂️.
And you'll also likely be able to get its 2nd remaster on the ps8.
 
lol Dragon's Dogma is severely CPU bound on consoles and on PC, and they decided to add RTGI on consoles as well. Geniuses at work, everyone.

This is why I laugh whenever people here try to dismiss the opinions of PC gamers who have been there and done that, just because devs and engineers SHOULD know better. They should, but they typically don't.

It took 6 months for Respawn engineers to figure out that RT was the bottleneck for 60 fps performance on consoles.
Meanwhile, be a pc gamer, see a slider that says ray-traced shadows/reflections, and turn it off.

Devs:
Surprised Meme GIF
 

yamaci17

Member
Meanwhile, be a pc gamer, see a slider that says ray-traced shadows/reflections, and turn it off.

Devs:
Surprised Meme GIF
I told everyone that dragon's dogma 2 was gunning for ray-traced GI on consoles, just like jedi. it is why it is extremely CPU bound. remove ray tracing and you will get a decent 45+ fps experience on a 3600 and the like
 

Mr.Phoenix

Member
I'm not dismissing what spiderman 2 is doing. I'm just saying that it is like the last of us part 2 on PS4: an anomaly. can you tell me that the last of us part 2 is the norm for PS4 games? more than 99% of games won't come close to looking like the last of us part 2, and if other devs tried it with their abilities, they would probably hit 10 FPS at 720p on that hardware. it is why we have unreal engine 5 indie developers who waste 10 tflops while barely making their games look anywhere near the last of us part 2. but at least they can get there now while using more resources. in the end, that is what the hardware is for.

I appreciate what insomniac pulled off there. but the streaming / fast travel part is mostly down to the SSD / i/o processors etc., and it is clear that 3rd party developers have no interest in that. I'm not saying the console is fully to blame here anyway; I openly say it is a problem of 3rd party developers. but I also acknowledge that they have the responsibility of optimizing game code that has to run on countless pc specs, xbox and ps5 all at the same time. this is why they cannot use specialized ps5-based hardware. if they could, we wouldn't even need a PS5 pro. do you think, the way spiderman 2 runs, there is a need for a ps5 pro? it looks good even in its performance mode. but ffxvi instead falls down to comical resolutions. if we're going to take spiderman 2 as a good example of how optimization should be, then why should there be a ps5 pro to begin with? it is clear that insomniac and 1st party devs can make magic with the already existing PS5. sure, they will create even further magic with the ps5 pro. but I thought the point of the ps5 pro was to fix the problems of the ps5. the problems that are non-existent for insomniac. which is why bringing spiderman 2 into this specific discussion of CPU bottlenecks is not meaningful. because the ps5 pro is not really that meaningful for insomniac. they can already make magic with the PS5 as is.

this is why examples like spiderman 2 are an oxymoron. they prove that the ps5 pro is not even needed. and if the ps5 pro is not going to be a magical fix for the outlier (per you) jedi survivor, what is the point of it?
Well in that case there is nothing I can say anymore.

What kind of CPU would have been enough then? To make up for devs' inability to optimize? Because even in this DF video, they talked about the ROG Ally, which uses a Zen 4 8c/16T (double that of the Steam Deck, which is on Zen 2), 24MB of cache and up to a 5GHz clock. And all that amounts to around 20% better performance than the same game on the Steam Deck.

So what should sony have done here? And at what cost? Cause if you are just clocking the CPU higher, to say 4.5GHz, that is useless if you do not also increase the cache. But say you use Zen 3/4: the cluster footprint is bigger than that of Zen 2, and again, you also have to add more cache too.
 
Are those GPU bound games?
These are all set in CPU-bound scenarios, i.e. running on a 3090 at only 1080p, sometimes even with upscaling enabled. Forced into being CPU bound.

Standard testing methodology for CPU gaming performance. So yes, the results shown are valid.
 

Mr.Phoenix

Member
I told everyone that dragon's dogma 2 was gunning for ray-traced GI on consoles, just like jedi. it is why it is extremely CPU bound. remove ray tracing and you will get a decent 45+ fps experience on a 3600 and the like
Let's just act like the PS5 Pro doesn't have 2-4x faster RT than the PS5...
 

skit_data

Member
DF have been oddly skeptical towards the idea of a PS5 Pro for a number of strange reasons that never come up when similar stuff should apply (supposedly more work for developers, price point too high for most people, etc.)

Can't say I care much about their opinions/speculation here either.
 

JackMcGunns

Member
Hilarious to think people would expect a CPU upgrade considering compatibility, and games being able to run on both systems without a huge disparity...



What you said is true, but it's only really an issue for the lower-powered one; the CPU has to be at least as powerful for there to be compatibility with games. You can easily scale GPU-related tasks, but when it comes to collision, AI, triangle setup, particles and other CPU-related tasks, you can't easily scale them without changing the core of the game. That being said, if you design a game around the XSX/PS5 CPU, it can easily be played on a high-end PC.

The point where it DOESN'T make sense for the PS5 Pro to have a much more powerful CPU is monetary: although it would help with framerates and future taxing games, no PlayStation game would take advantage of the extra power, because doing so would make the game difficult or even impossible to play on the base PS5. But at the end of the day, a more powerful CPU would help with better framerates and also future-proofing; you could literally transition to the next generation, or to the later stage of the gen when games go high-end PC, and keep up longer.
 

yamaci17

Member
Well in that case there is nothing I can say anymore.

What kind of CPU would have been enough then? To make up for devs' inability to optimize? Because even in this DF video, they talked about the ROG Ally, which uses a Zen 4 8c/16T (double that of the Steam Deck, which is on Zen 2), 24MB of cache and up to a 5GHz clock. And all that amounts to around 20% better performance than the same game on the Steam Deck.

So what should sony have done here? And at what cost? Cause if you are just clocking the CPU higher, to say 4.5GHz, that is useless if you do not also increase the cache. But say you use Zen 3/4: the cluster footprint is bigger than that of Zen 2, and again, you also have to add more cache too.
I mean, this example falls flat here. the rog ally / steam deck etc. are extremely GPU limited. you're talking about 4x ray tracing improvements, but then bring up the cpu not making a meaningful difference between the rog ally and steam deck. isn't the rog ally like a gtx 1650? of course it won't see much of a meaningful difference between a zen 4 cpu and a zen 2 cpu

sony should've at least had the decency to put a zen 3 in there. it would turn unstable 30 fps CPU-bound games into rock-solid 30 FPS modes and would allow a 40 FPS mode to be a thing, or maybe unlocked 45 FPS



(video timestamp 10:40)

regardless, notice how limiting the 3600 is for the 3060 Ti here in novigrad (witcher 3 with ray tracing) as well, and see how much zen 3 with higher clocks helps with that. this 3600 literally drops below 30 fps in novigrad with ray tracing. zen 3 is a must for ray tracing in 3rd party games, if you ask me.
 

Radical_3d

Member
for jedi survivor, they had to disable ray tracing. I can also get 60 fps without ray tracing

for a plague tale requiem, they literally reduced everything; even NPCs move like they're at 15 FPS now. and they reduced vegetation density etc. they practically turned the game into a lastgen game. I could also get 60 fps with such settings now (and they ported those settings to PC). I mean, I was already getting 60 fps just before the rats. but at least those rats were refreshing at the framerate of the game. now there is an option to make them low-FPS GIFs, which defeats the purpose of the game being "nextgen" with its rats. I would personally still play the game at those settings and lock it to 30 FPS. the whole game is about rats lmao, nerfing them defeats the whole purpose of targeting 30 fps on zen 2 to begin with.
Funny that you keep mentioning the GPU-generated rats animation. Not that the bump on the GPU side is that big either…

Edit: now that I think about it, reading this thread and watching that DF video feels as if I'm being astroturfed into focusing my attention on the CPU bump rather than being upset at the GPU one. ITT: Sony didn't bother to ramp the GPU up 2x, and mfs are expecting something from the more problematic CPU… soooo GAF…
 
DF have been oddly skeptical towards the idea of a PS5 Pro for a number of strange reasons that never come up when similar stuff should apply (supposedly more work for developers, price point too high for most people, etc.)

Can't say I care much about their opinions/speculation here either.

Most of us are so used to their horseshit that no one's even surprised anymore. "bUt iT dOesN't hAve vAriAbLE rAte sHadIng".
 
People were arguing PS5's custom geometry shaders are as good as VRS; what happens now?

You're two parts confused and three parts ignorant.

Firstly, VRS is shit. All implementations of Xbox's HW-accelerated VRS, outside of the Coalition's in the last Gears game, have been utter creamy horseshit. In fact, the COD devs produced better results with a purely software solution that runs just as well on the PS5 as on the XSX, making the VRS BS peddled by Xbots even more lol-worthy.

Secondly, VRS and geometry shaders are not even similar. They're completely different things. What I think you meant to say is that people argued that Sony's custom geometry shaders were equivalent to mesh shaders... and they would be right, because they are. They both provide full programmability in the geometry pipeline. Every piece of technical documentation that describes what Sony's geometry shaders and mesh shaders do shows that they achieve the same thing, just with two slightly different approaches.
 