
Digital Foundry: Cyberpunk 2077: PC Benchmarks Running on PS5 and Xbox Series X - So What Do They Do?

SmokSmog

Member
Basic math, but 45fps is 75% of 60fps whereas 1008p is 87.5% of 1152p. So it still doesn't account for all of those lost frames.

Correct basic math 😉
1152p has 1.1428× as many pixels as 1008p per axis,
but an image has two axes,
so 1.1428 × 1.1428 ≈ 1.306.
XSX is pushing ~30% more pixels in the best-case scenario.
1/1.306 ≈ 0.77,
so PS5 does ~77% of the pixels of XSX in the worst-case scenario.
Also, PS5 has a higher pixel fill rate because it has the same number of ROPs as XSX but at higher clocks.
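The math above can be sketched in a few lines (a quick illustrative check, assuming both resolutions share the same aspect ratio):

```python
# Per-axis ratio must be squared because the image scales on both axes.
def pixel_ratio(height_a: int, height_b: int) -> float:
    """Total-pixel ratio between two resolutions with the same aspect ratio."""
    return (height_a / height_b) ** 2

xsx_vs_ps5 = pixel_ratio(1152, 1008)
print(f"XSX pushes {xsx_vs_ps5:.3f}x the pixels")           # ~1.306x
print(f"PS5 renders {1 / xsx_vs_ps5:.1%} of XSX's pixels")  # ~76.6%
```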


So it makes sense why the Xbox performs comparatively poorly here.
 

DanielG165

Member
It's crazy how dependent both consoles are on software and specific engines. Alan Wake 2 clearly performed better on Series X, whereas in this specific benchmark test for 2077 the PS5 has the advantage, yet both consoles apparently perform the same in Phantom Liberty. You would think that after three, going on four, years of these machines being out, the notion of them being extremely close and going tit for tat would be firmly established.

One week: The Series X takes the edge in a certain game.

Another week: The PS5 has the advantage in a different game.

The next week: Both are performing the exact same in another game.

This has, and will always be the story between these two specific consoles for the rest of their lives, barring the PS5 Pro.
 

Dr.D00p

Member
One week: The Series X takes the edge in a certain game.

Another week: The PS5 has the advantage in a different game.

The next week: Both are performing the exact same in another game.

This has, and will always be the story between these two specific consoles for the rest of their lives, barring the PS5 Pro.

It's simply a case of what the specific game engine prefers and how it uses the available resources.

PS5 with its higher GPU clock speeds but fewer compute units, or the Series X with more compute units but much lower GPU clocks.

PS5 is the market leader, so developers are programming to the PS5's strengths rather than the Series X's.
 

SKYF@ll

Member
It's simply a case of what the specific game engine prefers and how it uses the available resources.

PS5 with its higher GPU clock speeds with fewer compute units or the Series X with more compute units but much slower GPU clocks.

PS5 is the market leader so developers are programming to the PS5 strengths rather than the Series X.
The benchmark score of a 4800S (CPU) plus RX 6700 (2.23GHz) was almost the same as the PS5's.
You can see that this game does not use the PS5's custom I/O (HW decompress and DMA on SRAM).
I think it's a straightforward port of the PC version.
 

ChiefDada

Gold Member
I think the PS5/6700 analysis gives us a rare opportunity to compare memory subsystems of both GPUs that are identical in many ways. Specifically AMD Infinity Cache vs PS5 cache coherency engines, both of which are ultimately different solutions addressing the same problem. I have always wondered which approach was the most performant.

The 1008p-1440p range is where Infinity Cache really shines, yet the PS5 manages to come out on top. Perhaps we have our answer, or at least a data point we can reference in the future. Just a thought.
 
So in conclusion, the major bottleneck this gen on PC is the CPU, not the GPU. Even in Avatar, the RTX 2070 Super was ahead of the PS5 in performance, and it beat it even though DF could not lower the PC settings to make an apples-to-apples comparison with the PS5. When PS5 games are ported to PC, people are not aware that the major issues are VRAM and the CPU bottleneck caused by API changes, the OS, and driver overhead.

DF said yesterday that "all third-party games are made on PC first now and then ported to consoles later", so the PS5 is clearly enjoying this advantage over the Xbox Series X even though developers work in DX12 and then port the games to PS5. Clearly, something is wrong on the Xbox Series end.
 
Last edited:

Lysandros

Member
As the analysis emphasizes, this is mainly a GPU benchmark. The PS5 actually having a faster GPU in quite a few ways (not only pixel fill rate) compared to the XSX is apparently still new info for some.

Both being 2 SE/4 SA designs, the PS5's ~20% higher frequency lifts the whole GPU's fixed-function throughput ahead of the XSX's by that amount, including primitives generated and rasterized per second, command processing, async scheduling, triangle culling, etc. The higher frequency and fewer CUs per SA also give each PS5 CU 60% more L1 cache bandwidth to play with, not to mention the more efficient consumption enabled by the custom cache scrubbers. Yes, it is slightly slower in other ways such as compute and texel fill rate, but even in those particular metrics the efficiency factor can muddy the waters.

In the end, those systems' GPUs are evenly matched based on their hard specs, with slightly different strengths and 'weaknesses'. Depending on engine specifics and workflow, each can pull a bit ahead; that is to be expected. Most titles run virtually identically, and the overall difference favoring either system is usually very modest anyway.
 
Last edited:

SKYF@ll

Member
As the analysis emphasizes, this is mainly a GPU benchmark. PS5 having actually a faster GPU in quite a few ways (not only limited to pixel fill rate) in comparison to XSX is still new info for some apparently.

Both being 2 SE/4 SA designs, PS5's ~20% higher frequency increases the whole GPU fixed function throughputs ahead of XSX' by that amount, including primitives generated and rasterized per sec, command processing, ASYNC scheduling, triangle culling etc. The higher frequency and less CUs per SA also gives each PS5 CU 60% more L1 cache bandwidth to play with not to mention more efficient consumption enabled by the custom cache scrubbers. Yes, it is slightly slower in other ways such as compute and texel fill rate but even in those particular metrics the efficiency factor can muddy the waters.

In the end those system's GPUs are evenly matched based on their hard specs with slightly different strengths and 'weakness'. Depending on engine specifics and work flow each can pull a bit ahead, that is to be expected. Most titles run virtually identical and overall difference favoring each system is usually very modest anyway.
"on a wide highway" First benchmark and least demanding one: Series X drops 36/3600 frames (1%), PS5 25/3600 (0.7%)
"going into the city" Second benchmark: Series X drops 74/3100 frames (2.4%), PS5 34/3100 (1.1%)
"night time race with combat" Third benchmark: Series X drops 1264/15500 frames (8.2%), PS5 32/15500 (0.2%)

The first and second benchmarks are almost equivalent.
The "night time race with combat" benchmark is what caused the XSX frame rate to drop.
In other games, depending on the scene, there are often situations where there is no difference between PS5 and XSX, and others where there is a big difference.
Let's continue to look at what each console is good at.
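The dropped-frame counts above can be turned back into percentages with a quick sketch (frame counts as quoted in the post):

```python
# Dropped frames per benchmark run, as quoted: (total frames, XSX drops, PS5 drops).
benchmarks = {
    "wide highway":                (3600, 36, 25),
    "going into the city":         (3100, 74, 34),
    "night time race with combat": (15500, 1264, 32),
}

for name, (total, xsx_drops, ps5_drops) in benchmarks.items():
    xsx_pct = 100 * xsx_drops / total
    ps5_pct = 100 * ps5_drops / total
    print(f"{name}: XSX {xsx_pct:.1f}% dropped vs PS5 {ps5_pct:.1f}%")
```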
 

ChiefDada

Gold Member
What a weirdly put together sentence….Since when do mid gen consoles come out at the start of a new generation? Put down whatever you been smoking. 🤣🤣🤣🤣

He is referencing John's (DF) quote of what Microsoft allegedly told them regarding mid-gen upgrade console expectations during the SX/SS launch period. Microsoft is the original source of the idiotic statement.

 
Last edited:

Lysandros

Member
Kudos to DF - this is a methodically sound benchmark analysis compared to many of their other platform comparisons. I would like to see Richard spearheading more of these analyses.




It is indeed methodically sound. Yet he is still 100% into the "XSX should outperform PS5 based on what we know about the consoles" narrative framing, which is either very disingenuous or simply ignorant at a basic level. No, Richard, the XSX shouldn't outperform the PS5 across the board, hence it doesn't, and vice versa. Turning a blind eye to the PS5's GPU/hardware advantages will not erase them from reality. You were dead wrong in your original assessment, along with your team; just admit it and move on. Stick to analysis, drop the damage control. It is the fourth year, about time.
 
Last edited:

Skifi28

Member
It is indeed methodically sound. Yet, he is still 100% into "XSX should outperform PS5 based on what we know about the consoles" narrative framing, which is either very disingenuous or simply ignorant at a basic level. No Richard, XSX shouldn't outperform PS5 across the board, hence it isn't, and vice-versa. Turning a blind eye to PS5 GPU/hardware advantages will not erase them from reality. You were dead wrong in your original assessment just admit and move on. Stick to analysis drop the damage control. It is the fourth year, about time.
When the Series X is ahead: "as expected". When it's behind: "something is wrong."

I won't call anyone a shill or anything, I've always respected Rich. Still, it's baffling to see after 3 years of nearly identical results. At what point do we shelve theoretical metrics and just focus on what's actually there? Kudos to MS for their marketing, I suppose.
 

RickMasters

Member
He is referencing John's (DF) quote of what Microsoft allegedly said to them in regards to mid-gen upgrade console expectations during the SX/SS launch period. Microsoft is the original source of the idiotic statement.


….. some people love to quote Phil as gospel when it suits them round here. Hard to tell when people are joking, being sarcastic or being serious. I dunno…. I don't cling to what these execs say the way some do round here. I forgot they even said that. And I'm an actual Xbox owner 🤷🏾‍♂️🤷🏾‍♂️🤷🏾‍♂️

Maybe if I cared more I would 'get' their sense of humour? Personally, Dave Chappelle, Bill Burr and Katt Williams are more my idea of funny.

So err…… you were really rolling around on the floor with laughter over this type of stuff?


 

RickMasters

Member
He is referencing John's (DF) quote of what Microsoft allegedly said to them in regards to mid-gen upgrade console expectations during the SX/SS launch period.


Curious…. And what happens when they do a mid-gen refresh? I'd say it's a given we hear something about that soon enough. You guys seem to forget the staggered start to this gen…. Covid….. cross-gen games being a thing far longer into the gen as a result… this gen has not exactly been as straightforward as previous ones.


A lot of what was said back then by both companies has aged badly
 

Lysandros

Member
Shouldnt matter because the XSX OS already reserves 2.5 GB of ram and 1 CPU core. PS5 reserves 3.5 GB and 1.5 CPU cores according to DF.
The source for this info from 2020 was Alex ('the' person to be trusted on anything related to PlayStation hardware...). In all likelihood it is inaccurate/outdated. Much more recently, we had the Immortals of Aveum developer stating that the PS5 offered more usable RAM than the XSX for their game, and that they bumped up a few settings accordingly. (The PS5 has 512 MB more memory on board to begin with, by the way.)
 
Last edited:
You can't reserve 1.5 cores of a CPU. If .5 of a core is "reserved", that means 2 cores are affected; you can't say you get 50% utilisation of a core. Alex doesn't know what he's talking about.
 
You can't reserve 1.5 cores of a CPU, if .5 of a core is reserved that means 2 cores are reserved, you can't say you can have 50% utilisation of this core, Alex doesn't know what he's talking about.
They can reserve 50% of a core's utilization time for the OS. Each frame, one CPU core is reserved for the OS for 8.3ms and then given to the game for the rest of the frame. It was also done like this on PS4.
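As a rough sketch of that scheme (assuming a 60 fps frame budget, which is not stated in the post):

```python
# One core's per-frame time is split between OS and game at a 60 fps cadence.
FRAME_MS = 1000 / 60    # ~16.67 ms frame budget at 60 fps (assumed)
OS_SLICE_MS = 8.3       # slice of the shared core handed to the OS each frame

os_share = OS_SLICE_MS / FRAME_MS
print(f"OS gets {os_share:.0%} of that core; the game gets the rest")  # ~50%
```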
 
They can reserve 50% utilization time to OS. Each frame one CPU core is reserved to OS for 8.3ms then given to game the rest of the frame. It was also done like this on PS4.

Like I said, you can't allocate 50% utilisation of a core. CPU cycles are not tied to frames, and it's also not efficient to take, say, 8.3ms of CPU time from a core, as it could end up dumping work in progress if it breaches that time.

The OS is allocated 2 threads of the CPU if SMT is enabled, or a single thread if not.
 

SlimySnake

Flashless at the Golden Globes
You can't reserve 1.5 cores of a CPU, if .5 of a core is reserved that means 2 cores are reserved, you can't say you can have 50% utilisation of this core, Alex doesn't know what he's talking about.
Every Zen 2 core has 2 threads. They can reserve 1 core with 2 threads plus 1 thread from another core.
 
Every zen 2 core has 2 threads. They can reserve 1 core with 2 threads and 1 thread from the other core.
Yes, and that only means there are two lines of communication to the core that won't interfere with one another. However, the core can only process one action at a time, so the core itself is either 100% reserved (allocated) or not, like the previous poster mentioned.

If the core could process two actions at the same time, then in theory, yes, you could reserve half a core.
 
Every zen 2 core has 2 threads. They can reserve 1 core with 2 threads and 1 thread from the other core.

The PS5 does use SMT by default, actually. I was confusing it with the XSX, which clocks down when SMT is enabled and only uses 8 cores by default, with SMT available to be toggled by developers.

But nevertheless, the PS5 allocates 1 core (2 threads) to the OS and 14 threads for game execution.
 
Last edited:

Lysandros

Member
I don't know how it fits to ongoing discussions but XSX CPU also reserves 10% of one core for I/O processing.

Edit: This is an addition to the separate issue of available CPU and memory resources for the systems.
This benchmark is a GPU one, as demonstrated in the video. The PS5 coming out on top should be related to the engine relying more on the throughputs where the PS5's GPU is comparatively faster; there are quite a few of them (I have an earlier post detailing this).
 
Last edited:

Vergil1992

Member
I don't know how it fits to ongoing discussions but XSX CPU also reserves 10% of one core for I/O processing.
Even if it were true (which I doubt, because when games are CPU-limited, their performance is practically identical), in this case I don't think it has much to do with it. You can also say that the XSX's CPU is slightly faster, so there could be a trade-off.

The XSS has a better framerate, and its CPU is exactly the same design, just clocked slower.

The framerate drops we see in this case are, without a doubt, due to the GPU. The question is whether it is due to the resolution difference between PS5-XSX, or simply XSX in this game underperforms. In my opinion, it is due to both factors.
 

Mr Moose

Member
Even if it were true (which I doubt, because when games are CPU-limited, their performance is practically identical), in this case I don't think it has much to do with it. You can also say that the XSX's CPU is slightly faster, so there could be a trade-off.

XSS has a better framerate and its CPU is exactly the same but slower.

The framerate drops we see in this case are, without a doubt, due to the GPU. The question is whether it is due to the resolution difference between PS5-XSX, or simply XSX in this game underperforms. In my opinion, it is due to both factors.
"It's less latent and it saves a ton of CPU. With the best competitive solution, we found doing decompression software to match the SSD rate would have consumed three Zen 2 CPU cores. When you add in the IO CPU overhead, that's another two cores. So the resulting workload would have completely consumed five Zen 2 CPU cores when now it only takes a tenth of a CPU core. "
 
Last edited:

Vergil1992

Member
Interesting. 10% of a single core is really very, very little. But are we sure that PS5 does not allocate any CPU resources to I/O processing?

Likewise, it would be a negligible difference. We cannot ignore that the XSX CPU is slightly faster as well (3-5%?). According to DF, PS5 has 6.5 cores available for gaming. Is XSX the same in this regard?




Anyway, it's a separate discussion. Cyberpunk's problem is clearly a GPU limitation.
 

Mr Moose

Member
Interesting. 10% of a single core is really very, very little. But are we sure that PS5 does not allocate any CPU resources to I/O processing?

Likewise, it would be a negligible difference. We cannot ignore that the XSX CPU is slightly faster as well (3-5%?). According to DF, PS5 has 6.5 cores available for gaming. Is XSX the same in this regard?




Anyway, it's a separate discussion. Cyberpunk's problem is clearly a GPU limitation.

By the way, in terms of performance, that custom decompressor equates to nine of our Zen 2 cores; that's what it would take to decompress the Kraken stream with a conventional CPU.
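Taking that figure at face value alongside the Road to PS5 throughput numbers (5.5 GB/s raw; "typically" 8-9 GB/s with Kraken, midpoint assumed here), the implied software cost works out roughly as:

```python
raw_gbs = 5.5        # PS5 SSD raw read rate (Road to PS5)
typical_gbs = 8.5    # midpoint of the quoted "typical" 8-9 GB/s Kraken output
zen2_cores = 9       # Cerny's software-equivalent core count

print(f"Effective compression ratio: ~{typical_gbs / raw_gbs:.2f}x")
print(f"~{typical_gbs / zen2_cores:.2f} GB/s of Kraken output per Zen 2 core")
```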
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I don't know how it fits to ongoing discussions but XSX CPU also reserves 10% of one core for I/O processing.
Nah, this game is 100% GPU-bound. They paired the 4800S desktop kit, which has the same CPU as the SX, with a 7900 XTX and saw not a single frame drop. Consoles have much less CPU overhead than PCs, so it's even less likely to be CPU-limited there, unless someone was majorly incompetent with the programming.
 
Last edited:

Vergil1992

Member

Apart from the fact that it does not address exactly what I asked, I believe that information is false, or at the very least inaccurate. Either it makes no sense or the Kraken decompression system isn't being used at all.

A conventional Ryzen 3600 runs almost on par with the PS5 in most games, even games like Spiderman or Ratchet. In the case of Spiderman, the PC version used the CPU for decompression tasks, which is why it ran slightly worse than the PS5 on a Ryzen 3600 (although after several patches, performance improved). But if what that quote says were true, a Ryzen 3600 couldn't come that close to PS5 performance; the impact on CPU performance would be massive.

Besides, we are talking (in the case of Spiderman Remastered) about an exclusive game designed for the PS5. In other cases, a 3600 always performs on par with or even better than the PS5, and it does not have any custom decompression system.

If that paragraph were true, the performance difference should be huge. I think it's typical exaggeration. I still remember when developers were saying that with DirectX 12 performance in games would increase by up to 300%, hahaha.
 

Mr Moose

Member
Apart from the fact that it does not address exactly what I asked, I believe that this information is false, or at the very least inaccurate. It doesn't make any sense or the Kraken decompression system isn't being used at all.

A conventional Ryzen 3600 runs almost on par with the PS5 in most games, even games like Spiderman or Ratchet. In the case of Spiderman, the game on PC used the CPU for decompression tasks, which is why it worked slightly worse than PS5 on a Ryzen 3600 (although after several patches, performance improved), but if what it says there was true, a Ryzen 3600 couldn't come that close to PS5 performance, the impact on CPU performance would be massive.

Besides, we are talking (in the case of Spiderman Remastered) about an exclusive game designed for PS5. In other cases, a 3600 always has performance on par or even better than the PS5. And it does not have any custom decompression system.


If that paragraph were true, the performance difference should be huge. I think it's typical exaggeration. I still remember when there were developers saying that with DirectX12 the performance in games would increase up to 300% hahaha.
It's from Mark Cerny. It's talking about decompression and removing bottlenecks.
 
Last edited:
Apart from the fact that it does not address exactly what I asked, I believe that this information is false, or at the very least inaccurate. It doesn't make any sense or the Kraken decompression system isn't being used at all.

A conventional Ryzen 3600 runs almost on par with the PS5 in most games, even games like Spiderman or Ratchet. In the case of Spiderman, the game on PC used the CPU for decompression tasks, which is why it worked slightly worse than PS5 on a Ryzen 3600 (although after several patches, performance improved), but if what it says there was true, a Ryzen 3600 couldn't come that close to PS5 performance, the impact on CPU performance would be massive.

Besides, we are talking (in the case of Spiderman Remastered) about an exclusive game designed for PS5. In other cases, a 3600 always has performance on par or even better than the PS5. And it does not have any custom decompression system.


If that paragraph were true, the performance difference should be huge. I think it's typical exaggeration. I still remember when there were developers saying that with DirectX12 the performance in games would increase up to 300% hahaha.
It addresses exactly your point.

The point is that the CPU in the PS5 is not used for compression or decompression; it has dedicated silicon for that function.

Kraken is used in conjunction with the dedicated unit. They covered all of this in Road to PS5. I recommend you rewatch it; it will answer all your questions.
 

Vergil1992

Member
It exactly address your point.

The point is the CPU in PS5 is not used to compress or decompression and it has dedicated silicon for the function.

Kraken is used in conjunction with the dedicated unit. They covered all of this in road to PS5. I recommend you rewatch it, it will answer all your questions.
I'll spend a little time on it later, but that's not exactly the point.

One paragraph says that Kraken's workload is equivalent to 5 full Zen 2 cores. That is, if you use Kraken intensively for decompression and then take the game to a platform with no custom decompression system, in theory the impact on performance should be absolutely massive if this were true.

Speaking more clearly: on PS5, Spiderman Remastered uses the Kraken system for asset decompression. When they made the PC version, with a CPU similar in performance to the PS5's, performance is very close to the PS5. But if the Kraken loading work really could occupy 5 Zen 2 cores, the game's performance would be a disaster on a processor so similar to the PS5's.

Everything we have seen leads to the conclusion that having a custom decompression system like Kraken is very interesting and can gain you performance (because the CPU and/or GPU are freed from that task), but not as much as it says there. On PC there is no Kraken, and DirectStorage didn't even work until recently, yet the impact is not especially severe when decompression is done by the CPU or GPU.

I don't know if I'm explaining myself well, and I don't claim to have the absolute truth; I have a lot to learn and I appreciate your efforts. But I still can't find an explanation for what I'm saying.
Likewise, let's assume that all of this is true, but...


- Cyberpunk's problem seems to be a GPU limitation, not a CPU one.

- 10% of one core (out of 8) is almost nothing at the performance level; it is not very different from the advantage the XSX has in frequency (its CPU can be ~5% faster in general).


I don't think there's anything to debate about the CPU here. The question is what is causing the difference in GPU performance. One possible explanation is the resolution difference, but personally I don't think it explains the +10fps gap, at least not on its own.
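Putting those two CPU deltas side by side (clock figures are the publicly stated 3.66 GHz for the XSX with SMT and up to 3.5 GHz for the PS5; treat this as a rough sketch):

```python
# Comparing the I/O core reservation against the XSX's clock advantage.
reserved_fraction = 0.10 / 8        # 10% of one core, out of eight cores
clock_advantage = 3.66 / 3.5 - 1    # XSX (SMT) vs PS5 peak CPU clock

print(f"I/O reservation: ~{reserved_fraction:.2%} of total CPU time")
print(f"XSX clock advantage: ~{clock_advantage:.1%}")
```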
 

Mr Moose

Member
I'll spend a little time later, but that's not exactly the point.

In one paragraph it says that Kraken's workload is equivalent to 5 full Zen2 cores, that is, if you intensively use Kraken to decompress and then take it to a platform where there is no custom decompression system, in theory the impact on the Performance should be absolutely massive if this were true.


Speaking more clearly: On PS5 Spiderman Remastered uses the Kraken system for asset decompression. When they made the PC version, when you put a CPU similar in terms of performance to the PS5 CPU, the performance is very close to PS5, but if really the Kraken loading work could occupy 5 Zen2 cores, the game performance would be a disaster in a processor so similar to the PS5.


Everything we have seen leads us to the conclusion that it is true that having a custom decompression system like Kraken is very interesting and you can gain performance (because the CPU and/or GPU would not have that task), but not as much as it says there. . On PC there is no Kraken, not even DirectStorage worked until recently, and the impact is not especially severe when decompression is done by the CPU or GPU.


I don't know if I'm explaining myself well and I don't claim to have the absolute truth; I have a lot to learn and I appreciate your efforts. But I still can't find an explanation for what I'm saying.
Likewise, we are going to assume that all this is true, but...


- Cyberpunk's problem seems to be a GPU limitation, not a CPU one.

- 10% of a core (8 cores) is "almost nothing" at the performance level, it is not very different from the advantage that XSX has in frequencies (its cpu can be 5% faster in general).


I don't think there's anything to debate about the CPU here. The question is what is causing the difference in GPU performance. One possible explanation is the difference in resolution, but I don't think it explains the +10fps difference, personally, at least on its own.
I don't think it's always decompressing everything, and when it does, it doesn't take up CPU resources, because it has its own hardware to deal with that stuff.
You might be overthinking it; it's just about loading crap into memory from the SSD. It's not like the console gains extra CPU power, it just doesn't need to use any of the main CPU for those tasks.
 
Last edited:

Vergil1992

Member
I don't think it's always decompressing everything, and when it does it doesn't take up CPU resources because it has its own to deal with that stuff.
You might be overthinking it, it's just about loading crap into memory from the SSD. It's not like the console gains extra CPU power it just doesn't need to use any of the main CPU for those tasks.
Maybe, but I think he was exaggerating. I think having a decompression system helps, but I don't think it has that much of an impact on the CPU, and even less so on the GPU. Generally, these types of customizations have the purpose of helping the CPU/GPU, that is, freeing them from certain loads, but what is said in interviews tends to be greatly exaggerated compared to the final result.


It's my opinion, of course.
 

Mr Moose

Member
Maybe, but I think he was exaggerating. I think having a decompression system helps, but I don't think it has that much of an impact on the CPU and even less so on the GPU. Generally, these types of customizations have the purpose of helping the CPU/GPU, that is, freeing them from certain loads, but what is said in the interviews and the final result tends to be greatly exaggerated.


It's my opinion, of course.
Do you believe Microsoft?
"Plus it has other benefits," enthuses Andrew Goossen. "It's less latent and it saves a ton of CPU. With the best competitive solution, we found doing decompression software to match the SSD rate would have consumed three Zen 2 CPU cores. When you add in the IO CPU overhead, that's another two cores. So the resulting workload would have completely consumed five Zen 2 CPU cores when now it only takes a tenth of a CPU core. So in other words, to equal the performance of a Series X at its full IO rate, you would need to build a PC with 13 Zen 2 cores. That's seven cores dedicated for the game: one for Windows and shell and five for the IO and decompression overhead."
The PS5's is better than the one in the Xbox.
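The core counts in Goossen's quote can be added up as a quick sanity check (all numbers taken directly from the quote):

```python
# Goossen's PC-equivalent breakdown for matching Series X's full I/O rate.
game_cores = 7         # "seven cores dedicated for the game"
windows_cores = 1      # "one for Windows and shell"
decompress_cores = 3   # software decompression at the SSD rate
io_overhead_cores = 2  # I/O CPU overhead

total = game_cores + windows_cores + decompress_cores + io_overhead_cores
print(f"PC equivalent: {total} Zen 2 cores")  # 13, matching the quote
print(f"Hardware offload: 0.1 core vs {decompress_cores + io_overhead_cores} in software")
```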
 
Last edited:
I don't know how it fits to ongoing discussions but XSX CPU also reserves 10% of one core for I/O processing.

No, it reserves 1 core for non-game execution, of which 10% of the utilisation will be for I/O processing. The OS runs at the same level as the I/O, with games running on a level above in operation, because DirectX is not a low-level API; it runs above the OS.
 
I would say the engine is issuing jobs that are more sequential than concurrent in nature. This would benefit a faster GPU clock over more jobs executing at once (not everything can be parallelized to that extent unless it's strictly vector data that SIMD instructions can take advantage of).
 

sachos

Member
This is the type of like-for-like benchmark that I love from DF. I don't know why they stopped doing them in their more recent videos. They used to do long side-by-side benchmark comparisons; now they show one console at a time, first one console in one area and then the other in a slightly or totally different run.
 
Last edited:

Vergil1992

Member
This is the type of like for like benchmarks that i love from DF. I don't know why they stopped doing them in their more recent videos, they used to do long benchmarks side by side comparissons, now instead they show one console at a time, first one console in one area and then they show the other in a slightly or totally different run.
I agree; in that sense VG Tech is much better today. The problem is that they do very little analysis.



On the topic at hand, I think we think too much about hardware differences when, for example, Cyberpunk ran better on PS4 Pro than on One X:


You can see in the video how this affects the overall presentation, especially on PlayStation 4 Pro, but the upshot is that there are clear performance benefits. Sony's enhanced console always ran the game best, even beating out the more powerful Xbox One X.




PS4 Pro performs much better than Xbox One X in this game. We know that the power difference was large in favor of One X and the latter far outperformed the PS4 Pro in most cross-platform games.

It has a more powerful GPU across the board, a slightly faster CPU, more VRAM, more bandwidth, etc.

There is nothing at the hardware level that justifies this. By this I mean we can speculate, but in the end, it most likely comes down to the most important factor: software.
We have already seen that in each multiplatform the difference can vary.


Alan Wake 2 runs best on XSX.
Cyberpunk runs best on PS5.
Prince of Persia: The Lost Crown runs best on XSX.
Baldur's Gate 3 runs best on PS5.
The Talos Principle runs best on XSX.
Call of Duty MW3 runs best on PS5.


In some, the differences are relatively large (Alan Wake 2, Cyberpunk); in others, very small (COD); and we have examples of games with no clear winner (Robocop has a better framerate on XSX but reduced graphics settings, Avatar has a higher resolution on XSX but more stuttering and more noise, AC: Mirage has a higher minimum resolution on XSX but tends to drop frames more often...).

Even in the cases cited above there is debate, because, for example, Cyberpunk also runs at a slightly higher resolution on XSX, and according to NXGamer, COD: MW3 in 120fps mode runs better on XSX.


I PERSONALLY think that the XSX has a slight hardware advantage.

This explains why it more commonly runs at slightly higher resolutions, or why its DRS is less aggressive. But if the PS5 is the main console for game development and receives better treatment (optimization), then this is the result: a technical tie.
 
Last edited:

Mr Moose

Member
There is nothing at the hardware level that justifies this. By this I mean we can speculate, but in the end, it most likely comes down to the most important factor: software.
Apart from the resolution difference? And they improved performance with patches.
The One X tops out at 1800p, while the Pro maxes out at 1200p and also has a lower minimum res, shocking I know. A few games push for too high a resolution on One X.
 
Interesting. 10% of a single core is really very, very little. But are we sure that PS5 does not allocate any CPU resources to I/O processing?

Likewise, it would be a negligible difference. We cannot ignore that the XSX CPU is slightly faster as well (3-5%?). According to DF, PS5 has 6.5 cores available for gaming. Is XSX the same in this regard?




Anyway, it's a separate discussion. Cyberpunk's problem is clearly a GPU limitation.
Yes, we are sure, provided it's done properly and not using the PS4/PC-compatible pathway (CPU decompression).
 

Vergil1992

Member
Apart from the resolution difference? And they improved performance with patches.
One X tops out at 1800p, while Pro maxes out at 1200p and also has a lower lowest res, shocking I know. A few games push for too high res on One X.


In the DF video they thought the framerate difference was due to a CPU limitation, which does not make sense, because the One X in fact has a faster CPU than the PS4 Pro; they were not convinced that the limitation was the GPU.

Likewise, we have cases where the most powerful console "inexplicably" received a worse version. Mad Max and Assassin's Creed Unity performed better on Xbox One than on PS4. Borderlands 3 had better graphical settings on PS4 Pro than on One X.




If what we see here with Borderlands were in a current game, we would have several people on the forum explaining that it is due to "something" the PS5 does better. But this was One X vs PS4 Pro, and there is practically nothing the One X doesn't do better than the PS4 Pro. Yet the vegetation matched between One X and One S. Does the story sound familiar to anyone?


Monster Hunter World was also a problematic version on One X... less vegetation than on PS4 Pro, and resolution sometimes lower in some modes.

There were games that weren't "clear" wins for PS4 Pro; others had worse performance but much higher resolutions...

Even the original Xbox received some cross-platform games that were worse than on PS2!


Obviously the One X vs PS4 Pro situation is not the same as PS5 vs Series X. Right now two cross-platform games have been released that run better on XSX (Prince of Persia and The Finals). Avatar performed similarly (slightly better on XSX, perhaps?), COD: MW3 slightly better on PS5, The Crew performed better on PS5, and Alan Wake 2 performed better on XSX... the differences are never consistent between these machines.


Obviously, between One X and PS4 Pro the absolute winner in general terms was the former, but there were exceptions. And those exceptions happened even though the One X was much more powerful and the PS4 Pro was not the "main" development console. When we talk about PS5 vs Series X, we have to take into account many factors: the hardware differences are less significant, and Sony has an overwhelming sales lead in its favor, plus it does not have a less powerful version of its console. It is quite naive, in my opinion, to think these two facts have no relevance in cross-platform versions.
 
Last edited: