
[DF]: The Last of Us Part 1 PC vs PS5 - A Disappointing Port With Big Problems To Address

yamaci17

Member
The difference in performance probably has mainly to do with the console architecture being completely different from PC architecture. You're essentially trying to run a game coded for unified memory on a PC where every resource is split, which leads to a lot of copying back and forth. That not only makes things slower but also forces components to wait on each other, and it drastically increases the CPU load, since the CPU has to orchestrate all of this as well.

Unless they completely rewrite the engine for PC, it's going to run much worse on a PC with comparable specs. It takes a much stronger PC to reach parity.
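To make the quoted point concrete: on PC, getting an asset into video memory usually means the CPU copies it into a staging buffer in system RAM and then the GPU copies it again into VRAM, and the CPU has to schedule all of that. A minimal D3D12-flavoured sketch of those two hops (the device and command list are assumed to already exist; synchronization and error handling are left out):

```cpp
// Sketch of the PC upload path: CPU -> upload heap (system RAM) -> default heap (VRAM).
// On a unified-memory console the second hop, and the CPU work to orchestrate it,
// largely disappears.
#include <d3d12.h>
#include <cstring>

ID3D12Resource* UploadToVram(ID3D12Device* device,
                             ID3D12GraphicsCommandList* cmdList,
                             const void* cpuData, UINT64 size)
{
    D3D12_HEAP_PROPERTIES uploadHeap{ D3D12_HEAP_TYPE_UPLOAD };   // CPU-visible system RAM
    D3D12_HEAP_PROPERTIES defaultHeap{ D3D12_HEAP_TYPE_DEFAULT }; // GPU-local VRAM

    D3D12_RESOURCE_DESC desc{};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = size;
    desc.Height = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* staging = nullptr;
    ID3D12Resource* vramBuf = nullptr;
    device->CreateCommittedResource(&uploadHeap, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_GENERIC_READ, nullptr, IID_PPV_ARGS(&staging));
    device->CreateCommittedResource(&defaultHeap, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COPY_DEST, nullptr, IID_PPV_ARGS(&vramBuf));

    // Copy #1: the CPU writes the data into the staging buffer.
    void* mapped = nullptr;
    staging->Map(0, nullptr, &mapped);
    std::memcpy(mapped, cpuData, size);
    staging->Unmap(0, nullptr);

    // Copy #2: the GPU copies staging -> VRAM, which the CPU has to record and submit.
    cmdList->CopyBufferRegion(vramBuf, 0, staging, 0, size);
    return vramBuf; // staging must stay alive until the copy has executed
}
```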
Honestly, I don't care; what I'm trying to prove is that the 1.6 GB figure is fake and only promotes FUD. They should remove it ASAP. RDR2 was similar: it would claim "other apps" were using some weirdly high amount of VRAM, but the game would use all of the free available VRAM regardless.

What I'm trying to show is that you can perfectly well run the critical high-quality textures within an 8 GB budget even at 1440p, provided you have the performance headroom to do so. 1080p is a cakewalk regardless. You can push textures to 7.2-7.3 GB of VRAM usage as a casual user; if you disable Steam's hardware acceleration you can push it to 7.4-7.5 GB instead.

It's just a visual readout; it has no effect on how the game ACTUALLY uses memory. The game USES free memory if free memory is there.
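As an aside, anyone who wants to check what the OS actually reports for VRAM can do it themselves; the in-game "reserved for other apps" estimate is presumably derived from numbers like these. A minimal sketch using DXGI (Windows 10+, primary adapter only):

```cpp
// Query the OS-reported VRAM budget and current usage for this process.
// "Budget" is what the OS thinks the process can use right now; it is an
// estimate that shifts with what else is running, not a hard reservation.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    const double gib = 1024.0 * 1024.0 * 1024.0;
    std::printf("OS budget for this process: %.2f GiB\n", info.Budget / gib);
    std::printf("Currently used:             %.2f GiB\n", info.CurrentUsage / gib);
    return 0;
}
```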
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Hogwarts and TLOU show far better frame-time consistency with 32GB. Check out some YouTube videos comparing them; 1% and 0.1% lows are up quite a bit.

I know these are considered unoptimized games, but it could also be a glimpse into the future.
Outliers do not the norm make.
And I'd need to actually retest Hogwarts since the newer patches, but earlier it was impossible to tell whether the hitches were due to RAM or the game just deciding it doesn't like what's going on.


Now I'm not denying we will get there... it's inevitable.
But as of right now there are maybe three outlier games that would "need" 32GB, yet run near perfectly on 16GB.
Around the XB1/PS4 era, basically every game released suddenly needed 16GB and the outliers were the ones that ran well on 8GB... that moment hasn't happened yet for 24GB, let alone 32GB.

But if I were building a system today, I would for sure jump on 32 or 48GB of RAM.
If PC users are running the game in accordance with the spec recommendations Naughty Dog released and they are still having performance issues, then that is absolutely on Naughty Dog and they should fix it ASAP. But which specs are you referring to? Because the console has an edge over PC in terms of memory management; that's just a fact. Maybe positions will change in the future, but for now this is where we are. The issue is that many of them brushed off the spec requirements and assumed CPU/GPU is all that matters, like the guys below (I won't tag them directly because I don't want them to think I'm picking on them or attacking them):


And now they are arguing that X amount of GB or their chosen PC GPU should be enough to run the game on their preferred settings (which is often ultra), since it was enough in the PS4/XBONE generation. As if they know better than the developers what their game should require. To me, this mentality reeks of arrogance.

I'll mark this post so I can return to it once they have actually fixed the game.






P.S. Never be afraid to tag me; I'm not scared to be wrong, and PC discussions are my favorite on this forum... so please don't leave me out, regardless of whether you think I'm talking out of my ass or agree with my sentiments.
 

Kataploom

Gold Member
A Plague Tale: Requiem is a PC game ported to consoles; using the PC version as the base is easier than the other way around.

TLOU is optimized for the PS5, which has a totally different, more efficient overall architecture. It's way harder to port that to PC efficiently and effectively than the other way around.
The problem with that logic is that they should also optimize it for PC, just as they do on consoles. It's not magic; it's easier once you have your tools set up for each architecture, but that's clearly not the case here. As you mentioned above, the memory management system needs to be redone completely. It's clearly a design flaw not to account for the platform they're porting to, and that's just the heavy resource usage; the bugs and glitches are another issue.
 

yamaci17

Member
Outliers do not the norm make.
And I'd need to actually retest Hogwarts since the newer patches, but earlier it was impossible to tell whether the hitches were due to RAM or the game just deciding it doesn't like what's going on.


Now I'm not denying we will get there... it's inevitable.
But as of right now there are maybe three outlier games that would "need" 32GB, yet run near perfectly on 16GB.
Around the XB1/PS4 era, basically every game released suddenly needed 16GB and the outliers were the ones that ran well on 8GB... that moment hasn't happened yet for 24GB, let alone 32GB.

But if I were building a system today, I would for sure jump on 32 or 48GB of RAM.


I'll mark this post so I can return to it once they have actually fixed the game.






P.S. Never be afraid to tag me; I'm not scared to be wrong, and PC discussions are my favorite on this forum... so please don't leave me out, regardless of whether you think I'm talking out of my ass or agree with my sentiments.
You can see the problems and attitude with 16 GB users. Many had "please wait" prompts while playing the TLOU remake. I did not get any, not once. I always keep my system clean and have an idle RAM usage of 2.5 GB thanks to clearing unnecessary working-set allocations by other apps.


Practically shut down all programs except the game launcher, clean the working sets with RAMMap, and voilà, you have 2.5 GB of idle RAM usage. The game will use 11-12 GB of RAM and will run without "please wait" prompts or big stalls and stutters.

Most people I see online have an idle RAM usage of 6-8 GB just from running "Windows" with nothing open, somehow. Those people will naturally have to upgrade to 32 GB; not much can be done on that front.

Even after explaining this stuff, they will come out and say, "He's a 16 GB defender, he recommends 16 GB to people in 2023!1" Heck no, I'm not. I just happen to have 16 GB and I'm not upgrading; I don't want to invest further in DDR4, and I don't have endless pockets either, so I will push it as far as I can. If you're building a new system, shoot for 32 GB without a doubt. If you're stuck with 16 GB, turn off unnecessary stuff, clean the working sets, get your idle RAM usage as low as possible, and enjoy the game. Otherwise, upgrade to 32 GB if you want medium to heavy multitasking. (16 GB still gives me light multitasking; I can still browse.)
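For anyone curious, this is roughly what RAMMap's "Empty Working Sets" button does: ask Windows to trim each process's working set so idle pages get paged out. A hedged sketch (it needs to run elevated to touch most processes, and whether trimming actually helps a given game is debatable, since trimmed pages may simply be faulted straight back in):

```cpp
// Trim the working set of every process we can open, similar in spirit to
// RAMMap's "Empty Working Sets". Processes we lack access to are skipped.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

int main() {
    DWORD pids[4096];
    DWORD bytes = 0;
    if (!EnumProcesses(pids, sizeof(pids), &bytes)) return 1;

    const DWORD count = bytes / sizeof(DWORD);
    int trimmed = 0;
    for (DWORD i = 0; i < count; ++i) {
        HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_SET_QUOTA,
                               FALSE, pids[i]);
        if (!h) continue;                    // no access (system/protected process)
        if (EmptyWorkingSet(h)) ++trimmed;   // ask the OS to trim this process
        CloseHandle(h);
    }
    std::printf("Trimmed working sets of %d processes\n", trimmed);
    return 0;
}
```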
 
Honestly, I don't care; what I'm trying to prove is that the 1.6 GB figure is fake and only promotes FUD. They should remove it ASAP. RDR2 was similar: it would claim "other apps" were using some weirdly high amount of VRAM, but the game would use all of the free available VRAM regardless.

What I'm trying to show is that you can perfectly well run the critical high-quality textures within an 8 GB budget even at 1440p, provided you have the performance headroom to do so. 1080p is a cakewalk regardless. You can push textures to 7.2-7.3 GB of VRAM usage as a casual user; if you disable Steam's hardware acceleration you can push it to 7.4-7.5 GB instead.

It's just a visual readout; it has no effect on how the game ACTUALLY uses memory. The game USES free memory if free memory is there.
Yep, and I said you're right when the game is built from the ground up for PC.
 

winjer

Member
The 3600X is a six-core CPU, and the CPU in the PS5 is an eight-core CPU.

The PS5 reserves 1C2T for OS operations.
Also, the CPU in the PS5 only has 4MB of L3 cache per CCX, while the 3600X has 16MB.
And the 3600X uses DDR4, which has much lower latency: around 75ns, or around 65ns with tuned RAM.
The latency of GDDR6 on the PS5 is around 140ns.

So, because of the smaller L3 cache, cache misses are more common on the PS5. And because of the memory latency, whenever there is a cache miss, the penalty is huge.
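To put rough numbers on that, using the standard average-memory-access-time formula (the L3 hit time and the miss rates below are illustrative guesses, not measurements; only the ~75ns and ~140ns figures come from the post above):

```cpp
// Back-of-the-envelope AMAT = hit_time + miss_rate * miss_penalty.
// Assumed values: ~10ns L3 hit, 5% miss rate with a big L3 vs 10% with a small one.
#include <cstdio>

int main() {
    const double l3_hit_ns = 10.0;                     // assumed L3 hit latency
    const double pc_amat  = l3_hit_ns + 0.05 * 75.0;   // DDR4 at ~75ns
    const double ps5_amat = l3_hit_ns + 0.10 * 140.0;  // GDDR6 at ~140ns
    std::printf("3600X-ish average access: %.1f ns\n", pc_amat);   // ~13.8 ns
    std::printf("PS5-ish average access:   %.1f ns\n", ps5_amat);  // ~24.0 ns
    return 0;
}
```

Even with made-up miss rates, the combination of a smaller L3 and roughly double the miss penalty compounds quickly.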
 
Last edited:

Lysandros

Member
And the 3600X uses DDR4, which has much lower latency. At around 75ns. Around 65ns with tuned ram. The latency of GDDR5 on the PS5 is around 40ns.
The PS5 uses GDDR6, not GDDR5. And you are actually stating that the PS5's RAM has lower latency here…
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You can see the problems and attitude with 16 GB users. Many had "please wait" prompts while playing the TLOU remake. I did not get any, not once. I always keep my system clean and have an idle RAM usage of 2.5 GB thanks to clearing unnecessary working-set allocations by other apps.


Practically shut down all programs except the game launcher, clean the working sets with RAMMap, and voilà, you have 2.5 GB of idle RAM usage. The game will use 11-12 GB of RAM and will run without "please wait" prompts or big stalls and stutters.

Most people I see online have an idle RAM usage of 6-8 GB just from running "Windows" with nothing open, somehow. Those people will naturally have to upgrade to 32 GB; not much can be done on that front.

Even after explaining this stuff, they will come out and say, "He's a 16 GB defender, he recommends 16 GB to people in 2023!1" Heck no, I'm not. I just happen to have 16 GB and I'm not upgrading; I don't want to invest further in DDR4, and I don't have endless pockets either, so I will push it as far as I can. If you're building a new system, shoot for 32 GB without a doubt. If you're stuck with 16 GB, turn off unnecessary stuff, clean the working sets, get your idle RAM usage as low as possible, and enjoy the game. Otherwise, upgrade to 32 GB if you want medium to heavy multitasking. (16 GB still gives me light multitasking; I can still browse.)
My media center is my old, old gaming rig... it's stuck with Sandy Bridge, a GTX 1070, and 16GB of RAM.
Every now and then I install games that supposedly kill 16GB of RAM on it just to see what will happen.
Even with background tasks still running (obviously not Chrome), Windows will basically push things to the background to let the game do the most with what it's got.
And while playing at 1080p in this day and age is a travesty... the GTX 1070 actually still has legs as long as you do some settings-fu.
 

Hugare

Member
You guys trying to make sense of this port, done by Iron Galaxy, the same guys who made the infamous Batman AK port, is just hilarious.

"A terrible port done by a studio that made one of the worst ports of all time is bad due to incompetence? No, of course not. It's the unified RAM or whatever."

Never change, GAF

PS: I find it disgusting how ND has been stealing credit from other studios lately so they can boost sales.
Part I was only partially done by them; most of the work was done by an internal Sony studio, and Bend also helped.

Now, this port wasn't done by them but by Iron Galaxy. For marketing reasons, though, they said ND developed it in-house.

Well, get fucked now, ND
 
Last edited:

yamaci17

Member
My media center is my old, old gaming rig... it's stuck with Sandy Bridge, a GTX 1070, and 16GB of RAM.
Every now and then I install games that supposedly kill 16GB of RAM on it just to see what will happen.
Even with background tasks still running (obviously not Chrome), Windows will basically push things to the background to let the game do the most with what it's got.
And while playing at 1080p in this day and age is a travesty... the GTX 1070 actually still has legs as long as you do some settings-fu.
Yup. My friend sent me a video of a 16 GB vs. 32 GB comparison on a 4090 with textures set to ultra; the game uses 15-17 GB of VRAM and the 16 GB setup is shitting the bed.

The game functions perfectly fine on my end with settings tailored for 8-10 GB of VRAM. People are unable to comprehend that.



See how 16 GB "is dead" in this scene on my end:

[screenshot]


wow dead.

Who pairs a 3090 or 4090 with 16 gigs? That's their problem; they never should've done that.

[screenshot]
 

ClosBSAS

Member

You haven't watched the video, have you?

I suggest you watch it. Unless you're suggesting a Core i9-12900K with an Nvidia RTX 4090 is not up to the task.
Nah son, 3080 Ti and 11700K here and it works just fine. Peasants with 8GB just can't play it.

In all seriousness though... yeah, I can play it just fine; I don't really care about the ones who can't.
 

DenchDeckard

Moderated wildly
I thought it was proven this was in-house? Didn't Neil Druckmann say it himself?

And even if it was Iron Galaxy (which I'm confused could be the case, given multiple reports of them providing only minimal support compared to ND), it was still ND and Sony's decision to release it in this state.

They deserve the heat.
 
I think the direct comparison they showed of PS5 vs PC without DLSS (fairer to the PS5), running at native 1440p, saw performance roughly 50% or more higher on the PS5. I don't know why they are surprised by this, as we've seen it numerous times in the past, particularly with PlayStation ports to PC. With games optimized for PlayStation, it's going to be very difficult to get the same utilization and efficiency out of similar HW on PC.


[comparison screenshot]
The PS5 has an I/O advantage. MS tries to close the gap on PC with its Velocity Architecture (DirectStorage, SFS, GPU decompression), but the PS5 (and the XSX as well) still has one BIG advantage: a dedicated decompression chip.

I thought GPU decompression would be cheap, but the DirectStorage GPU decompression demo shows insane GPU usage, so I doubt developers will want to use this feature for texture streaming during gameplay (people would get GPU-related stutters and big performance fluctuations during texture streaming). On PC it's a much better idea to preload more assets into VRAM to save CPU and GPU resources. The TLOU remake was built from the ground up for the PS5, and such a game will never scale the same when ported to PC.

The PS5 has 13GB of RAM available to developers, but I wonder how much memory Naughty Dog has allocated to the GPU. On the PS4, developers were allocating around half of the available memory to the GPU, and if that's also the case now, it means the remake uses around 7GB of VRAM on the PS5. For comparison, the PC version can use up to 14GB at 4K, and that's the price of loading everything into VRAM rather than using the SSD and a dedicated decompression chip to stream the data.
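For anyone curious what GPU decompression looks like on the PC side, here's a rough DirectStorage 1.1 sketch (the file name, sizes, and the pre-created D3D12 device and destination buffer are assumptions, and error handling is omitted). The point is that the GDeflate decompression step runs on the GPU itself, which is where the demo's GPU usage comes from:

```cpp
// Enqueue a GDeflate-compressed read through DirectStorage so decompression
// runs on the GPU. Assumes an existing ID3D12Device* and destination
// ID3D12Resource*; "asset.gdeflate" and the sizes are placeholders.
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void StreamCompressedAsset(ID3D12Device* device, ID3D12Resource* vramBuffer,
                           UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"asset.gdeflate", IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat   = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = compressedSize;
    request.Destination.Buffer.Resource = vramBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();  // the GPU performs the GDeflate decompression into vramBuffer
}
```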
 
Last edited:

ChiefDada

Member
The PS5 has an I/O advantage. MS tries to close the gap with the Velocity Architecture (DirectStorage, SFS, GPU decompression), but the PS5 still has one BIG advantage: a dedicated decompression chip.

Xbox has dedicated decompression hardware as well.

I thought GPU decompression would be cheap, but the DirectStorage GPU decompression demo shows insane GPU usage, so I doubt developers will want to use this feature for texture streaming during gameplay (people would get GPU-related stutters and big performance fluctuations during texture streaming). On PC it's a much better idea to preload more assets into VRAM to save CPU and GPU resources. The TLOU remake was built from the ground up for the PS5, and such a game will never scale the same when ported to PC.

I would love to see support for this, as it was my understanding that GPU decompression via the DirectStorage API had a very negligible impact on performance.
 
Xbox has dedicated decompression hardware as well.



I would love to see support for this, as it was my understanding that GPU decompression via the DirectStorage API had a very negligible impact on performance.
Of course it depends on your GPU. On my GTX 1080 I saw up to 50% GPU usage, and that's a lot, but I imagine something like an RTX 4090 can decompress the data with maybe a 10% performance hit.
 

BbMajor7th

Member
On balance, I think the issue here is the architectural difference. Whether or not you buy into the Cerny Secret Sauce, it's a fact that the PS5 uses bespoke architecture with low-level API utilisation. Games targeting bespoke hardware configs as primary platforms are going to do so at the engine level, and the code rewrites necessary to port them are going to be immense. This will either get worse, as developers juice the console hardware for everything it has (as with previous generations), or, if strategy changes, developers might roll back some of that bespoke coding to make the porting process more efficient and improve the overall quality of PC releases.

Reminds me of when ND first ported TLOU to PS4: even getting that base PS3 code, without any upgrades, running on vastly more powerful hardware was a titanic struggle because the ND engine at that stage had been tuned like crazy to the PS3 architecture. This isn't nearly as extreme, but it's quantifiable in the difficulties PC ports are consistently running into.
 
Last edited:

Thebonehead

Banned
Whoever approved this to be released in this state should be fired. Didn't the same guys port Batman AC?

Naughty Dog can't fire themselves.

Iron Galaxy only had a minor part in this, according to both themselves and Naughty Dog. Naughty Dog were the main developers of this shower of shit.
 
Last edited:

Mr.Phoenix

Member
Wait, are you saying this based solely on this PC port? They were never multiplatform/PC developers to begin with (they are now, "thanks" to Sony). That doesn't mean they aren't masters at getting the most out of a single console SKU; this goes all the way back to Crash 1 on the PS1. They have always pushed tech boundaries.
No, I am not saying they aren't good on the technical side of things; I mean, they have a lot to do with Sony's ICE team and whatnot. But I feel that ND are kind of artists first and engineers second, whereas studios like GG and/or Bluepoint are engineers first and artists second... if that makes any sense.

E.g., even with the PS4, ND was, I think, the only Sony dev to not even bother with CBR on the PS4 Pro and just settle for 1440p. And they did it again on the PS5 with this port: their performance mode is just straight-up 1440p, nothing wrong with that, but not what I would call technically adventurous or cutting edge.
 

blastprocessor

The Amiga Brotherhood
Wait, why does the PS5 need 13 seconds to load, with the miracle that its SSD is? Spider-Man loaded in two seconds in Sony's marketing stunt; what is going on here that it takes 13 seconds to fill up 13 gigs of RAM?
I would imagine that if you've got lots of smaller files to load/decompress versus a few larger files to load/decompress, the former is going to take longer to load. That would be my guess.

If you have a fast SSD unpack a zip that contains hundreds of small files, it really bogs down even the highest-bandwidth SSDs. My Samsung 990 Pro is super fast at copying big video files, however.
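If you want to see the effect yourself, a tiny timing sketch like this makes it obvious (the paths are placeholders, and a real benchmark would need to flush the OS file cache between runs): the same number of bytes split across thousands of small files pays per-file open/seek overhead that one big sequential read never does.

```cpp
// Time one large sequential read vs many small reads of (ideally) the same total size.
#include <chrono>
#include <filesystem>
#include <fstream>
#include <iostream>
#include <vector>

static size_t read_whole(const std::filesystem::path& p) {
    std::ifstream f(p, std::ios::binary);
    std::vector<char> buf((std::istreambuf_iterator<char>(f)),
                          std::istreambuf_iterator<char>());
    return buf.size();
}

int main() {
    namespace fs = std::filesystem;
    using ms = std::chrono::duration<double, std::milli>;

    auto t0 = std::chrono::steady_clock::now();
    size_t big = read_whole("big_archive.bin");            // one large sequential read
    auto t1 = std::chrono::steady_clock::now();

    size_t small = 0;
    for (const auto& e : fs::directory_iterator("small_files"))
        small += read_whole(e.path());                     // per-file open/close overhead
    auto t2 = std::chrono::steady_clock::now();

    std::cout << "large file:  " << big   << " bytes in " << ms(t1 - t0).count() << " ms\n";
    std::cout << "small files: " << small << " bytes in " << ms(t2 - t1).count() << " ms\n";
}
```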
 
Last edited:

Lysandros

Member
Xbox has dedicated decompression hardware as well.
Of course. But not an entire I/O complex with two co-processors, SRAM, a DMAC, and scrubbers informing the CEs, designed to remove every single bottleneck between storage and (V)RAM. The XSX's decompressor block is in itself also far less powerful. There is an undeniable hardware gap on the I/O side between the systems.
 
Last edited:
1- It's a bad port; drawing conclusions about console vs. PC power from it is absurd.

2- Whoever expects to get through the entire generation with 8 gigabytes of VRAM and play at high settings... no.

3- The port was made by Iron Galaxy (although ND and Sony are co-responsible); doubting ND technically at this point seems like a gigantic LOL to me.
 

ChiefDada

Member
Of course. But not an entire I/O complex with two co-processors, SRAM, a DMAC, and scrubbers informing the CEs, designed to remove every single bottleneck between storage and (V)RAM. The XSX's decompressor block is in itself also far less powerful. There is an undeniable hardware gap on the I/O side between the systems.

No disagreements from me here. The guy I was responding to specifically mentioned hardware decompression, but maybe he meant the entire I/O hardware. Just wanted to set the record straight.
 

platina

Member
A Plague Tale: Requiem is a PC game ported to consoles; using the PC version as the base is easier than the other way around.

TLOU is optimized for the PS5, which has a totally different, more efficient overall architecture. It's way harder to port that to PC efficiently and effectively than the other way around.
It really gets you thinking. With the whole ND-developing-on-PC initiative from here on, I can imagine they will tweak their engine to drop some of the "coding to the metal" optimizations for the PS5. For example, if we're getting this level of performance disparity now, I can't imagine them iterating this engine to require streaming at 5.5 GB/s as the bare minimum on PC, because you still need to factor in the low/medium-spec PCs most people have.
 

Tqaulity

Member
It would be fair if the port weren't pure garbage. As it is, there's no fair comparison.
See this is always the same excuse every time there is a PC port that doesn’t seem to meet expectations. At some point when it becomes the norm and not the exception to see poor PC ports you have to realize that maybe it’s not just the developer being incompetent. Maybe it’s really that porting to PC is f**kin difficult, more so now than ever before. And while everyone only wants to talk about how superior the specs are for high end PCs today, maybe we need to realize that a real core part of the PC experience (today) is development hurdles and system inefficiencies that make it likely that there will be issues and bugs at launch.

Callisto Protocol, Forspoken, Spider-Man, Horizon, Witcher 3, Hogwarts Legacy, Atomic Heart, Returnal, now TLOU… and the list goes on. Different engines, different developers, different budgets, similar results. It can't always just be "the developer sucks and doesn't know what they're doing" or "it's a crappy port." Yeah, it is a bad port, but there are real reasons for that which are inherent in the PC platform. OS, drivers, HW permutations, the open platform, and system architecture (separate CPU/GPU dies, separate memory pools, etc.) are all real aspects of PC development that are oftentimes expensive hurdles to overcome. And clearly many of these issues cannot be resolved by pure brute-force HW. Shader comp stutters, for example, and CPU-boundness (which makes all that 4090 GPU power somewhat wasteful) often limit the more powerful HW. It will really take new SW architecture for games to fix this, which takes a tremendous amount of resources.

Since the consoles are basically just PC parts today, the value of designing a system optimized for gaming has never been more apparent. Same general HW components, but they will often run far more efficiently and are just easier to utilize, thanks to the removal of many of the hurdles I alluded to. I work in the industry and I work with developers every day, and I can tell you they are frustrated by the PC issues as well. It usually comes down to resources (time, people, and money) as to why they cannot find and squash these issues before launch. With games advancing technologically, this will probably continue to get worse before it gets better. No wonder everyone is moving to UE5 😜
 

Senua

Gold Member
See this is always the same excuse every time there is a PC port that doesn’t seem to meet expectations. At some point when it becomes the norm and not the exception to see poor PC ports you have to realize that maybe it’s not just the developer being incompetent. Maybe it’s really that porting to PC is f**kin difficult, more so now than ever before. And while everyone only wants to talk about how superior the specs are for high end PCs today, maybe we need to realize that a real core part of the PC experience (today) is development hurdles and system inefficiencies that make it likely that there will be issues and bugs at launch.

Callisto Protocol, Forspoken, Spider-Man, Horizon, Witcher 3, Hogwarts Legacy, Atomic Heart, Returnal, now TLOU… and the list goes on. Different engines, different developers, different budgets, similar results. It can't always just be "the developer sucks and doesn't know what they're doing" or "it's a crappy port." Yeah, it is a bad port, but there are real reasons for that which are inherent in the PC platform. OS, drivers, HW permutations, the open platform, and system architecture (separate CPU/GPU dies, separate memory pools, etc.) are all real aspects of PC development that are oftentimes expensive hurdles to overcome. And clearly many of these issues cannot be resolved by pure brute-force HW. Shader comp stutters, for example, and CPU-boundness (which makes all that 4090 GPU power somewhat wasteful) often limit the more powerful HW. It will really take new SW architecture for games to fix this, which takes a tremendous amount of resources.

Since the consoles are basically just PC parts today, the value of designing a system optimized for gaming has never been more apparent. Same general HW components, but they will often run far more efficiently and are just easier to utilize, thanks to the removal of many of the hurdles I alluded to. I work in the industry and I work with developers every day, and I can tell you they are frustrated by the PC issues as well. It usually comes down to resources (time, people, and money) as to why they cannot find and squash these issues before launch. With games advancing technologically, this will probably continue to get worse before it gets better. No wonder everyone is moving to UE5 😜
Wait, what was wrong with Spider-Man, Atomic Heart, and Returnal? Atomic Heart was missing the advertised RTX, which is a piss-take, but it runs and looks great.
 

GHG

Member
The VRAM setup is insanity. You can't allocate MULTIPLE GIGABYTES of VRAM to Windows tasks and think that makes any sense.

Windows and other programs that you usually have open while playing games don't use any significant amount of video memory.
This is clearly just a copy-paste job of the PS5 allocation that wasn't properly adjusted for the PC port.

Look, maybe it is the case that this particular game does a poor job with memory management, but let's not pretend 8GB will be sufficient going forward for those wanting to game at higher resolutions. Here is the RE4 remake, a game that many in this thread have used as a point of reference for being a good port:




So if even the good ports are running out of VRAM on 8GB cards, and that becomes the primary bottleneck before anything else does, then what does that tell you?
 

Lasha

Member
How many games even use 20GB, let alone need 32GB?

Not many. Star Citizen will eat up 20+ if available, though Star Citizen is in development and poorly optimized. DCS can go over 32 on some of the big maps; DCS maps are enormous and coordinate a sim battlefield for 24-100 players. Some FS2020 add-ons will push usage over 20 due to the complexity of the sim. Not much else that I know of. The TLOU port's resource usage seems excessive by comparison.
 

Thaedolus

Member
See this is always the same excuse every time there is a PC port that doesn’t seem to meet expectations. At some point when it becomes the norm and not the exception to see poor PC ports you have to realize that maybe it’s not just the developer being incompetent. Maybe it’s really that porting to PC is f**kin difficult, more so now than ever before. And while everyone only wants to talk about how superior the specs are for high end PCs today, maybe we need to realize that a real core part of the PC experience (today) is development hurdles and system inefficiencies that make it likely that there will be issues and bugs at launch.

Callisto Protocol, Forspoken, Spider-Man, Horizon, Witcher 3, Hogwarts Legacy, Atomic Heart, Returnal, now TLOU… and the list goes on. Different engines, different developers, different budgets, similar results. It can't always just be "the developer sucks and doesn't know what they're doing" or "it's a crappy port." Yeah, it is a bad port, but there are real reasons for that which are inherent in the PC platform. OS, drivers, HW permutations, the open platform, and system architecture (separate CPU/GPU dies, separate memory pools, etc.) are all real aspects of PC development that are oftentimes expensive hurdles to overcome. And clearly many of these issues cannot be resolved by pure brute-force HW. Shader comp stutters, for example, and CPU-boundness (which makes all that 4090 GPU power somewhat wasteful) often limit the more powerful HW. It will really take new SW architecture for games to fix this, which takes a tremendous amount of resources.

Since the consoles are basically just PC parts today, the value of designing a system optimized for gaming has never been more apparent. Same general HW components, but they will often run far more efficiently and are just easier to utilize, thanks to the removal of many of the hurdles I alluded to. I work in the industry and I work with developers every day, and I can tell you they are frustrated by the PC issues as well. It usually comes down to resources (time, people, and money) as to why they cannot find and squash these issues before launch. With games advancing technologically, this will probably continue to get worse before it gets better. No wonder everyone is moving to UE5 😜
Every one of the issues you bring up can be addressed in ways that make the user experience not just OK but far superior to the console experience. Stuttering is a shader compilation issue that can be addressed by precompiling shaders; it's a stupid developer choice not to do so. APIs and drivers and whatnot have never been better than they are now on PC. I don't seem to have any issues with Game Pass releases, which kind of says to me that Microsoft's quality control is pretty good compared to any random developer on Steam.

I booted up my PS5 for the first time in a while last night to try the new GT7 patch… guess what? I had to rebuild the database because I guess the power bumped at some point, had to download a system update, and oh yeah, GT7 wasn't updated yet either, so there was a 13GB download that still takes forever on PSN for whatever reason, while Steam maxes out my Wi-Fi bandwidth. The PS5 is still a great piece of hardware, but let's not pretend that PC is some kind of shit show while consoles don't have their dumb issues too. And RE4 with M+KB > pleb controller.
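On the shader precompilation point: in D3D12 the usual fix is to build every pipeline state object up front, on the loading screen and across worker threads, instead of compiling the first time a material is drawn. A minimal sketch, assuming the engine can enumerate its PSO descriptions in advance (the descs vector here stands in for that):

```cpp
// Build all pipeline state objects during loading so nothing compiles mid-gameplay.
// ID3D12Device creation methods are free-threaded, so the work can be fanned out.
#include <d3d12.h>
#include <wrl/client.h>
#include <future>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12PipelineState>>
PrecompileAllPipelines(ID3D12Device* device,
                       const std::vector<D3D12_GRAPHICS_PIPELINE_STATE_DESC>& descs)
{
    std::vector<std::future<ComPtr<ID3D12PipelineState>>> jobs;
    jobs.reserve(descs.size());

    for (const auto& desc : descs) {
        jobs.push_back(std::async(std::launch::async, [device, &desc] {
            ComPtr<ID3D12PipelineState> pso;
            device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
            return pso;
        }));
    }

    std::vector<ComPtr<ID3D12PipelineState>> psos;
    psos.reserve(jobs.size());
    for (auto& job : jobs)
        psos.push_back(job.get());   // block here, on the loading screen, not in-game
    return psos;
}
```

Caching the results to disk (e.g. with a pipeline library) so later launches skip the compile entirely is the usual follow-up.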
 

01011001

Banned
Look, maybe it is the case that this particular game does a poor job with memory management, but let's not pretend 8GB will be sufficient going forward for those wanting to game at higher resolutions. Here is the RE4 remake, a game that many in this thread have used as a point of reference for being a good port:




So if even the good ports are running out of VRAM on 8GB cards, and that becomes the primary bottleneck before anything else does, then what does that tell you?


If you play RE4 on an 8GB card, you can play it with the High (2GB) texture setting and RT and you won't have any issues. These settings are broadly equivalent to what the Series X version looks like.

RE4 also isn't an especially great port either. The fact that it crashes so easily when running out of memory isn't good and shouldn't be how the game handles this.
 
Last edited:

Zathalus

Member
See this is always the same excuse every time there is a PC port that doesn’t seem to meet expectations. At some point when it becomes the norm and not the exception to see poor PC ports you have to realize that maybe it’s not just the developer being incompetent. Maybe it’s really that porting to PC is f**kin difficult, more so now than ever before. And while everyone only wants to talk about how superior the specs are for high end PCs today, maybe we need to realize that a real core part of the PC experience (today) is development hurdles and system inefficiencies that make it likely that there will be issues and bugs at launch.

Callisto Protocol, Forspoken, Spider-Man, Horizon, Witcher 3, Hogwarts Legacy, Atomic Heart, Returnal, now TLOU… and the list goes on. Different engines, different developers, different budgets, similar results. It can't always just be "the developer sucks and doesn't know what they're doing" or "it's a crappy port." Yeah, it is a bad port, but there are real reasons for that which are inherent in the PC platform. OS, drivers, HW permutations, the open platform, and system architecture (separate CPU/GPU dies, separate memory pools, etc.) are all real aspects of PC development that are oftentimes expensive hurdles to overcome. And clearly many of these issues cannot be resolved by pure brute-force HW. Shader comp stutters, for example, and CPU-boundness (which makes all that 4090 GPU power somewhat wasteful) often limit the more powerful HW. It will really take new SW architecture for games to fix this, which takes a tremendous amount of resources.

Since the consoles are basically just PC parts today, the value of designing a system optimized for gaming has never been more apparent. Same general HW components, but they will often run far more efficiently and are just easier to utilize, thanks to the removal of many of the hurdles I alluded to. I work in the industry and I work with developers every day, and I can tell you they are frustrated by the PC issues as well. It usually comes down to resources (time, people, and money) as to why they cannot find and squash these issues before launch. With games advancing technologically, this will probably continue to get worse before it gets better. No wonder everyone is moving to UE5 😜
Almost all of the games you mentioned were better on PC despite having issues. Let's not pretend console games haven't had issues either: Callisto Protocol on Xbox was a mess, Hogwarts Legacy's RT mode is basically unusable on consoles, and the image quality of Dead Space, Forspoken, and RE4 on PS5 at launch was terrible (in RE4's case it still is).
 

GHG

Member
RE4 also isn't an especially great port either. The fact that it crashes so easily when running out of memory isn't good and shouldn't be how the game handles this.

How else do you think games typically handle running out of VRAM? They either crash or stutter as the framerate goes through the floor. Take your pick.

Also, I think you should actually watch the video. The texture setting you described is exactly what he ends up using, but it doesn't look so hot while doing so.
 

01011001

Banned
How else do you think games typically handle running out of VRAM? They either crash or stutter as the framerate goes through the floor. Take your pick.

Also, I think you should actually watch the video. The texture setting you described is exactly what he ends up using, but it doesn't look so hot while doing so.

A game should never crash when running out of VRAM. Stutters are fine; you can adjust and dial things in on the fly. Crashing is unacceptable.

And the textures in the console versions don't look all that great at times either. The way the game handles texture streaming on all systems is weird and has a negative impact on the overall presentation.

I've seen multiple texture issues playing performance mode + RT on Series X.
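For what it's worth, "degrade instead of crash" is implementable: the allocation call reports failure and the game can react by dropping texture resolution rather than dying. A rough D3D12-flavoured sketch (simplified; a real streamer would evict or demote other resources first, and whether the runtime returns E_OUTOFMEMORY instead of silently spilling to system memory varies by driver and budget):

```cpp
// Retry a texture allocation at progressively lower resolutions instead of
// treating an out-of-memory result as fatal.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> CreateTextureOrDowngrade(ID3D12Device* device,
                                                D3D12_RESOURCE_DESC desc)
{
    const D3D12_HEAP_PROPERTIES heap{ D3D12_HEAP_TYPE_DEFAULT };
    for (int attempt = 0; attempt < 4; ++attempt) {
        ComPtr<ID3D12Resource> tex;
        HRESULT hr = device->CreateCommittedResource(
            &heap, D3D12_HEAP_FLAG_NONE, &desc,
            D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&tex));
        if (SUCCEEDED(hr))
            return tex;                 // fits within the current budget
        if (hr != E_OUTOFMEMORY)
            return nullptr;             // a different failure: give up

        // Out of memory: drop the top mip (halve each dimension) and retry.
        desc.Width  = desc.Width  > 1 ? desc.Width  / 2 : 1;
        desc.Height = desc.Height > 1 ? desc.Height / 2 : 1;
        if (desc.MipLevels > 1) --desc.MipLevels;
    }
    return nullptr;                     // caller falls back to a placeholder texture
}
```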
 

GHG

Member
And the textures in the console versions don't look all that great at times either. The way the game handles texture streaming on all systems is weird and has a negative impact on the overall presentation.

I've seen multiple texture issues playing performance mode + RT on Series X.

Well, that's the sticking point. The 3070 Ti is a GPU with more grunt than any of the consoles, that's not even up for debate; however, because of the VRAM situation, it's not able to run this particular game (the RE4 remake, and the game in question in this thread to a certain extent) at settings any better than the consoles'. That should not be happening; if that card is going to be bottlenecked, it should happen elsewhere, and it should happen when it's pushing things at a level above what the consoles are producing.

I sympathise with people when it comes to VRAM because, by and large, everyone is at the mercy of the GPU vendors and their decisions on memory configurations. What it often means is that you need to step up to higher-tier cards in order to get more VRAM. However, people who are purchasing cards with 8GB of VRAM or less with the mindset that it will carry them through this console generation are kidding themselves. If you must purchase a card with 8GB or less, then you need to be realistic.
 
Last edited:

GHG

Member
During travel, I too prefer the airplane to crash instead of dealing with annoying turbulence.

I prefer a smooth experience, preferably with my feet up and a beverage in hand. But that's why I fly A380 business.

I tried to tell you your little turboprop planes weren't going to be sufficient for future journeys; some of you didn't listen.



Strap yourselves in and enjoy your trip.

 
Last edited:

01011001

Banned
Well, that's the sticking point. The 3070 Ti is a GPU with more grunt than any of the consoles, that's not even up for debate; however, because of the VRAM situation, it's not able to run this particular game (the RE4 remake, and the game in question in this thread to a certain extent) at settings any better than the consoles'. That should not be happening; if that card is going to be bottlenecked, it should happen elsewhere, and it should happen when it's pushing things at a level above what the consoles are producing.

I sympathise with people when it comes to VRAM because, by and large, everyone is at the mercy of the GPU vendors and their decisions on memory configurations. What it often means is that you need to step up to higher-tier cards in order to get more VRAM. However, people who are purchasing cards with 8GB of VRAM or less with the mindset that it will carry them through this console generation are kidding themselves. If you must purchase a card with 8GB or less, then you need to be realistic.

The issue isn't VRAM sizes or GPU vendor choices, or even console performance comparisons.
The issue is that these VRAM problems come with ZERO improvement to the fidelity of games.

Suddenly, games with less ambitious RT implementations and blurry textures run out of VRAM and have CPU issues.
Meanwhile, an older game like Control, with real-time physics on every object in the environment and the full RT feature set, has no VRAM or CPU issues.
Ambitious games like Cyberpunk don't run out of VRAM and start crashing or stuttering on 8GB cards.

But here comes RE4, with blurry textures, barely interactive environments, and super low-end RT, and it has issues.
Same for TLOU.
 

GHG

Member
The issue isn't VRAM sizes or GPU vendor choices, or even console performance comparisons.
The issue is that these VRAM problems come with ZERO improvement to the fidelity of games.

Suddenly, games with less ambitious RT implementations and blurry textures run out of VRAM and have CPU issues.
Meanwhile, an older game like Control, with real-time physics on every object in the environment and the full RT feature set, has no VRAM or CPU issues.
Ambitious games like Cyberpunk don't run out of VRAM and start crashing or stuttering on 8GB cards.

But here comes RE4, with blurry textures, barely interactive environments, and super low-end RT, and it has issues.
Same for TLOU.

Nope, Cyberpunk had the same shit at launch. I played it on a 2070 Super and it would seize up and crash at random intervals after playing for 30 minutes.

It was a widespread issue at the time, and the ultimate cause of the crash? VRAM being maxed out due to a memory leak.



Google it and you will find plenty more examples of people talking about this.
 