
Hogwarts Legacy PC Performance Thread

Kabelly

Member
my build is an i5-12600K, 3080 12GB and 32GB of RAM. The game looks pretty crisp and is on DLSS, but I still get minor stutters running here or there. Hogsmeade was especially noticeable for me since it was the first time I visited the area. This was today with the new update as well. It's not super bad, as I'm used to playing on Switch and PS4 at 30fps. I put my settings on high as well, even though the in-game benchmark says I can run it ultra, which I can, but it never looks substantially different. I'm probably getting on average between 80-98 fps with DLSS (Quality, I think) on.

I have yet to try ray tracing; it'll probably tank performance.
 

sendit

Member
I appreciate that. Thanks for posting it.

I knew something wasn't right with ray tracing. Even on low, it fucking crushes FPS. I have to have DLSS on Max Performance for it to be playable, but that setting looks like shit, so I turned RTX off altogether, as the video suggests.

I hope they address these issues in a future patch. The game is too good to be hobbled by bad performance.
Ray tracing in general is broken in this game, especially reflections; it looks like a mixture of screen-space and ray-traced reflections.
 

DanEON

Member
Nope, the only stutters I have last microseconds, a bit more severe with RTX on but nowhere near that level. Still kind of annoying on a €2000 machine, but what can you do?

That shit is like bullet time from The Matrix...

But I have a 4080.
OK, so I bought the game again on Steam just to test it. With more testing I found that using High textures (everything else on Ultra + RT + DLSS Balanced + FG) the game runs fine (tested in Hogsmeade). No big drops like in the video you posted, just some micro-stutter here and there.
But with Ultra textures, frames would drop to 10-12 fps. It doesn't seem to be VRAM, because it was using 10 GB with 11 GB allocated. Maybe it's some RT + Ultra textures bug. With RT off I can run Ultra textures with no issues, even with DLSS Quality, which uses more VRAM.
 

zkorejo

Member
I just realized: if you don't mess with the settings, the game plays fine. If you fuck around with the settings, it starts dropping fps even if you go below the minimum requirements.

Changed the Engine.ini, updated the game yesterday and didn't touch the settings... it worked fine for me.
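For anyone curious, the Engine.ini tweak that circulated for this game is typically a handful of [SystemSettings] console variables appended to the user config. These are real UE4 cvars, but the selection and values below are illustrative community suggestions rather than official guidance, and reports on their effectiveness vary:

```ini
[SystemSettings]
; Raise the texture streaming pool (value in MB); size it to your card's VRAM
r.Streaming.PoolSize=3072
; Frame-pacing tweaks commonly suggested for UE4 stutter
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
```

As several posts later in the thread note, some of these may have been superseded by official patches.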
 

sertopico

Member
Just to get a clear picture here: did the people who suffered stuttering on high-end PCs (so 4000/7000 series owners) have THIS type of stuttering? (just watch the first part of the video)


On the 3080 on ultra it was happening the whole friggin' time. Cutscenes ran at 10 fps, lots of stuttering when going from one area to another. Now that I am playing on the 4090, almost all the issues have disappeared, obviously because of the higher VRAM amount and, above all, frame generation. BUT! I am still getting slowdowns when entering new areas; they're just much less noticeable because I am doing 100+ fps. I am also CPU bottlenecked, and that plays a huge role when it comes to stuttering.
 

GymWolf

Member
On the 3080 on ultra it was happening the whole friggin' time. Cutscenes ran at 10 fps, lots of stuttering when going from one area to another. Now that I am playing on the 4090, almost all the issues have disappeared, obviously because of the higher VRAM amount and, above all, frame generation. BUT! I am still getting slowdowns when entering new areas; they're just much less noticeable because I am doing 100+ fps. I am also CPU bottlenecked, and that plays a huge role when it comes to stuttering.
OK, so having some microstutter here and there with a 4080 is not a hardware problem on my PC; it is basically the game working at "its best".
 

sertopico

Member
OK, so having some microstutter here and there with a 4080 is not a hardware problem on my PC; it is basically the game working at "its best".
It is UE4, which has been, for me at least, a big disappointment most of the time. I hope SH will get their shit together when the first UE5 titles come around.

In any case, having a latest-gen CPU helps a lot in avoiding this kind of stuttering, regardless of whether you own a 3080 or a 4090.
 

GymWolf

Member
It is UE4, which has been, for me at least, a big disappointment most of the time. I hope SH will get their shit together when the first UE5 titles come around.

In any case, having a latest-gen CPU helps a lot in avoiding this kind of stuttering, regardless of whether you own a 3080 or a 4090.
I have a 13600K; not great, but surely no slouch.
 

hollams

Gold Member
Finally decided to upgrade the 8700K rig I've had since 2015, and I went with an AMD 7700. I was going to wait for the X3D chips, but when they do come out, if they are good they will be hard to get, and the demand could keep the price high for a while. Since I have a 4090 and play at 4K, it might not make a huge difference anyway.

With the 8700K I was getting around 60-70 fps with RT on, with some dips in the castle and in Hogsmeade. I was also getting slight delays in the map interface, and moving the cursor wasn't smooth. With the 7700 I'm at 90-115 in the same areas, and the map cursor is smooth now, which surprised me, though the PCIe and memory speed increases probably helped quite a bit too.
 

kittoo

Cretinously credulous
I think I am starting to realize that my 3080's 10GB really isn't enough anymore for ultra textures. After Dead Space, this is another game where ultra textures give huge drops every now and then.

Nvidia really should've put more VRAM in the 3080. I don't want to buy another GPU within 2 years, and I don't think the current ones are worth the price anyway. Oh well... I'll reduce texture quality for the next 2 years until the next generation of GPUs comes out.
 

yamaci17

Member
I think I am starting to realize that my 3080's 10GB really isn't enough anymore for ultra textures. After Dead Space, this is another game where ultra textures give huge drops every now and then.

Nvidia really should've put more VRAM in the 3080. I don't want to buy another GPU within 2 years, and I don't think the current ones are worth the price anyway. Oh well... I'll reduce texture quality for the next 2 years until the next generation of GPUs comes out.
the texture quality setting in this game does not govern actual texture quality
it just reduces the VRAM allocation pool, so it streams lower quality textures at long distances. in most cases you won't see a difference at close to mid range between high and ultra textures

their texture streamer is smart. even on the low setting, unless you have a super low amount of VRAM, you won't see reduced texture quality around you or on your character
 

winjer

Gold Member
the texture quality setting in this game does not govern actual texture quality
it just reduces the VRAM allocation pool, so it streams lower quality textures at long distances. in most cases you won't see a difference at close to mid range between high and ultra textures

their texture streamer is smart. even on the low setting, unless you have a super low amount of VRAM, you won't see reduced texture quality around you or on your character

In UE4, the texture scalability settings only adjust:
r.MaxAnisotropy
r.Streaming.PoolSize
r.Streaming.MipBias - on the 2 lowest settings, this will load lower res mipmaps. So there will be a texture quality degradation at distance
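For context, this is roughly how UE4's stock BaseScalability.ini maps the lowest and highest texture groups to those cvars. The values shown follow engine defaults from memory; Hogwarts Legacy ships its own numbers, so treat these as illustrative:

```ini
[TextureQuality@0]
; Low: aggressive mip bias, no anisotropic filtering, small streaming pool
r.Streaming.MipBias=2.5
r.MaxAnisotropy=0
r.Streaming.PoolSize=200

[TextureQuality@3]
; Epic/Ultra: full-resolution mips, 8x anisotropy, large streaming pool (MB)
r.Streaming.MipBias=0
r.MaxAnisotropy=8
r.Streaming.PoolSize=1000
```

The key point matches the posts above: between the upper presets, only the pool size really changes, which affects streaming behavior rather than the textures' native resolution.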
 

KXVXII9X

Member
PC gaming, gotta love it.

I've been a personal computer gamer since the days of the Amiga, but this shit is reeeeeally getting ridiculous.

I bought the PS5 deluxe edition for my son and was planning to double dip on PC, but since the game's launch I've seen like dozens of 900-post reddit threads of people complaining about poor performance on really high end PCs, along with the typical "go here, do that, edit this, reboot in this and that mode, disable x, enable y" "fixes" being recommended here and there.

Meanwhile, my kid has been happily playing the perfectly fine PS5 version.

I have an i5 12600k, 16GB of super duper fancy gaming RAM, and an RTX 3080, but I have to worry about performance and sub-30 fps dips with RTX if I buy this for PC? Nah, I'm good man. Mission double dip aborted.
I share your frustrations!

I am mostly a console gamer but decided to get a gaming laptop, and I'm already kind of over it. I find myself enjoying my Switch a bit more. Less stress.

There are just so many poorly optimized games. It is insane to me that even RTX 3080 cards have trouble running cross-gen games... It is almost like it is done on purpose to get people to buy the more expensive 4000-series cards.

I regret buying Hogwarts Legacy due to major stuttering, FPS drops, and bugs. It isn't enjoyable, and I don't have the energy to do all of these little fixes. Lesson learned, I guess.
 

yamaci17

Member
In UE4, the texture scalability settings only adjust:
r.MaxAnisotropy
r.Streaming.PoolSize
r.Streaming.MipBias - on the 2 lowest settings, this will load lower res mipmaps. So there will be a texture quality degradation at distance
that's practically what I said?
 

Patrick S.

Banned
I share your frustrations!

I am mostly a console gamer but decided to get a gaming laptop, and I'm already kind of over it. I find myself enjoying my Switch a bit more. Less stress.

There are just so many poorly optimized games. It is insane to me that even RTX 3080 cards have trouble running cross-gen games... It is almost like it is done on purpose to get people to buy the more expensive 4000-series cards.

I regret buying Hogwarts Legacy due to major stuttering, FPS drops, and bugs. It isn't enjoyable, and I don't have the energy to do all of these little fixes. Lesson learned, I guess.
I don't really think there is an agreement (a conspiracy?) between game developers and Nvidia to not put too much effort into optimizing, so that people will need to go out and buy new GPUs and resort to brute-forcing their games to get acceptable performance.

Maybe the game is adequately optimized, but it's just too big or uses too much RTX stuff. I dunno.

BTW, I'm kinda in the same boat with the Switch; I've had it since launch and had barely touched it, but for the last few months I've been buying many multiplatform games that run well enough on it, and a few exclusive titles as well.

The Switch draws 17W while playing and charging, and with the electricity prices we have now, it just makes more sense for me to play on a 17W system instead of my 750W PC. I've been using my PS5 more, too, because 200W vs. 750W.
 

yamaci17

Member
I don't really think there is an agreement (a conspiracy?) between game developers and Nvidia to not put too much effort into optimizing, so that people will need to go out and buy new GPUs and resort to brute-forcing their games to get acceptable performance.

Maybe the game is adequately optimized, but it's just too big or uses too much RTX stuff. I dunno.

BTW, I'm kinda in the same boat with the Switch; I've had it since launch and had barely touched it, but for the last few months I've been buying many multiplatform games that run well enough on it, and a few exclusive titles as well.

The Switch draws 17W while playing and charging, and with the electricity prices we have now, it just makes more sense for me to play on a 17W system instead of my 750W PC. I've been using my PS5 more, too, because 200W vs. 750W.
there really is no conspiracy, just people who cannot come to terms with having bought low-VRAM devices.

the 3080 is 1.7-2.5x faster than a PS5, and up to 3x faster in ray tracing applications combined with DLSS, but when all is said and done it only has a 10 GB VRAM buffer, of which only around 9.2-9.3 GB can be used by games, since the Windows compositor and Steam will use around 500-700 MB on any given day.

in the best case, the PS5 allocates around 9-10 GB of VRAM for GPU-related operations, with settings that are sane, optimized, tuned and tweaked. said 3080 users are unable to understand/accept that pushing settings above those "console" settings now requires more VRAM. they were able to do so with cross-gen games because those games did not fully utilize the 10 GB of VRAM available on PS5 purely for rasterization.

this was not an issue until recently because most games were designed with the PS4 buffer in mind. most games used around 6-7 GB of VRAM even at 4K ultra settings, since baseline textures were tailored for the 4 GB GPU-bound VRAM buffer (out of 5.5 GB total) of the PS4. that helped the 3080 flex its power and enable ray tracing on top.

but hogwarts legacy is now a game that fills up 9-10 GB of VRAM on PS5 without ray tracing or higher graphical settings. there practically exists no free VRAM for ray tracing on top, unless you sacrifice texture settings.

the 3080 could be a thousand times faster than a PS5; it would not change the fact above.

do notice how the PS5 also has to reduce texture quality significantly in its ray tracing mode. the game and its primary textures are designed with a 10 GB VRAM buffer purely for rasterization in mind.

really, the 3080 simply should have had a 16 GB VRAM buffer so it could stretch its legs. it can't in this game. there's nothing to be done about it. the settings are there: you can reduce texture quality to medium and enable ray tracing like the PS5 does.

a 3080 user simply has to adhere to whatever VRAM-related limitations a PS5 adheres to. this is simply a capacity issue. people were warned, and this has been said before.

it is practically a design choice: its VRAM buffer only allowed running ray tracing with high quality textures when said textures were from the PS4 era.

if you want PS5-era textures WITHOUT ray tracing, you're bound by that VRAM buffer.

take cyberpunk for example. that game fills up the entirety of a 10 GB buffer at 4K with ray tracing, but it has last-gen textures.
now imagine cyberpunk with 1.5x larger textures.
how can you fit that into a 10 GB buffer? you cannot.

this is practically what hogwarts legacy is doing. it has higher quality textures than most last-gen games.

the base game already fills the entirety of that tiny 10 GB buffer, like it does on PS5.
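The headroom arithmetic in this post can be sketched in a few lines of Python. The 10 GB card capacity and the 500-700 MB Windows/Steam overhead are the figures quoted above, not measured values, so treat them as rough estimates:

```python
def usable_vram_mb(total_gb: float, overhead_mb: float = 700.0) -> float:
    """VRAM actually left for a game: card capacity minus what the
    Windows compositor, Steam, and overlays typically hold resident."""
    return total_gb * 1024 - overhead_mb

# A 10 GB 3080 with ~700 MB of desktop overhead:
print(usable_vram_mb(10))   # 9540.0 MB, i.e. roughly the 9.2-9.3 GB cited above
```

With the PS5 dedicating around 9-10 GB to GPU work per the post, the 3080's ~9.5 GB ceiling leaves essentially no headroom for above-console settings.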
 

SlimySnake

Flashless at the Golden Globes
I think I am starting to realize that my 3080's 10GB really isn't enough anymore for ultra textures. After Dead Space, this is another game where ultra textures give huge drops every now and then.

Nvidia really should've put more VRAM in the 3080. I don't want to buy another GPU within 2 years, and I don't think the current ones are worth the price anyway. Oh well... I'll reduce texture quality for the next 2 years until the next generation of GPUs comes out.
Yes, they should have had more VRAM, but that's not the issue here. If you look at VRAM usage it hovers around 7 GB. Even with RT it is only 8 GB. There is something else going on here.

My 3080 runs the game at ultra settings, 4K DLSS Quality, 60 fps locked, with just 40% GPU utilization in Hogwarts and 60% in Hogsmeade. I can probably get 120 fps inside Hogwarts. Just turn off RT and enjoy the game at ultra settings, which is what the game recommends for 3080 users.
 
I think I am starting to realize that my 3080's 10GB really isn't enough anymore for ultra textures. After Dead Space, this is another game where ultra textures give huge drops every now and then.

Nvidia really should've put more VRAM in the 3080. I don't want to buy another GPU within 2 years, and I don't think the current ones are worth the price anyway. Oh well... I'll reduce texture quality for the next 2 years until the next generation of GPUs comes out.
This is exactly why I didn't want to get a 3080 (not that I could have gotten one if I wanted to). 10GB is not enough. Maybe back in 2020/2021 it was passable. I remember people saying stuff like "10GB is more than enough even at 4K!" or "By the time you need 10GB you'll have to upgrade your card anyway!".

I was almost tempted by the 3080 12GB model, but even that felt like too little. I held out for something with at least 16GB. I could've gotten a 3090/3090 Ti with 24GB, but that seemed overkill. Now that I have a 4080 with 16GB, I am already worried. It's definitely a huge upgrade over my 8GB card, but seeing games use 12-14GB is concerning. I almost wish I had gone for the 4090 lol. I'm not even playing at 4K, but at 1440p 144Hz. Hogwarts does seem to love RAM/VRAM, and it's got me worried lol.

It's the same for RAM. People said 16GB was enough, but I went with 32GB instead. Now that I'm seeing games use 20-24GB, it has me thinking about going for 64GB of RAM when I upgrade my CPU/motherboard.
 

Kenpachii

Member
there really is no conspiracy, just people who cannot come to terms with having bought low-VRAM devices.

the 3080 is 1.7-2.5x faster than a PS5, and up to 3x faster in ray tracing applications combined with DLSS, but when all is said and done it only has a 10 GB VRAM buffer, of which only around 9.2-9.3 GB can be used by games, since the Windows compositor and Steam will use around 500-700 MB on any given day.

in the best case, the PS5 allocates around 9-10 GB of VRAM for GPU-related operations, with settings that are sane, optimized, tuned and tweaked. said 3080 users are unable to understand/accept that pushing settings above those "console" settings now requires more VRAM. they were able to do so with cross-gen games because those games did not fully utilize the 10 GB of VRAM available on PS5 purely for rasterization.

this was not an issue until recently because most games were designed with the PS4 buffer in mind. most games used around 6-7 GB of VRAM even at 4K ultra settings, since baseline textures were tailored for the 4 GB GPU-bound VRAM buffer (out of 5.5 GB total) of the PS4. that helped the 3080 flex its power and enable ray tracing on top.

but hogwarts legacy is now a game that fills up 9-10 GB of VRAM on PS5 without ray tracing or higher graphical settings. there practically exists no free VRAM for ray tracing on top, unless you sacrifice texture settings.

the 3080 could be a thousand times faster than a PS5; it would not change the fact above.

do notice how the PS5 also has to reduce texture quality significantly in its ray tracing mode. the game and its primary textures are designed with a 10 GB VRAM buffer purely for rasterization in mind.

really, the 3080 simply should have had a 16 GB VRAM buffer so it could stretch its legs. it can't in this game. there's nothing to be done about it. the settings are there: you can reduce texture quality to medium and enable ray tracing like the PS5 does.

a 3080 user simply has to adhere to whatever VRAM-related limitations a PS5 adheres to. this is simply a capacity issue. people were warned, and this has been said before.

it is practically a design choice: its VRAM buffer only allowed running ray tracing with high quality textures when said textures were from the PS4 era.

if you want PS5-era textures WITHOUT ray tracing, you're bound by that VRAM buffer.

take cyberpunk for example. that game fills up the entirety of a 10 GB buffer at 4K with ray tracing, but it has last-gen textures.
now imagine cyberpunk with 1.5x larger textures.
how can you fit that into a 10 GB buffer? you cannot.

this is practically what hogwarts legacy is doing. it has higher quality textures than most last-gen games.

the base game already fills the entirety of that tiny 10 GB buffer, like it does on PS5.

None of these games have VRAM issues running at 4K ultra settings + RT with DLSS on a 3080.

I know people with more VRAM like to validate their purchase, but the reality is most games will be designed around a 10 GB buffer limit at 4K because of consoles. It's pretty much a replica of the 970 situation on that front.

Now, will there eventually be games that struggle with VRAM on a 3080? Sure, but by that time a 3080 will be ancient and GPU performance will matter a lot more, or those games will be specifically sponsored and designed around newer Nvidia cards to push sales.

At the end of the day, devs decide how much VRAM gets used. If the majority of people on PC can't max out a game on their expensive card, you just risk refunds for absolutely no reason.
 

yamaci17

Member
None of these games have VRAM issues running at 4K ultra settings + RT with DLSS on a 3080.

I know people with more VRAM like to validate their purchase, but the reality is most games will be designed around a 10 GB buffer limit at 4K because of consoles. It's pretty much a replica of the 970 situation on that front.

Now, will there eventually be games that struggle with VRAM on a 3080? Sure, but by that time a 3080 will be ancient and GPU performance will matter a lot more, or those games will be specifically sponsored and designed around newer Nvidia cards to push sales.
i'm not validating anything. i have an 8 GB 3070 myself. do I sound like a 12 GB 3060 user who gets happy about the 3080 getting deranged at 4K ray tracing in hogwarts legacy? no, I'm just aware that what's happening happens because it is supposed to happen.

also, I disagree: the 3080 will never be "ancient" compared to a PS5. it is rather powerful compared to a PS5. if it had 16 GB of VRAM, its useful life relative to its capabilities would be longer.

the second your GPU is capable but your VRAM is limiting, it means the card was designed lopsidedly.

I can get 1440p DLSS Quality + ray tracing + 60 fps on my 3070 with low textures (new preset). if this card had 12 gigs, I could play with higher textures at the EXACT same settings. as it is, to get higher textures I must instead let ray tracing go (I have no trouble doing this; I'm just pointing out that the GPU is capable of certain things, but the VRAM buffer doesn't allow them). playing with ray tracing becomes pointless when you only give textures a 1.2 GB budget, which causes low quality textures to pop in and out of existence in front of you

also, as you said, they will be designed around a 10 GB buffer FOR consoles. 3080 users are expecting HIGHER THAN CONSOLE settings in most cases. that is the problem.
 

Utherellus

Member
Well, Empress finally released the Denuvo-free version of the game, so it will be interesting to observe the difference between versions.

 

yamaci17

Member
Do these guys have an official discussion board for support and news?

The PC game still needs some fixin...
there are two types of problems and stutters that are user bound

1) the type where you run out of VRAM. if you have 8 GB, they fixed it. before the patch, the low texture setting had a 3000 MB texture budget and high had 4100 MB. now low is 1200 MB, medium is 1800 MB and high is 3000 MB. you can practically fix VRAM-related issues by choosing medium or high textures

2) the type where you run out of RAM. this is unsolvable. you gotta get 32 gigs or bust

those who have 32 gigs but keep having issues are most likely running into VRAM-bound issues, which they can fix by reducing texture quality and effects quality

other than these 2 factors, the game should be smooth sailing

asset loading stutters are completely unrelated and need fixing from the dev side
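Using the post-patch texture budgets quoted in this post (low = 1200 MB, medium = 1800 MB, high = 3000 MB; these are the poster's figures, not official documentation), picking the largest preset that fits your leftover VRAM is a one-liner:

```python
# Post-patch texture streaming budgets quoted above, in MB.
TEXTURE_BUDGETS = {"low": 1200, "medium": 1800, "high": 3000}

def best_texture_preset(free_vram_mb):
    """Return the largest preset whose streaming budget fits the VRAM
    left over after render targets, geometry, and RT structures."""
    fitting = [(mb, name) for name, mb in TEXTURE_BUDGETS.items()
               if mb <= free_vram_mb]
    return max(fitting)[1] if fitting else None

print(best_texture_preset(2000))  # medium
print(best_texture_preset(900))   # None -- even the 1200 MB low pool won't fit
```

This mirrors the advice in the post: if stutters persist on 32 GB systems, step the texture preset down until its budget fits what your card actually has free.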
 

Amey

Member
Well, Empress finally released the Denuvo-free version of the game, so it will be interesting to observe the difference between versions.

 

GymWolf

Member
there are two types of problems and stutters that are user bound

1) the type where you run out of VRAM. if you have 8 GB, they fixed it. before the patch, the low texture setting had a 3000 MB texture budget and high had 4100 MB. now low is 1200 MB, medium is 1800 MB and high is 3000 MB. you can practically fix VRAM-related issues by choosing medium or high textures

2) the type where you run out of RAM. this is unsolvable. you gotta get 32 gigs or bust

those who have 32 gigs but keep having issues are most likely running into VRAM-bound issues, which they can fix by reducing texture quality and effects quality

other than these 2 factors, the game should be smooth sailing

asset loading stutters are completely unrelated and need fixing from the dev side
The game was not stutter-free even with a 4080, 32 GB of RAM and RTX off, unfortunately.

Atomic Heart is where the real stutter-free experience is at.
 

yamaci17

Member
The game was not stutter-free even with a 4080, 32 GB of RAM and RTX off, unfortunately.

Atomic Heart is where the real stutter-free experience is at.
yes, those stutters are asset loading stutters, but they're nothing compared to what you get when you run out of VRAM/RAM

I believe those stutters could be averted by forcing the game to cache more data into RAM and VRAM preemptively. I know someone did such a thing with Kena and removed all asset streaming stutters

DirectStorage, or whatever else can remedy this, needs to be implemented
 

DareDaniel

Banned
Playing on an MSI Vector GP66 (RTX 3070 Ti and 32GB RAM). I'm almost 20 hours in and the experience hasn't been... ideal. I avoid exploring Hogwarts and Hogsmeade as much as possible simply because the bad performance breaks all the immersion, which is a shame because that's what I wanted most in this game as a long-time HP fan. I tried the fixes, but they did nothing for me; I guess other people were just having much worse performance than me. I've been a console gamer all my life, and while I'm loving the piracy and being able to play exclusive games from PS, Xbox and Switch (Breath of the Wild <3), this game makes me want to buy a PS5 just to avoid more surprises like this (the FFVII Remake PC version is another example of a very bad port). I guess I'll just continue waiting for games to get cracked and "finished" on PC before playing them, with no exceptions lol. Also, that rant was fucking awesome.
 

sertopico

Member
there are two types of problems and stutters that are user bound

1) the type where you run out of VRAM. if you have 8 GB, they fixed it. before the patch, the low texture setting had a 3000 MB texture budget and high had 4100 MB. now low is 1200 MB, medium is 1800 MB and high is 3000 MB. you can practically fix VRAM-related issues by choosing medium or high textures

2) the type where you run out of RAM. this is unsolvable. you gotta get 32 gigs or bust

those who have 32 gigs but keep having issues are most likely running into VRAM-bound issues, which they can fix by reducing texture quality and effects quality

other than these 2 factors, the game should be smooth sailing

asset loading stutters are completely unrelated and need fixing from the dev side
Thank you, but I meet all the requirements (10700K, 32GB@4GHz, 4090). The CPU is old, I know, and it causes a bottleneck. I am still getting some traversal stuttering, and the game drops lots of frames when moving into new areas.

In particular, I was referring to the missing exclusive fullscreen option and the fact that some options are not saved properly, like the sharpening. DLAA+FG have to be enabled every time you start the game, which is tedious. I don't want to use DLSS at 1440p; it's unnecessary. Besides, the game has a tendency to crash to desktop from time to time after a couple of hours. There are also some lighting glitches in parts of the castle, where light and shadows coming through the windows just disappear when moving the camera.
 

yamaci17

Member
Thank you, but I meet all the requirements (10700K, 32GB@4GHz, 4090). The CPU is old, I know, and it causes a bottleneck. I am still getting some traversal stuttering, and the game drops lots of frames when moving into new areas.

In particular, I was referring to the missing exclusive fullscreen option and the fact that some options are not saved properly, like the sharpening. DLAA+FG have to be enabled every time you start the game, which is tedious. I don't want to use DLSS at 1440p; it's unnecessary. Besides, the game has a tendency to crash to desktop from time to time after a couple of hours. There are also some lighting glitches in parts of the castle, where light and shadows coming through the windows just disappear when moving the camera.
I see. I specifically said I acknowledge traversal stutters; it's clear the bottleneck there is the I/O system on PC. you can brute force it by having the game load more data than it needs for a given region/room/hall, but then that would have to be a graphical setting. traversal stutters have always been a problem with UE games, but more so with this game because it boasts really high quality textures and interior elements. you can zoom them 300% and they still look great. the level of detail is actually rather crazy.

one thing that could solve your problem is having the game cache more data into your large VRAM buffer, but most likely they won't do that. so I'd say try searching for the commands that cause more data to be loaded into VRAM upfront. I have never meddled with them because I do need streaming with my puny 8 GB VRAM buffer lol

you can enforce DLAA with any DLSS preset with this tool;


but I agree it should be fixed asap

as for crashing, you may find it funny, but on my low-end Ryzen 2700 with 16 GB RAM, across my 80 hours of gameplay the game has not crashed once. maybe I was lucky.

as for the light/shadow/fog bugs, they definitely are a thing and need fixing, but I have to say they did not detract from the overall experience in my case, as they seem to be temporary artifacts. overall I'd say the game looked fine and as intended 95% of the time. I saw some weird fog artifacts, particularly when coming down certain stairs, especially from the Ravenclaw tower.

and finally, exclusive fullscreen is a meme in 2023. DX12 uses a complicated but truly artful pseudo-fullscreen where you still get the benefits of exclusive fullscreen with the freedom of alt-tabbing. I never have problems with the new fullscreen model DX12 games use; I get full VRR functionality, and input lag is just as good or even better with Reflex etc.
 

sertopico

Member
I see. I specifically said I acknowledge traversal stutters; it's clear the bottleneck there is the I/O system on PC. you can brute force it by having the game load more data than it needs for a given region/room/hall, but then that would have to be a graphical setting, and the game already commits around 26 GB of RAM. if it needs 26 GB of RAM and still has traversal stutters, it theoretically means you would need upwards of 40 GB of RAM to completely get rid of them. traversal stutters have always been a problem with UE games, but more so with this game because it boasts really high quality textures and interior elements. you can zoom them 300% and they still look great. the level of detail is actually rather crazy.

one thing that could solve your problem is having the game cache more data into your large VRAM buffer, but most likely they won't do that. so I'd say try searching for the commands that cause more data to be loaded into VRAM upfront. I have never meddled with them because I do need streaming with my puny 8 GB VRAM buffer lol

you can enforce DLAA with any DLSS preset with this tool;


but I agree it should be fixed asap

as for crashing, you may find it funny, but on my low-end Ryzen 2700 with 16 GB RAM, across my 80 hours of gameplay the game has not crashed once. maybe I was lucky.

as for the light/shadow/fog bugs, they definitely are a thing and need fixing, but I have to say they did not detract from the overall experience in my case, as they seem to be temporary artifacts. overall I'd say the game looked fine and as intended 95% of the time. I saw some weird fog artifacts, particularly when coming down certain stairs, especially from the Ravenclaw tower.

and finally, exclusive fullscreen is a meme in 2023. DX12 uses a complicated but truly artful pseudo-fullscreen where you still get the benefits of exclusive fullscreen with the freedom of alt-tabbing. I never have problems with the new fullscreen model DX12 games use; I get full VRR functionality, and input lag is just as good or even better with Reflex etc.
Yes, I get it.
These stutters are unfortunately "physiological" (inherent to the engine, I mean) and I agree with you: the extremely high-res textures combined with high-poly meshes are the perfect recipe for catastrophe. There are a couple of commands to increase/decrease the memory pool, but I've read around that they became ineffective after the latest patch.

Thanks for the link for DLAA; I was aware of it and am gonna test it on Hogwarts as well. :)

The crashing thing is weird; this is the first game that throws "silent" WHEA errors in my event log (internal parity error on a CPU core). My OC has always been stable over the years, though, and has been thoroughly tested. I even thought I was getting lag spikes because of these errors... I've also read it could be the video driver causing this. I am also using an 850 W PSU on a GPU that requires 1000 W minimum. Let's see what happens after I get the new PSU; 4090s have pretty high power spikes.
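For anyone who wants to check whether they're getting the same silent errors, this is roughly how you pull WHEA entries out of the Windows event log with PowerShell. The provider name is the standard Windows WHEA logger; the `-MaxEvents` count is just an arbitrary choice, and obviously this only runs on Windows:

```powershell
# List recent WHEA hardware-error events from the System log
Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Microsoft-Windows-WHEA-Logger'
} -MaxEvents 20 |
    Format-Table TimeCreated, Id, LevelDisplayName, Message -AutoSize
```

If the log is clean, Get-WinEvent just reports that no events match the filter, so an error from this command doesn't necessarily mean anything is wrong.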
 
Last edited:

yamaci17

Member
Yes, I get it.
These stutters are unfortunately "physiological" (inherent to the engine, I mean) and I agree with you: the extremely high-res textures combined with high-poly meshes are the perfect recipe for catastrophe. There are a couple of commands to increase/decrease the memory pool, but I've read around that they became ineffective after the latest patch.

Thanks for the link for DLAA; I was aware of it and am gonna test it on Hogwarts as well. :)

The crashing thing is weird; this is the first game that throws "silent" WHEA errors in my event log (internal parity error on a CPU core). My OC has always been stable over the years, though, and has been thoroughly tested. I even thought I was getting lag spikes because of these errors... I've also read it could be the video driver causing this. I am also using an 850 W PSU on a GPU that requires 1000 W minimum. Let's see what happens after I get the new PSU; 4090s have pretty high power spikes.
I never said they are physiological or that you made them up, dunno why you said that (maybe it's a joke and it flew over my head, sorry)
 

sertopico

Member
So I don't get it, she likes JK Rowling yet she cracks the game.
She has few ideas, but very confused ones.
I never said they are physiological or that you made them up, dunno why you said that (maybe it's a joke and it flew over my head, sorry)
I was referring to your sentence "Traversal stutters have always been a problem with UE games", meaning it's like part of this engine and hard to get rid of. :D
 

JackSparr0w

Banned
Not a big fan of this woke culture either, but the rest she wrote was really beyond the pale. She might have cracked Denuvo, but she has the tunnel vision of a 12th-century peasant.
He is an EXTREMELY gifted individual with a cult following and all the mental issues that come with that. Huge ego, narcissism etc.

He is 100% larping the woman part so he could also be larping some of his mental issues. A few years ago he was far more chill.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Playing on an MSI Vector GP66 (RTX 3070 Ti and 32GB RAM). I'm almost 20 hours in and the experience hasn't been... ideal. I avoid exploring Hogwarts and Hogsmeade as much as possible simply because the bad performance breaks all the immersion, which is a shame because that's what I wanted the most in this game as a long-time HP fan. I tried the fixes but they did nothing for me; I guess other people were just having much worse performance than me. I've been a console gamer all my life, and while I'm loving the piracy and being able to play exclusive games from PS, Xbox and Switch (Breath of the Wild<3), this game makes me want to buy a PS5 just to avoid more surprises like this (the FFVII Remake PC version is another example of a very bad port). I guess I'll just continue waiting for games to get cracked and "finished" on PC before playing them, with no exceptions lol. Also, that rant was fucking awesome.
Turn off RT. Stick to DLSS Quality at 4K with settings at Ultra or High.
 