Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.
To the 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.
And this is a pretty bad experience: much worse than the console versions running at 30 FPS with RTGI and AO, for context.
Ftfy.
Like someone buying a PS5 and celebrating playing Demon's Souls.
> The 5 people here saying they're having a great experience on 4090s

Ahem. Make that 6, as I also have a 12900K / 4090 combo.
> Like someone buying a PS5 and celebrating playing Demon's Souls.

If a PS5 cost like $3000-4000, sure.
> Looking at Eurogamer's article, it looks like CDPR went with Nvidia's RTX tools once again, as if they didn't learn what a clusterfuck that turned into with Cyberpunk.
> You know Nvidia's RTX (anti-)optimization tools have gone too far when even their 2000€ "consumer" GPU fails to provide decent framerates on a 7-year-old game that got the raytracing treatment.
> Unless they're actually trying to warm people up to the idea of buying the RTX 5090 for a modest 4000€, a year from now.
> Considering how the Matrix demo looks on the measly PS5 / Series X at 4K30, just imagine what a 5x more powerful RTX 4090 should be able to do.
> It's not "something that looks like Witcher 3 RTX", that much we should all agree on.

I think that Matrix demo ran at 1440p on console.
Feels good enough
White Orchard is only 36-37 FPS.
> I think that Matrix demo ran at 1440p on console.

IIRC it's 1080p reconstructed through TSR to 4K, but I'm assuming temporal reconstruction technologies are here for everyone to use (literally everyone, after AMD made FSR2 open source), so an RTX 4090 could and should use one in any situation.
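Side note on those numbers: the gap between a 1080p internal render and 4K output is easy to underestimate. A quick toy calculation (plain arithmetic, not tied to TSR, FSR2, or DLSS specifically):

```python
# Toy arithmetic: how much of a native 4K frame a 1080p internal render
# actually shades before temporal reconstruction fills in the rest.
internal = (1920, 1080)   # assumed internal resolution ("1080p")
output = (3840, 2160)     # 4K output target

per_axis_scale = output[0] / internal[0]
pixel_ratio = (output[0] * output[1]) / (internal[0] * internal[1])

print(f"per-axis upscale: {per_axis_scale:.1f}x")             # 2.0x
print(f"shaded pixels: {1 / pixel_ratio:.0%} of native 4K")   # 25%
```

So "reconstructed to 4K" here means the GPU shades roughly a quarter of the output pixels each frame, which is why temporal reconstruction is such a large performance lever.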
> DX12 strikes again.
> What a stinking pile of hot garbage it's been for (PC) games, in terms of performance, if not features.

This has nothing to do with DX12, but rather with the low-performance DX11->DX12 "wrapper" that the devs decided to use in order to run Nvidia's RTX SDK, instead of updating the game engine.
> CPU multi-thread utilization is terrible.

This is the real killer here, but something that's very hard to retroactively change.
> If a PS5 cost like $3000-4000, sure.

Imagine celebrating poverty.
I am having a far better experience on my 3080 12GB than on any console, with their slideshow framerates, massive input lag, and no RT shadows, RT reflections, or mods...
> To the 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.
> And this is a pretty bad experience: much worse than the console versions running at 30 FPS with RTGI and AO, for context. Plus add shader stutter on top.

There is no shader stutter. It's not a native DX12 game with real-time compilation, and they already patched the stutters.
> Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.

I have a 2080 Ti; I settled for the 30 FPS experience, with a max of 60 FPS in certain areas at 1440p with DLSS on Performance. All settings maxed though, because I am a sucker for graphics.
Gotta say, a shame with this update as I was really looking forward to this.
> There is no shader stutter. It's not a native DX12 game with real-time compilation, and they already patched the stutters.

Ah, that's right, shader stutter was DirectX 12-specific, I think.
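For readers who haven't followed the shader-stutter saga: the hitch comes from compiling a shader/pipeline the first time it's needed mid-game instead of during a loading screen. A deliberately simplified toy model (made-up costs, not CDPR's actual pipeline):

```python
# Toy model of shader-compilation stutter: compiling a shader the first
# time it is needed adds a one-off spike to that frame's time.
BASE_FRAME_MS = 16.7   # assumed steady per-frame cost (~60 FPS)
COMPILE_MS = 120.0     # assumed one-off cost to compile a new shader

def frame_times(shaders_needed, cache):
    """Return per-frame times; each uncached shader stalls its frame once."""
    times = []
    for shader in shaders_needed:
        cost = BASE_FRAME_MS
        if shader not in cache:
            cost += COMPILE_MS   # hitch: compile on first use
            cache.add(shader)
        times.append(cost)
    return times

# On-demand compilation: the first sighting of each shader produces a hitch.
on_demand = frame_times(["rock", "tree", "rock", "water"], cache=set())
# Precompiled at load: the cache is already warm, so no hitches.
precompiled = frame_times(["rock", "tree", "rock", "water"],
                          cache={"rock", "tree", "water"})

print(on_demand)     # [136.7, 136.7, 16.7, 136.7]
print(precompiled)   # [16.7, 16.7, 16.7, 16.7]
```

This is why a patch that precompiles (or caches) shaders can eliminate the stutters without touching average framerate at all.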
> Congratulations. But the picture shows that a more moderate 3080 setup is struggling to reach 30 FPS in stress areas in DF's testing, which is closer to the experience most other people are likely having.

With max settings, something consoles can only dream about, and with far better input lag still. I don't know what their problem is, but I never go below 40 FPS in Novigrad at 4K DLSS Quality after the stutter fixes.
It's like living in the car instead, because apparently that's all you need, with most games being GPU-bound. If CPU requirements rise, I'm gonna upgrade. Until then, that shiny car is all I need.
> Imagine celebrating poverty.

lol. And a 4090 is for running gazillions of games, not only a few exclusives like a PS5.
> Imagine celebrating poverty.
> And a 4090 is for running gazillions of games, not only a few exclusives like a PS5.

Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console, with minimal visual differences that you wouldn't discern had you not watched Digital Foundry videos with 400% zoomed stills.
That's okay, we all make unreasonable choices in our lives. Just don't expect us to celebrate yours. Try Nvidia's subreddit for that.

Don't you own a 6900 XT?
> Don't you own a 6900 XT?

Yes, and it was an unreasonable expenditure of 450€ (after selling my Vega 64 for 550€) in the middle of 2021, and only after I tried for about three months to get a 6800 XT at MSRP through AMD.com's weekly stock drops.
At some point I could either lose the opportunity to sell my Vega 64 for more than what I paid for it (because back then the ETH PoS transition was planned for late 2021), or just spend 450€ to get the 6900 XT.
Note: in my country I couldn't get Nvidia GPUs at MSRP.

> Yes, and it was an unreasonable expenditure of 450€ (after selling my Vega 64 for 550€) in the middle of 2021 ...

So, you paid 1000 euros for a card that plays the same games as a $500 console, with even fewer visual differences?
Doesn't sound like you have a lot of ground to be lecturing anyone on their spending.
I also like that the 4090 is 2000 euros because, apparently, selling your old GPU is a non-factor when it comes to the 4090, but you make sure to deduct the price you sold your GPU for from what you paid for the 6900 XT.
It almost sounds like you're being disingenuous.
> Be my guest and link to my posts bragging about buying a 6900 XT while shitting on all console gamers because they're poor.
> I admitted just now it was an unreasonable expenditure myself.
> Paying 1000€ for a 6900 XT 18 months ago wasn't a good catch, and I should have persisted in trying to get a 6800 XT at MSRP. I did try to secure a high resale value on my Vega 64 at the time, but I now know it wasn't a sensible option.
> The narrative you're trying to push just isn't here.

You're the one who claimed that NVIDIA tricked the other poster into buying a 2000-euro GPU to play console games with barely any visible differences. Considering that the 4090 absolutely dunks on the 6900 XT that you spent 1000 euros on, who are you mocking exactly? If the 4090 runs games with visuals indiscernible from a PS5, the 6900 XT might as well be running PS3 graphics.

> If the 4090 runs games with visuals indiscernible from a PS5, the 6900 XT might as well be running PS3 graphics.

Speaking of being disingenuous...
> Speaking of being disingenuous...

You're the one owning a 6900 XT, which is an even worse purchase than the 4090, even considering the time of their releases. And you're right, you should have gone for the 6800 XT. The 6900 XT is a piece-of-crap top-of-the-line card that can't even run ray tracing decently.

> You're the one owning a 6900 XT, which is an even worse purchase than the 4090 ...

Aaaww, little bunny got offended that I didn't buy a GPU from his Nvidia overlord?

> Aaaww, little bunny got offended that I didn't buy a GPU from his Nvidia overlord?

The 6800 XT is from NVIDIA now? News to me.
> Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console, with minimal visual differences that you wouldn't discern had you not watched Digital Foundry videos with 400% zoomed stills.
> That's okay, we all make unreasonable choices in our lives. Just don't expect us to celebrate yours. Try Nvidia's subreddit for that.

I don't own a 2000€ GPU. OverHeat does. And "the same games"? How are Star Citizen, WoW Dragonflight, and Pokémon Scarlet on some $500 console, with perfect IQ, mods, tons of controller options, the lowest input lag, and high framerates? And how are the input lag, performance, IQ, and mods of The Witcher 3? https://www.neogaf.com/threads/digi...formance-modes-tested.1648321/#post-267196989
> To the 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.
> And this is a pretty bad experience: much worse than the console versions running at 30 FPS with RTGI and AO, for context.

This is dishonesty AT ITS PEAK.
1) The scene is HEAVILY CPU-LIMITED.
2) It is MAX settings (Ultra+, with increased draw distance, NPC count, and LODs over consoles).
3) The Ryzen 3600 is pretty much equivalent to the consoles; it even lacks two cores, and they're the same architecture. You can't realistically expect more performance out of the same architecture on a different platform.
The exact same scene runs at around 23-25 FPS on consoles as well. That is clearly what the Zen 2 architecture can output in this horribly optimized code. It has nothing to do with the RTX 3080, which is severely underutilized due to the heavy CPU bottleneck.

> Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.

No, it wouldn't, because it is a CPU limitation. The user dishonestly picked the most problematic spot in the game, Novigrad, which destroys all CPUs, including the 12900K.
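The CPU-bottleneck argument reduces to one line of arithmetic: a frame can't finish faster than its slowest stage. A toy model with made-up frame costs (the 42 ms CPU figure is illustrative, not a measurement):

```python
# Toy model of a CPU-bound frame: the frame rate is set by the slowest
# stage, so a faster GPU cannot lift FPS while the CPU is the bottleneck.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 42.0        # assumed per-frame CPU cost in a dense area (e.g. Novigrad)
SLOW_GPU_MS = 30.0   # assumed GPU cost on a mid-range card
FAST_GPU_MS = 12.0   # assumed GPU cost after a big GPU upgrade

print(round(fps(CPU_MS, SLOW_GPU_MS), 1))  # 23.8 -> ~24 FPS, CPU-bound
print(round(fps(CPU_MS, FAST_GPU_MS), 1))  # 23.8 -> same FPS; the GPU just idles more
```

This matches the observation that a Ryzen 3600 and the Zen 2 consoles land in the same low-20s range in this spot regardless of the GPU attached.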
> And this is a pretty bad experience: much worse than the console versions running at 30 FPS with RTGI and AO, for context.

It's not "much worse". They're both equally bad.
The only way to enjoy ray tracing in this game at acceptable performance levels is with an RTX 40-series card and frame generation. Otherwise, use DX11 on PC or select Performance Mode on consoles.
RT mode is absolutely brutal on all platforms.