> Bottom picture looks sharper to me without my glasses on

Yeah bottom one is sharper but you'd be hard pressed to notice.
> When people talk about "hardware VRS" they are mainly talking about API/drivers.

Really? So, when Microsoft says this (https://news.xbox.com/en-us/2020/06/10/everything-you-need-to-know-about-the-future-of-xbox/):

> "...the custom designed processor at the heart of Xbox Series X includes brand new innovative capabilities such as Variable Rate Shading (VRS)"

your take is that they are talking about API capabilities?
Now it's the turn of Variable Rate Compute Shaders (VRCS) on the Series consoles. New talk:
Variable Rate Compute Shaders on Xbox Series X|S
> Really? So, when Microsoft says this (https://news.xbox.com/en-us/2020/06/10/everything-you-need-to-know-about-the-future-of-xbox/): your take is that they are talking about API capabilities?

Yes, you technically could do VRS on an Xbox One too, but you would get very little out of it at such a low res, and probably more overhead. Compared to the Xbox One there are improved capabilities on the Xbox Series consoles, and aiming for higher res makes supporting it more viable. There are even some better capabilities compared to the PS5, but I was referring more to the "Tier 2 VRS" that people use interchangeably with "hardware VRS". "Hardware VRS" there is just the API compatibility. The PS5 can do the screen-space VRS people are referring to as Tier 2, described in the link and the OP video.
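Since "Tier 2 in software" keeps coming up: here is a rough Python sketch of what a compute-shader, screen-space variable-rate scheme amounts to. Purely illustrative, not any engine's actual implementation; `shade` and `is_coarse` are hypothetical stand-ins for a real pixel shader and a per-tile shading-rate mask.

```python
def shade_variable_rate(width, height, shade, is_coarse):
    """Software VRS sketch: run the expensive shade(x, y) once per 2x2 block
    in regions flagged coarse and replicate the result; shade every pixel
    in full-rate regions. Returns the image and the number of shade() calls."""
    image = [[None] * width for _ in range(height)]
    calls = 0
    for by in range(0, height, 2):
        for bx in range(0, width, 2):
            if is_coarse(bx, by):
                value = shade(bx, by)  # one sample covers the whole 2x2 block
                calls += 1
                for y in range(by, min(by + 2, height)):
                    for x in range(bx, min(bx + 2, width)):
                        image[y][x] = value
            else:
                for y in range(by, min(by + 2, height)):
                    for x in range(bx, min(bx + 2, width)):
                        image[y][x] = shade(x, y)  # full rate
                        calls += 1
    return image, calls
```

With a mask that flags the whole screen coarse, a 4x4 target takes 4 shade calls instead of 16; real implementations pick the rate per tile from contrast, motion, and so on.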
> Yes, you technically could do VRS on an Xbox One too, but you would get very little out of it at such a low res, and probably more overhead. […] The PS5 can do the screen-space VRS people are referring to as Tier 2, described in the link and the OP video.

But then they wouldn't be "new innovative capabilities", would they?
> But then they wouldn't be "new innovative capabilities", would they?

Why wouldn't they be new innovative capabilities?
I'm sorry, but do you have any source for what you are saying? It seems like a big amount of nothing if what you say is true.
> I don't understand what you are referring to with pipelines. But in this discussion, I'm talking about the rendering pipeline.

Where is the game that will have this? Point to it.
> And in this generation we have some of the biggest changes ever. For one, the whole geometry engine is revamped, especially with Mesh Shaders and Amplification Shaders replacing the whole geometry part of the rendering pipeline.
> Then we have ray-tracing, which can replace or enhance several parts of the rendering pipeline, be it shadows, reflections or global illumination.

I already told you that these consoles don't have the bandwidth to implement the entire RT lighting pipeline at any reasonably clean resolution. Piecemeal RT isn't going to push visuals either (a la Spider-Man). You need the entire lighting pipeline to be RT.
> One of the advantages of Mesh Shaders, and even Primitive Shaders, is the ability to better cull unseen geometry at an early stage. This means fewer resources spent on rendering invisible polygons, but also less overdraw for pixel shading. So it's a net win all around.

This has nothing to do with shading triangles that ARE seen. Got bandwidth?
> Also consider that what takes most of the shader work is fragment shading, and this depends mostly on render resolution, not geometry.

BANDWIDTH!!
> DLSS 1.9 was running on shaders, using DP4A. So will XeSS.

What game are we talking about again? Or is this the imaginary game you see 5 years from now?
> Regardless, have you seen what TAAU can do? I have, and although it's not as good as DLSS, it is still something with great results.
> The PS5 versions of Spider-Man, with lots of ray-tracing effects.

RT reflections is hardly "lots". LOL!
> Ratchet and Clank, with great ray-tracing effects, fur rendering, SSD for level loading.

So is R&C now a giant leap up from last gen? I'll answer that: no. It also only has RT reflections. The fur rendering is typical of any game that has fur. SSD loading doesn't solve the rendering equation.
> Metro Exodus Enhanced, with real-time global illumination.

And at 1080p on consoles. And NO ONE praises that game's visuals, because it looks so ugly artistically.
> But do you know what a compiler is? And why ML is important for improving the performance and efficiency of compiled code?

I'll do one better. Do you know how to write a shader in a shading language? How about what the math looks like for the GGX PBR model that 90% of games today use to shade their materials? You should stop being a fanboy and actually implement the things you say you know. Perhaps then you'll see how difficult it is to live up to your expectations, and those of many others on this board. You guys are in for a rude awakening.
> OK. You are sounding more and more like a fanboy. I'm sorry. This will be the last time I reply.

Someone disagrees with you and that makes them a fanboy. Grow up.
> What game are we talking about again? Or is this the imaginary game you see 5 years from now?

DLSS 1.9: it was Control, before the patch for DLSS 2.0.
> Why wouldn't they be new innovative capabilities?

If they could be done in the PS4/One generation via software, it's obvious there's nothing new about them. Thus, according to you, they are lying, aren't they?
> I've sent you a link already, read part 3 of it. You can even compare it to the VRCS talk and see how it basically describes the exact same thing.

A source about the VRS capabilities of Xbox being only software-based, as you claim.
> If they could be done in the PS4/One generation via software, it's obvious there's nothing new about them. Thus, according to you, they are lying, aren't they?

It's not them lying to you, because it is a new capability. I mean, when RTX cards released they talked about ray tracing as a new capability on RTX cards, but they could still go back and add that capability to their old GTX cards via a driver update. The same with RTX Voice, or SAM on AMD cards. RTX on GTX wasn't good, but it was there as "hardware supported". Would adding VRS be any good for engines to use on an Xbox One? Probably not, because in the end you still have your decade-old bandwidth, memory and compute limits, but it's possible.
> But do you know what a compiler is? And why ML is important for improving the performance and efficiency of compiled code?
Yes, the XSX has lower precision abilities than the PS5 does, but the issue is whether they will be exploited or not. Sure, MS internal studios may do it, but I am 99.9% sure no third parties are going to use it on multiplat games. It's not just flicking a switch like VRS is (OK, not quite flicking a switch, but not far off it); you need supercomputers that do the training etc., and they aren't going to go to all that trouble for the XSX at the expense of the PS5. It's extra work for no gain and possible lack of parity, which only causes them issues.
It's great that MS put it in, and I hope it gets used, but developers are really slow at adopting new tech.
> Intel XeSS could mean that multiplats will absolutely use it. It is a hardware-agnostic upscaler, so why not use it when possible? Especially since many devs will already implement it into their PC versions soon.

Because I would assume you will then be introducing another separate deep learning system? From how I understand it, and I may well be wrong, Nvidia provide the supercomputers to do the DLSS training, and Intel would have to do the same for their new cards, which would then require MS to supply the supercomputers to do the training for the Series. Three different requirements, and possibly three different results as far as quality goes. Not sure a dev goes to that degree to give the Xbox an advantage over the PS5 for no payoff.
> I'm saying when people refer to screen space "Tier 2 VRS" they are talking about API support, or when they say software, that they have written their own.

It's great that you know what other people say. Since we are talking about personal feelings: I've never seen someone talk about "hardware support" without implying some kind of hardware. Does that sound like a valid counterpoint to you?
> Yes, the XSX has lower precision abilities than the PS5 does

Do we know this for sure, or is it a case of "if Sony does not shout about it, they must not support it"? The latter does not really pair well with how they deal with HW specs in their last few console designs (e.g. look at the latency improvements in their controllers, which they said nothing about in any big presentation or PR, and yet… they are there).
Again, I could well be misunderstanding how this ML will work in practice.
> It's great that you know what other people say. Since we are talking about personal feelings: I've never seen someone talk about "hardware support" without implying some kind of hardware. Does that sound like a valid counterpoint to you?

Counterpoint to what? Where do personal feelings come from?
> You quoted me and I quoted you, so I'm not talking about you but to you here. You didn't use that exact circuitry phrase, others did, which you liked, and you continued on with the exact same stuff:
> And that's where you're trying to make incorrect claims and steer things toward console wars. This thread isn't about Xbox Series X games; it's about game optimisation results, and it's about an optimisation that can be done on all hardware and in multiplatform games.
> "Xbox exclusive circuitry" is a tongue in cheek reference to your idea that "Tier 2 VRS", screen space VRS in this case, can't be done on other hardware. In reality even a PS4 using compute shaders can do it, believe it or not.
> This boost from Nanite and VRS isn't a feature that PCs and PS5 are missing, and the percentage gain would not be different. I even gave you an example with a GDC talk already out using a similar pipeline. You think this is about Xbox Series secret sauce though, and you try your hardest to make tech like VRS and SFS about wars all the time. I remember you arguing about PRT+ vs SFS x2 gains back in the day, but I can't be bothered to dig up that exact post. I just remember you went around making claims like this all the time.
> Things didn't turn out how you expected when the actual results of games with VRS came in, though, and you still argue that a gap will materialise. One thing you're right about is that the conversation has run its course.
Forget trying to prove a point to Riky.
He completely ignores everything about the PS5's hardware.
He thinks you can only achieve hardware VRS with RB+ ROPs, which Nvidia doesn't have. So I guess Nvidia has been using software VRS all this time?
He also ignores that the PS5 Geometry Engine is customized to do foveated rendering.
He also ignores that the gains from hardware VRS over software VRS on a 2D screen (not VR) are small.
> there was like a compatibility sheet for XeSS where the PS5 was absent, but no idea where that sheet was coming from tbh

I remember seeing that sheet too, and from what I recall it seemed fairly conclusive. Assuming it was accurate, out of all the features I've heard people crowing about, I think ML is probably the one that could make the biggest visual difference in the future.
Talking about me again, my opinion obviously matters.
Like I said, take it up with AMD, id and Digital Foundry. Their statements say it all. This thread isn't about PS5 hardware; I suggest you make one if you care so much.
> Yes, the XSX has lower precision abilities than the PS5 does, but the issue is if it will be exploited or not. […] It's extra work, for no gain and possible lack of parity, which only causes them issues.

You do know the PS5 is confirmed to be RDNA 2, with RDNA 2 Compute Units, since Road to PS5, right?
> Talking about me again, my opinion obviously matters.
> Like I said, take it up with AMD, id and Digital Foundry. Their statements say it all. This thread isn't about PS5 hardware; I suggest you make one if you care so much.
It isn't about PS5, yet you're talking about it on the previous page?
When are you going to stop being a hypocrite and liar?
> I'm not the one who brought it up, try reading the thread to enlighten yourself, you're just trolling an Xbox specific thread now.
I did, thinking something new about VRS had been brought into the thread after seeing it had 7 pages.
But it just turned out to be console warring and fanboy wet dreams (mainly by you) about nothing new.
> "When we recently interviewed David Cage, CEO and founder of Quantic Dream, he highlighted the Xbox Series X's shader cores as more suitable for machine learning tasks, which could allow the console to perform a DLSS-like performance-enhancing image reconstruction technique."
Xbox Series X's Advantage Could Lie in Its Machine Learning-Powered Shader Cores, Says Quantic Dream
Jason Ronald has talked extensively about doing far more with this tech.
> Most recently I've spearheaded work in using Neural Rendering to enhance traditional rendering methods, focusing on using implicit neural representations and how to make them run efficiently. This includes creating custom high performance inference using compute shaders. The target for this work is PlayStation 5. Development done in Python, Unity, C#, compute shaders, with offline training using PyTorch.
> You've only appeared to attack people and troll, pretty sad. Added nothing to the discussion as usual.

I like how you label me as attacking and trolling when you always ban bait.
It's quite obvious why he made that statement.
Let's enter math class.
The formulas for calculating INT8 and INT4 TOPS are as follows:
INT8: no. of CUs * 512 INT8 ops per CU per clock * clock speed (MHz) / 10^6 = TOPS
INT4: no. of CUs * 1024 INT4 ops per CU per clock * clock speed (MHz) / 10^6 = TOPS
XBSX
INT8: 52 * 512 * 1825 / 10^6 = 48.59 TOPS
INT4: 52 * 1024 * 1825 / 10^6 = 97.18 TOPS
PS5
INT8: 36 * 512 * 2233 / 10^6 = 41.16 TOPS
INT4: 36 * 1024 * 2233 / 10^6 = 82.32 TOPS
Based on that, the XBSX is more suitable for ML.
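The arithmetic above is easy to sanity-check; a quick Python version of the same formula (dividing by 10^6 converts CUs x ops/clock x MHz, i.e. mega-ops per second, into TOPS). The CU counts and clocks are the ones quoted in this thread:

```python
def tops(cus, ops_per_cu_per_clock, clock_mhz):
    # CUs * ops per CU per clock * clock in MHz = mega-ops per second;
    # divide by 1e6 to express the result in tera-ops per second (TOPS).
    return cus * ops_per_cu_per_clock * clock_mhz / 1e6

xsx_int8 = tops(52, 512, 1825)    # Xbox Series X, INT8
xsx_int4 = tops(52, 1024, 1825)   # Xbox Series X, INT4
ps5_int8 = tops(36, 512, 2233)    # PS5, INT8 (assuming the same per-CU rates)
ps5_int4 = tops(36, 1024, 2233)   # PS5, INT4
print(round(xsx_int8, 2), round(xsx_int4, 2), round(ps5_int8, 2), round(ps5_int4, 2))
```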
But I like how you ignore confirmed information from official places like Sony.
And let's not forget this patent.
Where one of the inventors associated with the patent has this on their LinkedIn.
> Nah, your original post is just an attack on me, sad. You're still posting irrelevant theories about PS5 now, and the thread is from MS first-party developers about Series consoles. It's a common tactic dribbling fanboys use: they don't like anything positive about Xbox, so they attack people who are happy about the news in the thread and spam irrelevant charts to derail it.
> There is a stark contrast between this thread and the Cerny RT patent thread; everyone can see that.

And how about you?
> Counterpoint to what? Where do personal feelings come from?

You never proved there's no hardware in Xbox to do VRS. It's your feeling that there's none.
I've seen plenty of people talk about Tier 2 VRS capability as being some kind of hardware, instead of the term for API compatibility. That's how my conversation with you started.
I said:

> "Xbox exclusive circuitry" is a tongue in cheek reference to your idea that "Tier 2 VRS", screen space VRS in this case, can't be done on other hardware. In reality even a PS4 using compute shaders can do it, believe it or not.

Then in reply to that you asked:

> Isn't it true that Xbox has hardware VRS that's not in PCs or the PS5? I thought that was confirmed by Microsoft.

Then I said that when people talk about "hardware VRS" here, they are mainly referring to using the driver/API, but you can write your own compute shader screen-space VRS, often even more effectively. Then you concentrated on an MS marketing article's use of the words:

> "brand new innovative capabilities such as Variable Rate Shading (VRS)"

Which I assumed was meant to suggest the PS4/Xbox One can't do VRS, because otherwise that blurb would be a lie. I said it's not necessarily a lie.
Where are the feelings?
If you look at The Coalition's latest video, you will notice that even they use compute shader VRS, because "hardware VRS" got no gains on their deferred rendering engine.
> And how about you?

Wouldn't waste your time with someone who isn't interested in any sort of discussion.
I'm still waiting to see these performance advantages.
Does Doom Eternal even require any VRS? That game is as smooth as butter on any hardware. I am very pessimistic about VRS, but I think they will use it, or something like it, in racing games and fast-paced games to achieve 120fps, and in VR games. Forza 8 will probably use it. For 60 or 30fps modes I would personally like to see more graphics effects. We have games like DL2 which run better on DX11; Elden Ring would probably also run better on DX11. There are many games that perform better on DX11. IMO devs first need to get this right, then they can think about VRS.
> You never proved there's no hardware in Xbox to do VRS. It's your feeling that there's none.

What does "no hardware in Xbox" even mean here? Where have I said "there is no hardware in Xbox", whatever that even means?
> ML is not just about granularity for FP and Int execution. It's about processing vector accumulations in a matrix.
> For that, support for DP4A or Tensor units is necessary. And this is the thing that the PS5 lacks.
> This is an example of the level of performance you get using DP4A on a regular GPU.
> The M4 is a Maxwell-based GPU, from the Tesla range of cards. The P4 is Pascal, with DP4A.
Why are you even talking about Nvidia?
And it would also apply to the Xbox Series Consoles and AMD RDNA GPUs, not just PS5.
You should take some time out of your day and read the AMD RDNA whitepaper.
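For anyone unsure what DP4A actually is: it is a single instruction that multiplies four packed signed 8-bit lanes from each of two 32-bit registers and adds the four products into a 32-bit accumulator. A small Python emulation of what the instruction computes (not how any GPU implements it):

```python
import struct

def unpack_s8x4(word):
    # Split a 32-bit word into four signed 8-bit lanes (little-endian).
    return struct.unpack("<4b", struct.pack("<I", word & 0xFFFFFFFF))

def dp4a(a, b, acc):
    # DP4A semantics: four int8 x int8 products, summed into a 32-bit accumulator.
    return acc + sum(x * y for x, y in zip(unpack_s8x4(a), unpack_s8x4(b)))
```

Counting each multiply and each add as one op, a DP4A retires 8 ops per lane per clock, which is where per-CU figures like the 512 INT8 ops per clock used earlier in the thread come from (64 lanes x 8 ops).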
> Because nVidia was the first one to implement DP4A in their GPUs, back in 2016.
> And they wrote papers about it, as well as benchmarks.
> AMD is NOW doing the same with RDNA2. At least on PC and Series S/X.
> ML is not just about granularity for FP and Int execution. It's about processing vector accumulations in a matrix. […]

It's worth pointing out that AMD did have "rapid packed math" for a while (PS4 Pro at least), so it wouldn't be the same as the M4's FP32 efficiency even on old AMD cards. Nvidia's Pascal offers more, though.
I like how you still refuse to believe the PS5 has that capability.
Even though it's confirmed to be RDNA 2.
Even RDNA 1 has this capability.
There is no official information that says the PS5 isn't RDNA 2, and yet here we are, over a year after release, and people still think the PS5 is RDNA 1.
I won't be replying to you anymore on this ML topic in this thread, as I don't wish to derail it any further. We can continue this in an ML-specific thread if you wish.
> Because a few devs have already stated or inferred that the PS5 is not as good as the Series S/X and PC at ML.
> Also, reports on SKUs for Radeon chips seem to confirm that the PS5 does not support DP4A.
> You keep insisting that the PS5 has the full RDNA2 feature set, despite having been proven wrong a few times.
> BTW, weren't you banned from the beyond3d forum?

I was never on beyond3d, and I was never banned here or warned for console warring/trolling.