
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Log4Girlz

Member
Don't remember an AMD rep saying that, though I do remember an AMD rep saying it was an E6760, which most of us, myself included, ate up

Yeah, it was touted everywhere for a while. Then we realized they wouldn't know what the fuck they were talking about. I think it was from a Japanese article.
 

JordanN

Banned
Man, the WUST stuff was a thread of nightmares.

Just more reasons I wish Nintendo would abolish their "not gonna give you specs" policy.
 
So would that make the consensus opinion that Latte has fixed-function shaders that replicate TEV functions in both Wii U and Wii modes? I've read both theories floating around: the first that it's fixed function, and the second that TEV is being emulated with modern shaders on the GPU, and both seem to have merit.

Fixed function would make things a lot easier for Nintendo's internal teams, but I can't imagine that 3rd party devs would be happy with a split programmable/fixed-function GPU. Although, since 3rd party devs don't actually seem to be happy, maybe that's exactly what it is.

Thanks for that. If that's what Nintendo actually said (that they designed the Wii U, then modified Wii U parts in order to achieve fluid backwards compatibility, as opposed to gutting/sacrificing for BC's sake), then I think the idea that the core Wii U design is held back because of BC is negated. They built the machine they wanted to build, and the console does not make sacrifices in order for people to play their Wii game collection.

I found the original quote.

http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/2

Shiota

Yes. The designers were already incredibly familiar with the Wii, so without getting hung up on the two machines' completely different structures, they came up with ideas we would never have thought of. There were times when you would usually just incorporate both the Wii U and Wii circuits, like 1+1. But instead of just adding like that, they adjusted the new parts added to Wii U so they could be used for Wii as well.

Like LWill said, these were things we considered as well. I think the former was something I considered and the latter came from Fourth Storm. I think we have both dropped those views at some point. Though what he has been postulating in regard to 160 ALUs sounds similar to something I thought about during my time away.

LWill brought up something I forgot to mention and that was in regard to this "8-bit CPU" in Latte. My take is that the Command Processor is handling that.

What's it have, like 5 of them? ;)

Oh the WUST days were fun times.

New documentation revealed that Latte's eDRAM is actually 32MB of embedded tessellation.

Yeah those were fun times indeed.

I always just figured that nobody cared about "maxing" the Wii. It was far behind anyway, and few FPSes, the most graphically demanding games, came out for it.

Anyways, on a totally separate topic: I used to often bookmark posts I thought were particularly egregious, or where the poster would eat crow later. A lot of them dealt with Wii U and what a monster it was going to be in the early days. The other day I was perusing a few of these and they were pretty funny. There was some pretty damning stuff by BG. I will probably post a few later as I don't have time to mess with it now.

Just remember there was a time when more or less a ton of people thought Wii U was going to be more powerful than Durango. Nobody came out and said it, but that's basically what they were getting at.

It's just kind of sad to look back at how Wii U undershot our worst expectations by a large amount. Even mine, and I was its #1 detractor. I really expected it to at least be better than PS360, easily. So far I haven't seen that.

I remember one guy's post specifically; it was in response to that thing about Crytek working with Wii U or whatever, and he said something like "Crytek? What kind of monster are you building here, Nintendo?" or something like that. That was the attitude back then. Wii U was a monster.

Oh, another thing was power. I think BG had it using 100 watts or something, in one of the bookmarked posts. The reality was 33. 33.

I think the rumored Durango specs were everything Wii U should have been from an engineering standpoint. The whole "good enough" thing.

BG 100 watts

http://www.neogaf.com/forum/showpost.php?p=34726131&postcount=3759

Here's BG predicting an E6760 and 600-800 GFLOPs

http://www.neogaf.com/forum/showpost.php?p=41597923&postcount=2367

Anyways, it's OT to this thread so I guess I won't belabor it. It's just interesting how attitudes have changed.

Good ole Rangers. The worst culprit of seeing what he wants to see even when what he links shows otherwise. I like how you dig up a post from over a year ago where I said the total console would be around ~100W, and then in turn focus on the gameplay wattage of 33W and not Nintendo's own number for the total wattage being 75W. But yeah, play that up to make it sound like I was dramatically off. Let's also ignore that I was even closer on the total wattage in a later post.

http://www.neogaf.com/forum/showpost.php?p=39090808&postcount=3437

And here you go with that E6760 mess again that got deleted from B3D because of how off you were. I like how you ignore that I clearly said "guess" before saying 600-800 GFLOPs, and that even earlier in this thread I said I chose to focus on a raw power angle. But apparently certain people like to act as if I stated things as fact. Let's conveniently ignore this post that explains the E6760 and how I said it was a comparison GPU and not the GPU Wii U is based on.

http://www.neogaf.com/forum/showpost.php?p=42408109&postcount=4998

Just like in that B3D thread, you've given proof of your own habit of twisting things to how you see them. I remember you twisting one of Brad Grenz's posts on B3D as well, and him not being pleased about it. I laughed to myself and said, "Your turn, Brad."

I look forward to these other posts you plan to link to.

BG had 'called' me out a few posts ago, so I took a look at his post history and the backpedaling was amazing.

Do a Google search of him and 'dev kit' and you'll find him going around on GameSpot and GameFAQs claiming he had all this access to the dev kit and surprises, but he's been called out enough, and he's too busy "gettin' Money" playing for the Money team with an inside basketball court AND an outside basketball court to reply.

http://www.ign.com/boards/threads/wiiu-devkit-pics-wiiu-early-target-specs.452522923/
http://www.gamefaqs.com/boards/631516-wii-u/61019975?page=2
http://www.neogaf.com/forum/showthread.php?p=41938848&highlight=#post41938848
http://www.neogaf.com/forum/showpost.php?p=38549553&postcount=1
http://www.neogaf.com/forum/showthread.php?p=42351669&highlight=#post42351669

ROFL! In all of those links, show me where I said I have access to the dev kit. Did you even read the GameFAQs posts? I was telling them that the CPU was not going to be a POWER7. The "surprise" was going to be showing the pictures of the dev kit that were sent to me, which I clearly say, but I asked for permission beforehand and they weren't comfortable with it. I like how getting credit for providing the link means I have dev kit access. I don't even understand what you were trying to prove with these links.

I like that you guys link this stuff. It helps to prove my point that some people see what they want to see even though it's not there. Thanks for doing the work for me! And dang right I'm working on "gettin' Money" playing for the Money team with an inside basketball court AND an outside basketball court to reply. If you want to start PayPaling me some money, I'll consider sticking around. What's the point of making fun of my life situation?

Oh and while we're at it, let's do the same for PS4.

http://www.neogaf.com/forum/showpost.php?p=38657020&postcount=1

I don't even understand the motivation behind it. But I guess it goes back to some people taking this more seriously than I do.
 
Remember when some AMD rep said it was capable of 1 teraflop?

Yeah and then we got someone to translate it and found out what was wrong with that.

It was also fueled by hype posts from "insiders" like BG and Ideaman. When Arkam and lherre (probably misspelled the names, heh) said not to get too excited about the specs, they were pounced on in the WUST threads (lherre not so much, but he wasn't as blunt about it).

I said multiple times that I wasn't an insider. And I actually defended Arkam.

Just more reasons I wish Nintendo would abolish their "not gonna give you specs" policy.

This I agree with. People are going to talk about the specs regardless of them being available or not. Might as well share them.
 


krizzx

Junior Member
Yeah, it's going to be difficult to search for. I'll try, but here are some posts that I think refer to that prior rumor.

This one is posting news discounting the power of the Wii U

http://www.neogaf.com/forum/showpost.php?p=36586328&postcount=13566

And here is a response from one of our favorite semi-insiders in the WUST thread

http://www.neogaf.com/forum/showpost.php?p=36586657&postcount=13588

I was completely unaware that these even existed. Is there a part 6?

Though, back on topic: shouldn't Latte be counted as having 35MB of eDRAM, since it also has a 1MB and a 2MB chip?
 
Well, regardless of the dev being wrong about the capability of the machine, it's rather telling that he says "not as many shaders" as PS360. Despite ignorance of the capability of those shaders, there's not much reason to assume he doesn't know the number. This bolsters the idea of 160 beefed-up, much more modern shaders.
 

krizzx

Junior Member
Well, regardless of the dev being wrong about the capability of the machine, it's rather telling that he says "not as many shaders" as PS360. Despite ignorance of the capability of those shaders, there's not much reason to assume he doesn't know the number. This bolsters the idea of 160 beefed-up, much more modern shaders.

This has been covered many times already. It would require magic the likes of which we have never seen in a GPU before for it to be true.
 
Yes, I find it utterly shocking that people would change their opinions based on new information.

That's not allowed. You have to be held to your previous opinion no matter what.

I was completely unaware that these even existed. Is there a part 6?

Though, back on topic: shouldn't Latte be counted as having 35MB of eDRAM, since it also has a 1MB and a 2MB chip?

Well, 1MB is SRAM. That said, it's probably due to devs having "full control" over the 32MB portion.
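
Just to lay the tally out (which pools count toward the commonly quoted figure is my guess, not anything official), a quick sketch in Python:

# Rough tally of Latte's on-die memory pools as discussed above.
# Which pools count toward the quoted figure is speculation on my part.
main_edram_mb = 32   # the pool devs reportedly get "full control" over
small_edram_mb = 2   # the smaller eDRAM block
sram_mb = 1          # the SRAM block

print("Total on die:", main_edram_mb + small_edram_mb + sram_mb, "MB")  # 35 MB
print("Commonly quoted:", main_edram_mb, "MB")                          # 32 MB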
 
Well, regardless of the dev being wrong about the capability of the machine, it's rather telling that he says "not as many shaders" as PS360. Despite ignorance of the capability of those shaders, there's not much reason to assume he doesn't know the number. This bolsters the idea of 160 beefed-up, much more modern shaders.

But should we really give weight to their statement when the source is dubious at best? Not that I don't trust GamesIndustry.biz, but the "anonymous dev" thing. Then again, it is GamesIndustry.biz...

The writer of that piece's info can be obtained here - Steve Peterson

Might be OT, but looking back there were several devs who straight-up discounted the CPU, and they can be directly quoted as much. Would we chalk that up to lack of information as well?

ROFL! In all of those links, show me where I said I have access to the dev kit. Did you even read the GameFAQs posts? I was telling them that the CPU was not going to be a POWER7. The "surprise" was going to be showing the pictures of the dev kit that were sent to me, which I clearly say, but I asked for permission beforehand and they weren't comfortable with it. I like how getting credit for providing the link means I have dev kit access. I don't even understand what you were trying to prove with these links.

I like that you guys link this stuff. It helps to prove my point that some people see what they want to see even though it's not there. Thanks for doing the work for me! And dang right I'm working on "gettin' Money" playing for the Money team with an inside basketball court AND an outside basketball court to reply. If you want to start PayPaling me some money, I'll consider sticking around. What's the point of making fun of my life situation?

Oh and while we're at it, let's do the same for PS4.

http://www.neogaf.com/forum/showpost.php?p=38657020&postcount=1

I don't even understand the motivation behind it. But I guess it goes back to some people taking this more seriously than I do.

Pretty heavily alluded to some big insider info the entire time, and dampened what you were saying after each successive 'revelation' about the console, month after month. But who cares.

Which goes back to my original point, why are people downing Fourth Storm now and bigging you up as a more credible source?
And boy did he get a lot of shit for it.

Matt got a lot of heat too when he said it appeared to be 3x Broadway cores. And so did Marcan for saying the same thing.

When the Dynasty Warriors dev said it was a bit weaker than the 360's CPU, it was shrugged off as a lack of knowledge, and when the CoD/Battlefield/FPS dev commented about the CPU limiting the number of players on a map, that was shrugged off as a lack of knowledge too.
 
But should we really give weight to their statement when the source is dubious at best? Not that I don't trust GamesIndustry.biz, but the "anonymous dev" thing. Then again, it is GamesIndustry.biz...

The writer of that piece's info can be obtained here - Steve Peterson

Might be OT, but looking back there were several devs who straight-up discounted the CPU, and they can be directly quoted as much. Would we chalk that up to lack of information as well?

Also Arkam was the first person to indirectly let us know there was something wrong on the CPU end.
 

prag16

Banned
Might have something to do with the way he presented his information.

I doubt that was all of it (Nintendo fan butthurt was involved), but yeah I seem to remember some level of arrogant prick-ness which couldn't have helped the situation.
 

The_Lump

Banned
Never know why BG gets loads of flak. He was one of the reasonable ones, providing grounded speculation and analysis of the few facts we had. He never claimed to be an insider at all and never claimed his posts as fact, AFAIK.

As usual the same people seem to be reading what they want to read instead of what has been written.
 

OryoN

Member
Why are people bringing up all these old discussions? The majority of it was in a speculation thread for a reason. This happens every generation. Information that is passed around gets distorted, hopes are latched onto, and even rumors that started out with some truth can morph into something totally different. In the midst of it all, something here or there is confirmed and the cycle starts all over again in an attempt to put everything in perspective. In the end, you have a plethora of opinions, bold claims, insider comments, rumors, and leaks to sieve through.

What exactly is new or surprising about this? Wasn't PS4 supposed to use 3D stacking and whatnot? That didn't turn out accurate, did it? But this stuff is normal. Even very trusted insiders/devs (leave Ideaman out of this) that were involved in many of these threads understand how these situations play out, and aren't really fazed, whether people believe them or not. Many of them simply leave a comment and keep it moving. Very few of them feel the need to return later saying, "I told you so," because it is what it is. Normal!

But whatever... carry on.
 

The_Lump

Banned
Why are people bringing up all these old discussions? The majority of it was in a speculation thread for a reason. This happens every generation. Information that is passed around gets distorted, hopes are latched onto, and even rumors that started out with some truth can morph into something totally different. In the midst of it all, something here or there is confirmed and the cycle starts all over again in an attempt to put everything in perspective. In the end, you have a plethora of opinions, bold claims, insider comments, rumors, and leaks to sieve through.

What exactly is new or surprising about this? Wasn't PS4 supposed to use 3D stacking and whatnot? That didn't turn out accurate, did it? But this stuff is normal. Even very trusted insiders/devs (leave Ideaman out of this) that were involved in many of these threads understand how these situations play out, and aren't really fazed, whether people believe them or not. Many of them simply leave a comment and keep it moving. Very few of them feel the need to return later saying, "I told you so," because it is what it is. Normal!

But whatever... carry on.


Well said, man.
 

IdeaMan

My source is my ass!
I guess it's one way for some people to celebrate the one-year anniversary of the pre-E3 2012 hype: doing a bit of nonconstructive bitching about the past and reinventing the history of those threads.

Why are people bringing up all these old discussions?

Don't know how to interpret the "leave IdeaMan out of this," but if it's about the accuracy of the info given, there are three assessments available in public threads where you can see that 90% of the long streak of leaks I provided were accurate. Maybe it's for the "I told you so" attitude? Well, I only posted messages of that tone AFTER, in some cases, several pages of doubters (for the 1GB of RAM for the OS, for example), so it's rather fair, don't you think?
 
Welcome back, IdeaMan. Great info. Amazing how a full core wasn't used, yet ports like Mass Effect and Trine looked really good, if not better than on the other consoles. Or am I reading this wrong?
 

IdeaMan

My source is my ass!
Welcome back, IdeaMan. Great info. Amazing how a full core wasn't used, yet ports like Mass Effect and Trine looked really good, if not better than on the other consoles. Or am I reading this wrong?

It wasn't widespread. It concerned a few games. Don't know if Trine or Mass Effect are involved.

And I wasn't away :) I just have less time to post. You can read some of my messages in the PC threads or others (like the Kotaku one on the Durango delay).
 

krizzx

Junior Member
Why are people bringing up all these old discussions? The majority of it was in a speculation thread for a reason. This happens every generation. Information that is passed around gets distorted, hopes are latched onto, and even rumors that started out with some truth can morph into something totally different. In the midst of it all, something here or there is confirmed and the cycle starts all over again in an attempt to put everything in perspective. In the end, you have a plethora of opinions, bold claims, insider comments, rumors, and leaks to sieve through.

What exactly is new or surprising about this? Wasn't PS4 supposed to use 3D stacking and whatnot? That didn't turn out accurate, did it? But this stuff is normal. Even very trusted insiders/devs (leave Ideaman out of this) that were involved in many of these threads understand how these situations play out, and aren't really fazed, whether people believe them or not. Many of them simply leave a comment and keep it moving. Very few of them feel the need to return later saying, "I told you so," because it is what it is. Normal!

But whatever... carry on.

It was also supposed to have better backwards compatibility, be capable of 1080p/60FPS in all games, output at 4K, etc...
 

stanley1993

Neo Member

BG, I'm wondering: how does this discovery of logic (Wii to Wii U) change some ideas about Wii U's own TEV device in the GPU? Can it not have special sauce and logic at the same time? Also, how large would an 8-bit CPU be on a 40nm chip?
 
Good post, OryoN. All I wanted to do was have Azak present my view on the GPU to generate some discussion. It should never have become about me and old speculation. Also, that 3D stacking was speculation as well.

Pretty heavily alluded to some big insider info the entire time, and dampened what you were saying after each successive 'revelation' about the console, month after month. But who cares.

Which goes back to my original point, why are people downing Fourth Storm now and bigging you up as a more credible source?

Heavily alluded and big insider info? LOL. Some of what you linked to came from Lherre's posts here. You're the one that provided links that not only didn't support your argument but contradicted it. And now all of a sudden it's "who cares." Why even derail this thread in the first place?

And to your question I gave you reasons why in the other post with the things I did claim as fact. I can give links if you want.

But I hope that means things can get back to the GPU now and talk of what we are seeing in the die. I'm pretty good at compartmentalizing discussions, so I would like to think we can truly move on from that.

Might have something to do with the way he presented his information.

I doubt that was all of it (Nintendo fan butthurt was involved), but yeah I seem to remember some level of arrogant prick-ness which couldn't have helped the situation.

Nah it was definitely how he presented info.

http://www.neogaf.com/forum/showpost.php?p=36321204&postcount=8685

In the beginning he came out firing away and wouldn't really respond to anybody.

Never know why BG gets loads of flak. He was one of the reasonable ones, providing grounded speculation and analysis of the few facts we had. He never claimed to be an insider at all and never claimed his posts as fact, AFAIK.

As usual the same people seem to be reading what they want to read instead of what has been written.

Them posting the links on their own helped.
 
Hey IM! Been a minute. Interesting info too.

BG, I'm wondering: how does this discovery of logic (Wii to Wii U) change some ideas about Wii U's own TEV device in the GPU? Can it not have special sauce and logic at the same time? Also, how large would an 8-bit CPU be on a 40nm chip?

You're going to make me say it aren't you? Haha. Any "special sauce" is most likely going to come down to what type of architecture they chose and what extensions they have in their API. The die shot itself is rather conventional, though my view (let me say my view again if this turns out not to be the case) of it suggests we're looking at a dual graphics engine. In other words 2 rasterizers, 2 hierarchical Zs, 2 tessellators, 2 geometry assemblers, and 2 vertex assemblers.
 

OryoN

Member
Don't know how to interpret the "leave IdeaMan out of this," but if it's about the accuracy of the info given, there are three assessments available in public threads where you can see that 90% of the long streak of leaks I provided were accurate. Maybe it's for the "I told you so" attitude? Well, I only posted messages of that tone AFTER, in some cases, several pages of doubters (for the 1GB of RAM for the OS, for example), so it's rather fair, don't you think?

Hey, no beef here, man. I'm aware of how everything went down and why you delivered some messages in the manner you did, particularly after some things were confirmed. I've got no problem with that. I recall thanking you for sharing whatever bit of info you could.

The reason I said "leave Ideaman out of this" is because I was certain that once I made the statement "Very few of them feel the need to return later saying, 'I told you so,' because it is what it is. Normal!", some would point to the rare exception to the case (yourself) in order to downplay the point I intended to make. A point which I did NOT intend to be a reference to you in any way. Hence, leaving you out. We clear? ;)
 

stanley1993

Neo Member
Hey IM! Been a minute. Interesting info too.



You're going to make me say it aren't you? Haha. Any "special sauce" is most likely going to come down to what type of architecture they chose and what extensions they have in their API. The die shot itself is rather conventional, though my view (let me say my view again if this turns out not to be the case) of it suggests we're looking at a dual graphics engine. In other words 2 rasterizers, 2 hierarchical Zs, 2 tessellators, 2 geometry assemblers, and 2 vertex assemblers.

OK. I don't know if this was already answered, but how would this setup differ from your average GPU? How would Nintendo benefit from a dual graphics engine, and what facts or rumors support this theory of yours? If this is actually correct, would it be possible that this dual graphics engine was made for multiple GamePad connectivity? I am going to look at the pic comparisons for each GPU you were comparing to get a better idea.
 

IdeaMan

My source is my ass!
Hey, no beef here, man. I'm aware of how everything went down and why you delivered some messages in the manner you did, particularly after some things were confirmed. I've got no problem with that. I recall thanking you for sharing whatever bit of info you could.

The reason I said "leave Ideaman out of this" is because I was certain that once I made the statement "Very few of them feel the need to return later saying, 'I told you so,' because it is what it is. Normal!", some would point to the rare exception to the case (yourself) in order to downplay the point I intended to make. A point which I did NOT intend to be a reference to you in any way. Hence, leaving you out. We clear? ;)

no problem :)

/sit and read interesting gpu chat
 
OK. I don't know if this was already answered, but how would this setup differ from your average GPU? How would Nintendo benefit from a dual graphics engine, and what facts or rumors support this theory of yours? If this is actually correct, would it be possible that this dual graphics engine was made for multiple GamePad connectivity? I am going to look at the pic comparisons for each GPU you were comparing to get a better idea.

As of now only Cayman and some of the Southern Islands GPUs utilize it. As I understand it, this was AMD's way of increasing triangle throughput to 2 triangles/clock. I believe nVidia's design outputs 4 triangles/clock from what I've read. The hypothesis is based on the duplicate blocks at the bottom of the die shot. There are five duplicated blocks in Latte, and modern GPUs have five components in their graphics engine. I already felt early on that the J blocks are the TMUs due to their location next to the SIMDs. That left me wondering why we're seeing so many duplicated blocks. When I decided to come back to trying to figure out what might be going on, I remembered the dual engine in Cayman. And then, since I was actually paying attention to it, I noticed there were ten components due to the duplication. So it's primarily based on those similarities.

I also bounced the idea off some others, with wsippel saying he had considered it as well and, I believe, mentioning a benefit to the controller. He can confirm that if he wants. Blu mentioned that AMD also did it to improve workload granularity, and something else I'm drawing a blank on. I think blu is going to give his take on my idea sooner or later when he gets the chance.
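
To put rough numbers on what a second front end would buy, here's a quick back-of-the-envelope sketch in Python. The 550MHz clock is just the commonly reported Latte clock and the 1-triangle-per-clock-per-engine rate is the Cayman-style assumption, so treat this as speculation rather than anything read off the die:

def peak_triangles_per_sec(clock_hz, engines, tris_per_clock_per_engine=1):
    # Theoretical peak setup rate: clock * engines * triangles per clock per engine.
    return clock_hz * engines * tris_per_clock_per_engine

LATTE_CLOCK_HZ = 550e6  # commonly reported Wii U GPU clock (assumption, not from the die shot)

for engines in (1, 2):
    rate = peak_triangles_per_sec(LATTE_CLOCK_HZ, engines)
    print(f"{engines} engine(s): {rate / 1e6:.0f} million triangles/s")
# 1 engine(s): 550 million triangles/s
# 2 engine(s): 1100 million triangles/s (i.e. 1.1 billion, the dual-engine case)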
 

krizzx

Junior Member
As of now only Cayman and some of the Southern Islands GPUs utilize it. As I understand it, this was AMD's way of increasing triangle throughput to 2 triangles/clock. I believe nVidia's design outputs 4 triangles/clock from what I've read. The hypothesis is based on the duplicate blocks at the bottom of the die shot. There are five duplicated blocks in Latte, and modern GPUs have five components in their graphics engine. I already felt early on that the J blocks are the TMUs due to their location next to the SIMDs. That left me wondering why we're seeing so many duplicated blocks. When I decided to come back to trying to figure out what might be going on, I remembered the dual engine in Cayman. And then, since I was actually paying attention to it, I noticed there were ten components due to the duplication. So it's primarily based on those similarities.

I also bounced the idea off some others, with wsippel saying he had considered it as well and, I believe, mentioning a benefit to the controller. He can confirm that if he wants. Blu mentioned that AMD also did it to improve workload granularity, and something else I'm drawing a blank on. I think blu is going to give his take on my idea sooner or later when he gets the chance.

I recall the method of doubling the polygons per clock being mentioned earlier in this thread, and I asked if this could be using it. It would be interesting if it did. It would certainly explain the Bayonetta 2 numbers.
 
I recall the method of doubling the polygons per clock being mentioned earlier in this thread, and I asked if this could be using it. It would be interesting if it did. It would certainly explain the Bayonetta 2 numbers.

I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so above even the numbers we have for PS4 games (KZ), that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. Just seems very very unlikely those are final in-game polycounts even if it were a dual setup engine config.
 
I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so above even the numbers we have for PS4 games (KZ), that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. Just seems very very unlikely those are final in-game polycounts even if it were a dual setup engine config.

Maybe all the Bayonetta levels will be the featureless gray boxes blu keeps advocating devs use in the Frostbite thread?
 
I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so above even the numbers we have for PS4 games (KZ), that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. Just seems very very unlikely those are final in-game polycounts even if it were a dual setup engine config.
That, or tessellation, seeing as that would allow them to scale detail back outside of close-ups, polygon-wise as well.

Or both.
 

krizzx

Junior Member
Maybe all the Bayonetta levels will be the featureless gray boxes blu keeps advocating devs use in the Frostbite thread?

To be fair, the environments in Frostbite-engine games do have a tendency to be filled with large amounts of nothing. At least from an interactivity standpoint.
 

z0m3le

Banned
I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so above even the numbers we have for PS4 games (KZ), that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. Just seems very very unlikely those are final in-game polycounts even if it were a dual setup engine config.

I keep hearing that Ninja Gaiden has models with 122k polygons. If that is the case, 192k would be likely if they are going after the same effect with all weapons and clothing, like the NG model. Though I haven't looked into the numbers myself, I just keep seeing them posted, including in this thread pages back.

Also, BG, that is interesting. I do think the GPU is probably custom. I'm not saying it has to be odd, and I don't think they have moved away from VLIW, because from a coding standpoint that would be very easy for devs to figure out and we would probably have heard something about it by now.

IdeaMan's info, which I heard about a couple of weeks ago (from him), probably explains some of the launch games' performance and why developers complained about the CPU. (While I don't think they would say it was particularly amazing, I do think it was probably enough to handle ports from the 360 without much problem, as long as SIMD tasks were passed along to the GPU when needed.) It also means that using early launch ports for comparisons is dishonest or pointless in many cases.
 
I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so above even the numbers we have for PS4 games (KZ), that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. Just seems very very unlikely those are final in-game polycounts even if it were a dual setup engine config.

What are these poly numbers for Wii U? And I'm assuming they're talking about the main character model?

KZ: SF is 40k polygons for NPC characters at the highest LOD level, but there are 60 of them...
 

z0m3le

Banned
What are these poly numbers for Wii U? And I'm assuming they're talking about the main character model?

KZ: SF is 40k polygons for NPC characters at the highest LOD level, but there are 60 of them...

Well, if 2 polygons per cycle is correct for Wii U:
Wii U = 1.1 billion polygons/s (550 million polygons/s if not)
PS4 = 1.6 billion polygons/s
360 = 500 million polygons/s, just for comparison.
 

krizzx

Junior Member
Well, if 2 polygons per cycle is correct for Wii U:
Wii U = 1.1 billion polygons/s (550 million polygons/s if not)
PS4 = 1.6 billion polygons/s
360 = 500 million polygons/s, just for comparison.

I still don't understand how these calculations work.

The PS2's clock was 147.456 MHz, but its max polygon capability was 50 million and the most ever achieved in an actual game was 10 million.

The GC's graphics clock was 162 MHz, but its peak polygon count was 110 million and the highest achieved was 20 million at 60 FPS.

The original Xbox's graphics clock was 233 MHz, but its peak polygon count was 120 million and the most ever achieved in a game was 12 million at 30 FPS.

I've never known polygon counts to scale with clock rate the way you're listing it. I always thought there was something else to it. This has confused me since earlier in the thread. Is this a more modern thing?
 

z0m3le

Banned
I still don't understand how these calculations work.

The PS2's clock was 147.456 MHz, but its max polygon capability was 50 million and the most ever achieved in an actual game was 10 million.

The GC's graphics clock was 162 MHz, but its peak polygon count was 110 million and the highest achieved was 20 million at 60 FPS.

The original Xbox's graphics clock was 233 MHz, but its peak polygon count was 120 million and the most ever achieved in a game was 12 million at 30 FPS.

I've never known polygon counts to scale with clock rate the way you're listing it. I always thought there was something else to it. This has confused me since earlier in the thread. Is this a more modern thing?
It is explained a couple of pages earlier, which is where I saw how to calculate the polygon numbers. It is very easy, actually: it is 1 polygon per clock per engine. This is for modern AMD GPUs; GCN and Cayman (HD 6900) handle 2 polygons per cycle.
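
Spelling that arithmetic out against the numbers a few posts up (the clocks are the commonly reported ones; the per-clock rates, especially Wii U's, are the speculative part), a rough sketch in Python:

# Peak triangle setup rate = GPU clock * triangles per clock.
# These are theoretical setup peaks, not what games actually push on screen.
gpus = {
    "Wii U (1 tri/clock)": (550e6, 1),
    "Wii U (2 tri/clock, if the dual-engine idea holds)": (550e6, 2),
    "PS4 (GCN, 2 tri/clock)": (800e6, 2),
    "Xbox 360 (Xenos, 1 tri/clock)": (500e6, 1),
}

for name, (clock_hz, tris_per_clock) in gpus.items():
    print(f"{name}: {clock_hz * tris_per_clock / 1e9:.2f} billion triangles/s")
# Wii U (1 tri/clock): 0.55, Wii U (2 tri/clock): 1.10, PS4: 1.60, Xbox 360: 0.50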
 