frankie_baby
Member
Remember when some AMD rep said it was capable of 1 teraflop?
Don't remember an AMD rep saying that, though I do remember an AMD rep saying it was an E6760, which most of us (myself included) ate up.
So would that make the consensus opinion that Latte has fixed function shaders that replicate TEV functions in both Wii U and Wii modes? I've read both theories floating around, the first that it's fixed function and the second that TEV is being emulated into modern shaders on the GPU, and both seem to have merit.
Fixed function would make things a lot easier for Nintendo's internal teams but I can't imagine that 3rd party devs would be happy with a split programmable/fixed function GPU. Although since 3rd party devs actually don't seem to be happy maybe that's exactly what it is.
Thanks for that. If that's what Nintendo actually said (that they designed the Wii U, then modified Wii U parts in order to achieve fluid backwards compatibility, as opposed to gutting/sacrificing for BC's sake), then I think the idea that the core Wii U design is held back because of BC is negated. They built the machine they wanted to build, and the console does not make sacrifices in order for people to play their Wii game collection.
Shiota
Yes. The designers were already incredibly familiar with the Wii, so without getting hung up on the two machines' completely different structures, they came up with ideas we would never have thought of. There were times when you would usually just incorporate both the Wii U and Wii circuits, like 1+1. But instead of just adding like that, they adjusted the new parts added to Wii U so they could be used for Wii as well.
What's it have, like 5 of them?
Oh the WUST days were fun times.
I always figured that nobody cared about "maxing" the Wii. It was far behind anyway, and few FPSes, the most graphically demanding games, came out for it.
Anyways, on a totally separate topic: I used to often bookmark posts I thought were particularly egregious, or where the poster would eat crow later. A lot of them dealt with Wii U and what a monster it was going to be in the early days. The other day I was perusing a few of these and they were pretty funny. There was some pretty damning stuff by BG. I will probably post a few later as I don't have time to mess with it now.
Just remember there was a time when more or less a ton of people thought Wii U was going to be more powerful than Durango. Nobody came out and said it, but that's basically what they were getting at.
It's just kind of sad to look back at how Wii U undershot our worst expectations by a large amount. Even mine, and I was its #1 detractor. I really expected it to at least be easily better than PS360. So far I haven't seen that.
I remember specifically one guy's post. It was in response to that thing about Crytek working with Wii U or whatever, and he said something like "Crytek? What kind of monster are you building here, Nintendo?" or something like that. That was the attitude back then. Wii U was a monster.
Oh, another thing was power. I think BG had it using 100 watts or something in one of the bookmarked posts. The reality was 33. 33.
I think the rumored Durango specs were everything Wii U should have been from an engineering standpoint. The whole "good enough" thing.
BG 100 watts
http://www.neogaf.com/forum/showpost.php?p=34726131&postcount=3759
Here's BG predicting an E6760 and 600-800 GFLOPS:
http://www.neogaf.com/forum/showpost.php?p=41597923&postcount=2367
Anyways, it's OT to this thread, so I guess I won't belabor it. It's just interesting how attitudes have changed.
BG had 'called' me out a few posts ago, so I took a look at his post history and the backpedaling was amazing.
Do a Google search of him and 'dev kit' and he was going around on GameSpot and GameFAQs claiming he had all this access to the dev kit and surprises, but he's been called out enough, and he's busy "gettin' Money" playing for the Money team with an inside basketball court AND an outside basketball court to reply.
http://www.ign.com/boards/threads/wiiu-devkit-pics-wiiu-early-target-specs.452522923/
http://www.gamefaqs.com/boards/631516-wii-u/61019975?page=2
http://www.neogaf.com/forum/showthread.php?p=41938848&highlight=#post41938848
http://www.neogaf.com/forum/showpost.php?p=38549553&postcount=1
http://www.neogaf.com/forum/showthread.php?p=42351669&highlight=#post42351669
Remember when some AMD rep said it was capable of 1 teraflop?
It was also fueled by hype posts by "insiders" like BG and Ideaman. When Arkam and lherre (probably misspelled the names, heh) said not to get too excited about the specs, they were pounced on in the WUST threads (lherre not so much, but he wasn't as blunt about it).
Just more reasons I wish Nintendo would abolish their "not gonna give you specs" policy.
Not at all, I only remember "4850 equivalent".
Man, the WUST stuff was a thread of nightmares.
Yeah, it's gonna be difficult to search for. I'll try, but here are some posts that I think refer to that prior rumor.
This one is posting news discounting the power of the Wii U
http://www.neogaf.com/forum/showpost.php?p=36586328&postcount=13566
And here is a response from one of our favorite semi-insiders in the WUST thread
http://www.neogaf.com/forum/showpost.php?p=36586657&postcount=13588
Well, regardless of the dev being wrong about the capability of the machine, it's rather telling that he says 'not as many shaders' as PS360. Despite ignorance of the capability of those shaders, there's not much reason to assume he doesn't know the number. This supports the idea of 160 beefed-up, much more modern shaders.
Yes, I find it utterly shocking that people would change their opinions based on new information.
I was completely unaware that these even existed. Is there a part 6?
Though, back on topic: shouldn't Latte be counted as having 35 MB of eDRAM, since it has a 1 MB and a 2 MB chip?
ROFL! In all of those links, show me where I said I have access to the dev kit. Did you even read the GameFAQs posts? I was telling them that the CPU was not going to be a POWER7. The "surprise" was going to be showing the pictures of the dev kit that were sent to me, which I clearly say, but I asked for permission beforehand and they weren't comfortable with it. I like how getting credit for providing the link means I have dev kit access. I don't even understand what you were trying to prove with these links.
I like that you guys link this stuff. It helps to prove my point that some people see what they want to see even though it's not there. Thanks for doing the work for me! And dang right I'm working on "gettin' Money" playing for the Money team with an inside basketball court AND an outside basketball court to reply. You want to start Paypaling me some money I'll consider sticking around. What's the point of making fun of my life situation?
Oh and while we're at it, let's do the same for PS4.
http://www.neogaf.com/forum/showpost.php?p=38657020&postcount=1
I don't even understand the motivation behind it. But I guess it goes back to some people taking this more seriously than I do.
And boy did he get a lot of shit for it.
But should we really give weight to their statement when the source is dubious at best? Not that I don't trust GamesIndustry.biz, but the 'anonymous dev' thing. Then again, it is GamesIndustry.biz...
The writer of that piece's info can be obtained here - Steve Peterson
Might be OT, but looking back there were several devs who straight up discounted the CPU, and can be directly quoted as much. Would we chalk that up to lack of information as well?
Also Arkam was the first person to indirectly let us know there was something wrong on the CPU end.
Might have something to do with the way he presented his information.
Why are people bringing up all these old discussions? The majority of it was under a speculation thread for a reason. This happens every generation. Information that is passed around gets distorted, hopes are latched onto, and even rumors that started out with some truth can morph into something totally different. In the midst of it all, something here or there is confirmed and the cycle starts all over again in an attempt to put everything in perspective. In the end, you have a plethora of opinions, bold claims, insider comments, rumors and leaks to sieve through.
What exactly is new or surprising about this? Wasn't PS4 supposed to use 3D stacking and whatnot? That didn't turn out accurate, did it? But this stuff is normal. Even very trusted insiders/devs (leave Ideaman out of this) that were involved in many of these threads understand how these situations play out, and aren't really fazed, whether people believe them or not. Many of them simply leave a comment and keep it moving. Very few of them feel the need to return later saying "I told you so," because it is what it is. Normal!
But whatever... carry on.
Welcome back, Ideaman. Great info. Amazing how a full core wasn't used, yet ports like Mass Effect and Trine looked really good, if not better than on the other consoles. Or am I reading this wrong?
I wonder if this was an issue for home consoles as well, and whether the spring update unlocked this core for games and the OS?
In my case, as I said clearly, it was an issue found and resolved before the launch of the Wii U in November.
Pretty heavily alluded to some big insider info the entire time, and dampened what you were saying after each successive 'revelation' about the console, month after month. But who cares.
Which goes back to my original point, why are people downing Fourth Storm now and bigging you up as a more credible source?
Might have something to do with the way he presented his information.
I doubt that was all of it (Nintendo fan butthurt was involved), but yeah I seem to remember some level of arrogant prick-ness which couldn't have helped the situation.
Never knew why BG gets loads of flak. He was one of the reasonable ones, providing grounded speculation and analysis of the few facts we had. He never claimed to be an insider at all, and never claimed his posts as fact AFAIK.
As usual the same people seem to be reading what they want to read instead of what has been written.
BG, I'm wondering: how does this discovery of logic (Wii to Wii U) change some ideas about Wii U's own TEV device in the GPU? Can it not have special sauce and logic at the same time? Also, how large would 8-bit be on a 40 nm chip?
In that case, I can't wait to see the Nintendo Direct in June. I wonder how the games will look.
Don't know how to interpret the "leave Ideaman" part, but if it's about the accuracy of the info given, there are three assessments available in public threads where you can see that 90% of the long streak of leaks I provided were accurate. Maybe it's for the "I told you so" attitude? Well, I only posted messages of this tone AFTER, in some cases, several pages of doubters (for the 1 GB of RAM for the OS, for example), so it's rather fair, don't you think?
Hey IM! Been a minute. Interesting info too.
You're going to make me say it aren't you? Haha. Any "special sauce" is most likely going to come down to what type of architecture they chose and what extensions they have in their API. The die shot itself is rather conventional, though my view (let me say my view again if this turns out not to be the case) of it suggests we're looking at a dual graphics engine. In other words 2 rasterizers, 2 hierarchical Zs, 2 tessellators, 2 geometry assemblers, and 2 vertex assemblers.
Hey, no beef here, man. I'm aware of how everything went down and why you delivered some messages in the manner you did, particularly after some things were confirmed. I've got no problem with that. I recall thanking you for sharing whatever bit of info you could.
The reason I said "leave Ideaman out of this" is because I was certain that once I made this statement: "Very few of them feel the need to return later saying 'I told you so,' because it is what it is. Normal!", some would point to the rare exception to the case (yourself) in order to downplay the point I intended to make. A point which I did NOT intend to be a reference to you in any way. Hence, leaving you out. We clear?
Ok. I don't know if this was already answered, but how would this setup differ from your average GPU? How would Nintendo benefit from a dual graphics engine, and what facts or rumors support this theory of yours? If this is actually correct, would it be possible that this dual graphics engine was made for multiple GamePad connectivity? I am going to look at the pic comparisons for each GPU you were comparing for a better idea.
As of now, only Cayman and some of the Southern Islands GPUs utilize it. As I understand it, this was AMD's way of increasing triangle throughput to 2 tri/clock. I believe nVidia's design outputs 4 tri/clock, from what I read. The hypothesis is based on the duplicate blocks at the bottom of the die shot. There are five duplicated blocks in Latte, and modern GPUs have five components in their graphics engine. I already felt early on that the J blocks are the TMUs, due to their location next to the SIMDs. That left me wondering why we're seeing so many duplicated blocks. When I came back to trying to figure out what might be going on, I remembered the dual engine in Cayman. And then, since I was actually paying attention to it, I noticed there were ten components due to the duplication. So it's primarily based on those similarities.
I also bounced the idea off of some others, with wsippel saying he had considered it as well, and I believe he mentioned a benefit to the controller. He can confirm that if he wants. Blu mentioned that AMD also did it to improve workload granularity, and something else I'm drawing a blank on. I think blu is going to give his take on my idea sooner or later when he gets the chance.
I recall the method of doubling the polygons per clock being mentioned earlier in this thread, and I asked if this could be using it. It would be interesting if it did. It would certainly explain the Bayonetta 2 numbers.
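To put some rough numbers on the dual-engine idea: here's a back-of-the-envelope sketch of setup-limited peak triangle throughput. The ~550 MHz clock and the 1 triangle/clock/engine rate are this thread's speculation, not confirmed specs.

```python
# Peak triangle throughput = clock rate x engines x triangles per clock.
# The 550 MHz clock and 1 tri/clock/engine figures are thread speculation.
CLOCK_HZ = 550_000_000  # rumored Latte GPU clock (~550 MHz)

def peak_triangles_per_second(clock_hz: int, engines: int, tris_per_clock: int = 1) -> int:
    """Theoretical setup-limited peak; ignores culling and real-world overhead."""
    return clock_hz * engines * tris_per_clock

single = peak_triangles_per_second(CLOCK_HZ, engines=1)  # 550 million tri/s
dual = peak_triangles_per_second(CLOCK_HZ, engines=2)    # 1.1 billion tri/s
print(single, dual)
```

So the dual-engine guess would exactly double the theoretical ceiling, which is why it keeps coming up in connection with the Bayonetta 2 numbers.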
I have more to add to the discussion, but I believe the simplest explanation for those Bayonetta 2 numbers is that they are being used to bake the normal maps. It's so far above even the numbers we have for PS4 games (KZ) that it must be ruled out. And Wii U and PS4 are getting their graphics IP from the same vendor, with Sony's chip clocked higher. It just seems very, very unlikely those are final in-game polycounts, even if it were a dual setup engine config.
That, or tessellation, seeing as that would allow detail to be scaled back outside of close-ups, polygon-wise as well.
Maybe all the Bayonetta levels will be the featureless gray boxes blu keeps advocating devs use in the Frostbite thread?
To be fair, the environments in Frostbite-engine games do have a tendency to be filled with large amounts of nothing. At least from an interactivity standpoint.
What are these poly numbers for Wii U? And I'm assuming they're talking about the main character model?
KZ: SF is 40k polygons for NPC characters at the highest LOD level, but there's 60 of them...
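Worked arithmetic for that KZ: SF figure, just to make the scale concrete. The 40k polys per NPC and the 60 NPC count are from the quote above; the 30 fps frame rate is my assumption for the per-second figure.

```python
# KZ: SF NPC polygon budget, per the figures quoted in the thread.
polys_per_npc = 40_000  # highest LOD level, per the quote
npc_count = 60
fps = 30                # assumed frame rate (my assumption, not from the quote)

per_frame = polys_per_npc * npc_count  # NPC polygons in a single frame
per_second = per_frame * fps           # NPC polygons pushed per second
print(per_frame, per_second)  # 2400000 72000000
```

That's 2.4 million polygons per frame for NPCs alone, well short of the billions-per-second theoretical peaks being thrown around, which is the usual gap between paper specs and shipped games.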
Well, if the 2 polygons per cycle figure is correct for Wii U:
Wii U = 1.1 billion polygons/s (if not, 550 million polygons/s)
PS4 = 1.6 billion polygons/s
360 = 500 million polygons/s, just for comparison.
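The figures above can be reproduced as clock × polygons per clock. Wii U's ~550 MHz clock and its 2/clock rate (the dual-engine guess) are speculation from this thread; the PS4 and 360 clocks are the commonly cited ones.

```python
# Theoretical peak polygon rate = GPU clock x polygons per clock.
# Wii U entries reflect the thread's dual-engine guess vs. a single engine.
consoles = {
    "Wii U (dual engine)": (550_000_000, 2),
    "Wii U (single)": (550_000_000, 1),
    "PS4": (800_000_000, 2),
    "Xbox 360": (500_000_000, 1),
}
for name, (clock_hz, polys_per_clock) in consoles.items():
    peak = clock_hz * polys_per_clock
    print(f"{name}: {peak / 1e9:.2f} billion polys/s")
```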
It is explained a couple pages earlier, which is where I saw how to calculate the polygon numbers. It is very easy, actually: 1 polygon per clock per engine. This is for modern AMD GPUs. GCN and Cayman (HD 6900) handle 2 polygons per cycle.
I still don't understand how these calculations work.
The PS2's clock was 147.456 MHz, but its max polygon capability was 50 million, and the most ever achieved in an actual game was 10 million.
The GC's graphics clock was 162 MHz, but its peak polygon count was 110 million and the highest achieved was 20 million at 60 FPS.
The Xbox1's graphics clock was 233 MHz, but its peak polygon count was 120 million, and the most ever achieved in a game was 12 million at 30 FPS.
I've never known polygon counts to scale with clock rate the way you're listing it. I always thought there was something else to it. This has confused me since earlier in the thread. Is this a more modern thing?