
Reddit [verified] User shares NX info: x86 Architecture, Second screen support etc.

Status
Not open for further replies.

10k

Banned
It will be interesting to see this when it's revealed, as is always the case with Nintendo; however, the controller looks terrible.... I selfishly wish they were just a software company making their games on PS4/Xbox One hardware, to see what they could come up with.
You're famous now, junior.
 

maxcriden

Member
Nope, it was discussed on 3D forums many moons ago (I am not looking it up again); their best estimate was that it was 1080p downsampled from 1440p.

What do you think it's from? It's more than what the PS4 crunches out.

1440p is a Nintendo press release (sounds better than a bullshot, LOL), but if Ubi do that, it's because they are nasty....

They all do it, deal with it. Or believe it was native on Wii U; bookmark this, so when it comes out we can discuss... and crow will be eaten.

Maybe it was running on NX PC target hardware... who knows. This is what I believe, anyway...

I wasn't aware it was downsampled like that. My mistake. Is that true of both videos of Zelda U, though? Regardless, my overall point and my own recollection is that releasing bullshot videos (and bullshots in general) is far less common for Nintendo than it is for the Ubisofts and such of the game development world. I would definitely prefer it to never happen, though.

I think it looked the same, they just showed off more of the rough spots in actual gameplay as opposed to the cinematic camera of the reveal.

This was my impression also.

Skyward Sword pics were 720p, I seem to remember.
Edit: oh, it's the trailer you're talking about? Fair enough, then.

Yep, I'm referring primarily to the footage. Which evidently I was wrong about, after all, but the larger point I was trying to articulate is more just that I think Nintendo is much less often guilty of releasing bullshots compared to many of their industry peers.

What do you guys think if the base NX console was close to the handheld in terms of power (maybe with a better CPU) and required the SCDs for additional power?

It's possible, but I expect requiring SCDs just to get the system beyond the handheld's level of power would be a pretty tough sell and would make the default console too weak. At that point its power would likely be commensurate with the Wii U's. (Someone please do correct me if I'm mistaken.)

One is a friend. The others are peers I met and still talk to, to be clear. They could have the info; I just didn't ask, and they literally mentioned it was a short confirmation. I don't have a decent set of files, as this is nothing like my other leaks; it's more like the N64, where things were super tight-lipped. I've already been warned here and elsewhere not to get myself in trouble by sharing the actual details the NDA is made for.

Info like that says a lot, which is why Nintendo locks people down and heavily compartmentalizes things. They are like a crossbreed between a CIA agent and a ninja.

The Zelda demos on N64 and GC were surpassed. People drum up the GC demo, but Nintendo has shown, in real time or in other demos, that their hardware can do far more. The Wii U demo can't be compared to anything else; we don't have a real-time Zelda game to do so.

Ok, gotcha. That makes sense. Thank you. That leaves a bit more mystery in the air for sure.

Oh, I'm not tech savvy in the slightest. I don't know anything about this Jaguar, Polaris, AMD, ARM, x86 stuff outside of some very minor aspects
(x86 means porting from PC is easier; ARM is friendlier with portables).
The handheld being the base model would hold back NX titles, for sure, but it's probably for the best. A company can invest more money into a title if the install base is going to be noticeably bigger, so we should see more stylized but ambitious handheld titles on the console.
I think it'll probably be possible to make an average 60fps Wii U game run at 30fps on the handheld while the console runs the same game at 1080p 60fps, without needing a ton of additional work. If you make them with the mindset that it's going to be made for two different devices, that could help make the console games look nicer as well.
Additionally, if a title is too much for the handheld to handle, they can make it console-exclusive.
An example would be DKCTF: 30fps without fur effects (outside of cutscenes) on the NX handheld, and 1080p 60fps on the console with added effects like particles, fur, etc.
The best-case scenario would be Super Smash 4, which I think was them experimenting with the idea.
Of course, this is all speculation based on our interpretation of what Iwata said about the direction of their hardware.
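The 30fps-handheld / 1080p60-console scenario above implies a rough pixel-throughput ratio between the two devices. A minimal sketch, assuming a 720p handheld screen (pure guesswork at this point, since the actual NX screen resolution is unknown):

```python
# Rough pixel-throughput ratio implied by the 30fps-handheld / 1080p60-console
# idea, assuming a 720p handheld screen (an assumption, not a known spec).
def pixels_per_second(width, height, fps):
    return width * height * fps

handheld = pixels_per_second(1280, 720, 30)   # 720p at 30fps
console = pixels_per_second(1920, 1080, 60)   # 1080p at 60fps
print(console / handheld)  # 4.5 -- the console pushes 4.5x the raw pixels
```

So the console version would need roughly 4.5x the raw fill-rate of the handheld version, which is in the same ballpark as the Wii U-to-NX-console jump people are speculating about.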

Gotcha, I understand a bit better now, thank you. I suppose it comes down to as you said the interpretation of Iwata's comments. There's one school of thought that takes it as you describe, and another that thinks of it more as being about shared assets and engines rather than upscaled or downscaled games, or even a mix of the two schools of thought. I suspect it'll be a mix but I'll be very interested to see.
 

dickroach

Member
The only reason it's still going is the information from LCGeek and some reputable posters (Thraktor, Blu...) providing context to go along with it. Like I said earlier, we should have posted a new thread about that, instead of this insane Reddit crap.

So what information from reputable sources has been mentioned? Too many pages in this thread.
 

geordiemp

Member
I wasn't aware it was downsampled like that. My mistake. Is that true of both videos of Zelda U, though?

Don't know; all I know is that when that Zelda screenshot was analysed by the pixel counters, that's what they came up with, which obviously casts doubt on whether it came from a Wii U.

It does look nice though, just too nice.
 

Proelite

Member
I think the question comes down to how much total RAM Nintendo wants to go with. If it's 8GB or less, then I can't imagine a HBM+DDR3/4 approach being cheaper than GDDR5(X), or even LPDDR4, and either of the latter should provide enough bandwidth for a GCN 1.2 GPU. If they decide they need 12GB or more, then perhaps a small HBM pool plus a DDR3/4 pool might be the cheaper way to give themselves both the bandwidth and capacity they need.

What kind of bandwidth are we getting with 8GB of LPDDR4 on a 128-bit bus?

I know that with GDDR5X they can hit over 150GB/s. They won't need the eDRAM in that case.
 
It will be interesting to see this when it's revealed, as is always the case with Nintendo; however, the controller looks terrible.... I selfishly wish they were just a software company making their games on PS4/Xbox One hardware, to see what they could come up with.

Hilarious post. :D

You are a bit late. All controller "leaks" have been exposed as fakes.

I selfishly wish people would actually read news or at least a bit of the threads they are posting in. :)

There is actually no real useful evidence or even hints about NX, besides the info that the NX CPU has more power than any other console CPU. Some people trust the guy who "revealed" this CPU info bit, despite the fact he claims to be insane.
 

LCGeek

formerly sane
Hilarious post. :D

You are a bit late. All controller "leaks" have been exposed as fakes.

I selfishly wish people would actually read news or at least a bit of the threads they are posting in. :)

There is actually no real useful evidence or even hints about NX, besides the info that the NX CPU has more power than any other console CPU. Some people trust the guy who "revealed" this CPU info bit, despite the fact he claims to be insane.

It's a joke around the forum; I'm not legally insane, bro.

I'm allowed to confirm, predict, or knock down trash I know is trash. Some of what you say about real useful evidence is my point to begin with, so stop trying to act like you're the only skeptic; plenty of others, including myself, are too. If you wonder why I don't say more: the NDA, and Red Steel, for starters.

Finally, it's a Nintendo tech speculation thread; if you're not happy about it, it's not like it's a requirement to be in here.
 

Neoxon

Junior Member
Will interesting to see this when it's revealed as is always the case with Nintendo, however the controller looks terrible.... I selfishly wish they were just a software company and they were making their games on the hardware of PS4/Xbox One to see what they could come up with.
If you're referring to the touchscreen controller, that was proven to be fake. That is, unless you have an inside source.
 

Thraktor

Member
What kind of bandwidth are we getting with 8GB of LPDDR4 and GDDR5X on a 128-bit bus?

LPDDR4 wouldn't be enough on a 128 bit bus, but on a 256 bit bus, it would be able to provide 100GB/s+ (and given GCN 1.2's color buffer compression, that should be plenty). In theory GDDR5X could provide as much as 224GB/s on a 128 bit bus, but early implementations are likely to be closer to 160GB/s, which is still plenty from a bus that narrow.
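The figures above fall out of simple peak-bandwidth arithmetic. A minimal sketch, where the per-pin data rates are illustrative assumptions (LPDDR4-3200 at 3.2 Gb/s/pin; GDDR5X at 10 Gb/s/pin early, 14 Gb/s/pin mature):

```python
# Peak-bandwidth arithmetic behind the figures above. The per-pin data rates
# are illustrative assumptions, not confirmed parts.
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

print(round(peak_bandwidth_gbs(256, 3.2), 1))   # LPDDR4, 256-bit bus: 102.4
print(round(peak_bandwidth_gbs(128, 10.0), 1))  # early GDDR5X, 128-bit: 160.0
print(round(peak_bandwidth_gbs(128, 14.0), 1))  # mature GDDR5X, 128-bit: 224.0
```

Which matches the "100GB/s+" on a 256-bit LPDDR4 bus and the 160-224GB/s range quoted for GDDR5X on a 128-bit bus.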
 

Proelite

Member
LPDDR4 wouldn't be enough on a 128 bit bus, but on a 256 bit bus, it would be able to provide 100GB/s+ (and given GCN 1.2's color buffer compression, that should be plenty). In theory GDDR5X could provide as much as 224GB/s on a 128 bit bus, but early implementations are likely to be closer to 160GB/s, which is still plenty from a bus that narrow.

I don't think Nintendo will be using a 256-bit bus for main memory. So it's either GDDR5X or other RAM + cache.

Considering GDDR5X will be new and expensive, I doubt they'll go with that.

What's wrong with having 32MB of eDRAM on a daughter die at 40nm? If it's impossible, I guess they can go with eSRAM in the APU like the Xbox One.
 
Gotcha, I understand a bit better now, thank you. I suppose it comes down to as you said the interpretation of Iwata's comments. There's one school of thought that takes it as you describe, and another that thinks of it more as being about shared assets and engines rather than upscaled or downscaled games, or even a mix of the two schools of thought. I suspect it'll be a mix but I'll be very interested to see.
I believe they only mentioned porting assets, but I also think they mentioned that the architecture made it really hard to port games from Wii to 3DS and 3DS to Wii U, or something like that, alongside idealizing iOS and Android in that respect.
I don't think ALL the games will be shared, but I think it'll make sense for most 1st-party games to be made for both.
The 3DS allowing for more 3D/ambitious games than the DS, and having to adapt to HD development, made it hard to output enough content for both systems.
From what I see, a shared library would be a godsend for Nintendo, keeping costs low while maximizing the install base and pleasing fans.
But if they don't do this, I don't think they can survive another generation unless they drop one of the two systems or expand rapidly. Consumers won't be satisfied if they need to get two systems to experience Nintendo's full output, but I think most would be pleased if they can play everything on one system.
 

LCGeek

formerly sane
What does the CPU affect? I didn't think the X1 was "stronger" than PS4 in any particular sense, but I don't have the first clue about these things.

Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level dinosaur; that's how old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people; you can say the same about Sony or Microsoft console expectations too. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?
 

MCN

Banned
Are people this thick? Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level fucking dinosaur; that's how fucking old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people, including Sony or Microsoft. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?

I want nothing less than a positronic neural net.
 
I want nothing less than a positronic neural net.

Are people this thick? Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level fucking dinosaur; that's how fucking old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people, including Sony or Microsoft. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?

Bruh, chill. Don't be an ass. This isn't the WUSP where it was a dedicated community of people all following specifically tech stuff. This is the front page of gaf.
 

platina

Member
Are people this thick? Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level fucking dinosaur; that's how fucking old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people, including Sony or Microsoft. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?

Needs moar chartz

[image: A89xAb.png, a console performance comparison chart]
 
Are people this thick? Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level fucking dinosaur; that's how fucking old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people, including Sony or Microsoft. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?

Not even sure what you're getting at here. But if you're saying a) I don't know shit about technology and b) the Wii U isn't powerful, well then I'm not gonna argue with that.
 

ozfunghi

Member
Compared to where the Wii U was, it's a fucking jump; anybody here who has even a slightly good laptop knows the difference without having to think about it too much. The overall CPU performance of the Wii U isn't even as good as an Intel Core 2 Duo. Shit, a Phenom could kill the Wii U without blinking. It's a tweaked-out, P3-level dinosaur; that's how old the CPU in the Wii U is.

Context, people. See, this is why Nintendo cannot please some people; you can say the same about Sony or Microsoft console expectations too. I'm a little whiny to some here, but that kind of jump doesn't interest you? OK, what kind of jump would interest you?

I just think he meant that he does not know what a CPU does exactly, hence him not knowing that the CPU in the XBO is slightly more powerful than the one in the PS4 either.
 
Geek is pissed at shitty console CPUs. I see his point; they really got the shaft this gen. Why not crank console TDP up to 200 watts, or even higher? Just keep the price down. Plenty of applications rely on a strong CPU at a high clock rate. Games not needing a strong CPU? It's a myth.
 

LCGeek

formerly sane
Not even sure what you're getting at here. But if you're saying a) I don't know shit about technology and b) the Wii U isn't powerful, well then I'm not gonna argue with that.

You could know, but I was asking how much. Didn't know you didn't know that much; MY BAD. The CPU we are talking about in the Wii U is old, like late-90s old; Nintendo just souped it up. That's the CPU they've been using since the Cube, or variants of it, to get what they did in the Wii U. To knock them even more, they weren't even using high-end variants of the IBM CPUs they were based on.

In terms of games, do you like open-world games? If you do, you really want a good CPU. Even if the game doesn't look good, do you want strong physics that don't drop your fps? That's where this helps.

Geek is pissed at shitty console CPUs. I see his point; they really got the shaft this gen. Why not crank console TDP up to 200 watts, or even higher? Just keep the price down. Plenty of applications rely on a strong CPU at a high clock rate. Games not needing a strong CPU? It's a myth.

Agreed on the myth part. I'm in paradise having server and awesome desktop access, and it's still never enough. Yet we could go places if they tried; it's disgusting where they are right now. As to the why of a console TDP like that: it's impossible at the size of a Wii U-style form factor, too much heat, to say the least. Even if Nintendo said yes to that TDP, they'd have to budge on form; otherwise, no matter what they do (sans removing the optical drive for something much smaller), they'll have to contend with space/heat issues. To get there, it has to be something near a PS4, say within 80%.
 

Nanashrew

Banned
I do kinda hope that things begin to level out at some point and consoles can get better CPUs. The lack of strong CPUs is a bottleneck.
 

Thraktor

Member
I don't think Nintendo will be using a 256-bit bus for main memory. So it's either GDDR5X or other RAM + cache.

Considering GDDR5X will be new and expensive, I doubt they'll go with that.

Nintendo haven't used wide memory buses in consoles because they've had a high-bandwidth smaller pool providing the bandwidth they need, so they don't need a wide bus on the main memory. Look at the 3DS's 128-bit FCRAM, as wide a bus as you'll find in a mobile environment, and you'll see how willing Nintendo are to use a wide memory interface when they need to.

Besides, a large reason that a system designer would want to use a narrow memory bus is that it allows them to use fewer memory chips, keeping the motherboard small and simple. With LPDDR4 widely available with a 64 bit interface, Nintendo could get away with four chips, the same number they used in the Wii U, and a quarter as many as were used in the launch PS4 and XBO. GDDR5 would require at least 8 chips to reach 8GB with currently available parts (and I'm pretty sure those are 32 bit I/O chips, so you'd end up with a 256 bit bus anyway). Nintendo could get anywhere from 8GB to 24GB of LPDDR4 on four chips at almost 120GB/s if they wanted to.
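The chip-count argument above is simple division. A quick sketch, where the per-chip interface widths (64-bit LPDDR4, 32-bit GDDR5) and the LPDDR4-3733 data rate behind "almost 120GB/s" are illustrative assumptions:

```python
# The chip-count arithmetic from the post above, under illustrative assumptions:
# LPDDR4 parts with a 64-bit interface, GDDR5 parts with a 32-bit interface.
import math

def chips_needed(total_bus_bits, chip_io_bits):
    """How many memory chips it takes to fill a bus of the given width."""
    return math.ceil(total_bus_bits / chip_io_bits)

print(chips_needed(256, 64))  # 64-bit LPDDR4 chips on a 256-bit bus: 4
print(chips_needed(256, 32))  # 32-bit GDDR5 chips on a 256-bit bus: 8

# "Almost 120GB/s" from four chips lines up with an assumed LPDDR4-3733 rate:
print(round(256 / 8 * 3.733, 1))  # ~119.5 GB/s
```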

What's wrong with having 32MB of eDRAM on a daughter die at 40nm? If it's impossible, I guess they can go with eSRAM in the APU like the Xbox One.

Using a daughter die alongside the main SoC (even if it's a "cheap" 40nm daughter die) requires packaging the two of them together on an MCM, which isn't cheap. It would likely end up more expensive than HBM for similar bandwidth and a tiny fraction of the capacity.
 

AntMurda

Member
On Wii & Wii U it is not Zelda but Xenoblade that shows what the hardware is capable of.

That's false. Skyward Sword showed off what the Wii was capable of on all fronts. It was literally the showpiece software that tried to deliver what the Wii was supposed to deliver as far as immersive 1:1 controls and lavish 3D worlds go. Xenoblade wasn't processing 1:1 controls or doing any calculations like that.
 

ozfunghi

Member
...
In terms of games, do you like open-world games? If you do, you really want a good CPU. Even if the game doesn't look good, do you want strong physics that don't drop your fps? That's where this helps.

I've been wondering about this. Do PS4/XBO games have physics done mostly by the CPU or the GPU?
 

LordOfChaos

Member
ARM = weak seems to be a connection people can't shake. Existing A72s on 28nm likely already beat Jaguar, let alone a custom AMD 14nm FF design for a console...

The ISA isn't a huge part of the equation here. For toolchains and porting, sure. But it means little for teh poweh.
 

I don't expect Nintendo to come in with a new, high-end GPU or the strongest CPU AMD can put in a console, but I hope that Nintendo recognizes that Sony's usage of GDDR5 for the unified memory architecture was a wiser choice than the Xbox One and Wii U's design. Especially with incremental console updates and forward and backward compatibility being important in the future.

Thraktor's post describes what I'm thinking.

The bolded is where the problem lies, nobody seems to be offering eDRAM on processes below 40nm (outside Intel and IBM). This means that, for a small, high-bandwidth pool of memory, their options are reduced to SRAM or HBM.

To illustrate why SRAM is unsuitable for a framebuffer at 28nm, just look at Xbox One and PS4. You've got two consoles with a roughly similar cost released at the same time, but one chooses a single pool of GDDR5 and the other split pools of SRAM and DDR3. MS hoped that using a small on-die pool of memory for the framebuffer, like Wii U or Xbox 360, would give them the best of both worlds, by providing the GPU the bandwidth it needs while cheap DDR3 allows them a large 8GB of main memory.

The results are now obvious. SRAM is big (it takes up far more die space than eDRAM) and therefore very expensive. They could only accommodate 32MB of it on the SoC, and even then, with a larger SoC than Sony, there was a lot less room left for the GPU. So, they ended up with an embedded pool that isn't large enough for a console targeting 1080p, and a GPU that's almost 30% less powerful than it would otherwise have been. Meanwhile, Sony upgraded PS4's memory to 8GB at the last minute, leaving MS without even an overall capacity advantage. Nintendo would have exactly the same problems if they tried to take the SRAM approach to split pools. There's no getting around the cost and the die area implications.

HBM is more of an unknown. It's obviously expensive, but on a per-MB basis much cheaper than SRAM. In theory a single 1GB stack of HBM1 would provide both the capacity (obviously) and the bandwidth necessary for a console competitive with PS4 when combined with some quantity of DDR3/4. That said, a large part of the cost of HBM is surely the packaging (similar to the reason Wii U's MCM is as expensive as it is). That packaging cost isn't any different from HBM1 to HBM2, and won't be all that much more for two or four stacks of memory than it would be for one. So, for all we know it may be 4GB or bust when it comes to using HBM.

I think the question comes down to how much total RAM Nintendo wants to go with. If it's 8GB or less, then I can't imagine a HBM+DDR3/4 approach being cheaper than GDDR5(X), or even LPDDR4, and either of the latter should provide enough bandwidth for a GCN 1.2 GPU. If they decide they need 12GB or more, then perhaps a small HBM pool plus a DDR3/4 pool might be the cheaper way to give themselves both the bandwidth and capacity they need.

An alternative, of course, is to replace the embedded memory pool with a large victim cache which acts as an L3 for both the CPU and GPU (like Apple uses on many of their SoCs). It doesn't need to be large enough to hold the entire framebuffer to significantly reduce main-memory bandwidth requirements, but its effectiveness depends largely on how the GPU accesses the framebuffer (which depends both on the hardware and the way programmers use it). The PowerVR GPUs used in Apple's chips are designed specifically to conserve bandwidth by using a tile-based rendering system to maximise the efficiency of a cache system like Apple uses. AMD's GCN is designed for desktop environments with high-bandwidth GDDR5, though, so it might require a bit of effort on the part of engine programmers to get good use out of any L3 cache.
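The "32MB isn't large enough for 1080p" point above comes down to render-target sizes. A back-of-the-envelope sketch, assuming 4 bytes per pixel per target (real formats and target counts vary widely by engine):

```python
# Why a 32MB embedded pool gets cramped at 1080p: render-target sizes at an
# assumed 4 bytes/pixel (real formats and target counts vary by engine).
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

color = target_mb(1920, 1080)      # ~7.9 MB for one 1080p 32-bit target
# A modest deferred setup (3 G-buffer targets + depth) already hits the cap:
total = 3 * color + target_mb(1920, 1080)
print(round(color, 1), round(total, 1))  # 7.9 31.6
```

At 720p the same setup needs only about 14MB, which is roughly why the 360's 10MB and Xbox One's 32MB pools worked out the way they did.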
 

Nanashrew

Banned
That's false. Skyward Sword showed off what the Wii was capable of on all fronts. It was literally the showpiece software that tried to deliver what the Wii was supposed to deliver as far as immersive 1:1 controls and lavish 3D worlds go. Xenoblade wasn't processing 1:1 controls or doing any calculations like that.

I'd still put Xenoblade up there rather high in how big the game was for the Wii. The game is freakin' HUGE!

Possible spoilers for some if they've not gotten very far in the game https://www.youtube.com/watch?v=hLadRkjbXSc


Yes, some areas do get divided into sections by placing a cave/dungeon or something before the next area. However, each section is gigantic, and it's still impressive how much they got into XC. Then you have a lot of monsters roaming every field, some small, some very large ones that tower over you. Then you've got all the NPCs to talk to, all that dialogue alongside the dialogue of the story. It is very impressive how much they squeezed out of that tiny little Wii.

That's also not getting into the sense of scale, seeing a giant arch overhead or a huge waterfall in the distance, the textures and what may or may not have been reused, the draw distance, and the abundance of items, weapons, and armors, all visible on the body.

Makna Forest is still pretty great with how dense it is with foliage, trees, and large dinosaurs protecting their territories.
 

Proelite

Member
Nintendo haven't used wide memory buses in consoles because they've had a high-bandwidth smaller pool providing the bandwidth they need, so they don't need a wide bus on the main memory. Look at the 3DS's 128-bit FCRAM, as wide a bus as you'll find in a mobile environment, and you'll see how willing Nintendo are to use a wide memory interface when they need to.

Besides, a large reason that a system designer would want to use a narrow memory bus is that it allows them to use fewer memory chips, keeping the motherboard small and simple. With LPDDR4 widely available with a 64 bit interface, Nintendo could get away with four chips, the same number they used in the Wii U, and a quarter as many as were used in the launch PS4 and XBO. GDDR5 would require at least 8 chips to reach 8GB with currently available parts (and I'm pretty sure those are 32 bit I/O chips, so you'd end up with a 256 bit bus anyway). Nintendo could get anywhere from 8GB to 24GB of LPDDR4 on four chips at almost 120GB/s if they wanted to.



Using a daughter die alongside the main SoC (even if it's a "cheap" 40nm daughter die) requires packaging the two of them together on an MCM, which isn't cheap. It would likely end up more expensive than HBM for similar bandwidth and a tiny fraction of the capacity.

Some questions.

128-bit bus + 32MB eSRAM vs. 256-bit bus with no eSRAM: which is cheaper in the long run?

LPDDR4 vs. DDR4: which should NX use?
 

Malus

Member
I'd still put Xenoblade up there rather high in how big the game was for the Wii. The game is freakin' HUGE!

Possible spoilers for some if they've not gotten very far in the game https://www.youtube.com/watch?v=hLadRkjbXSc


Yes, some areas do get divided into sections by placing a cave/dungeon or something before the next area. However, each section is gigantic, and it's still impressive how much they got into XC. Then you have a lot of monsters roaming every field, some small, some very large ones that tower over you. Then you've got all the NPCs to talk to, all that dialogue alongside the dialogue of the story. It is very impressive how much they squeezed out of that tiny little Wii.

That's also not getting into the sense of scale, seeing a giant arch overhead or a huge waterfall in the distance, the textures and what may or may not have been reused, the draw distance, and the abundance of items, weapons, and armors, all visible on the body.

Makna Forest is still pretty great with how dense it is with foliage, trees, and large dinosaurs protecting their territories.

No doubt Xenoblade has a phenomenal sense of scale and art direction. Don't know that I'd count armor as one of its strong suits though. Some of the designs are rather funky.

SS was a lot smoother looking overall, and had some stunning individual moments spread throughout the game like the Lanayru Sand Sea, Floria Waterfall, Ancient Cistern, Levias, the Fi cutscenes, those epic final bossfights, and a bunch more.
 

AmyS

Member
Where did that chart come from? Is it even accurate?

It's really one of the only things we know about the NX that has a solid source.

It was posted in one of the old Wii U threads from 2012, I think.

I believe it's fairly accurate, in terms of a raw performance comparison.
 

Nanashrew

Banned
No doubt Xenoblade has a phenomenal sense of scale and art direction. Don't know that I'd count armor as one of its strong suits though. Some of the designs are rather funky.

SS was a lot smoother looking overall, and had some stunning individual moments spread throughout the game like the Lanayru Sand Sea, Floria Waterfall, Levias, the Fi cutscenes, those epic final bossfights, and a bunch more.

Oh yeah, definitely. I'm just saying that both Zelda and Xenoblade push the system in their own ways. Skyward Sword is very impressive just as much as Xenoblade to me. Both rank very high.

Their approach and philosophies are different in areas too. Like how Nintendo goes for very clean and simple while Monolith Soft goes for more density. Makna Forest being one example of their density.
 

OryoN

Member
Thanks for responding, Thraktor.
I wasn't aware of the dilemma Nintendo is facing with eDRAM going forward. It'll be interesting to see what solution they settle on. I don't really want what, architecture-wise, amounts to a PS4/XB1 in a Nintendo-labeled casing; it's a bit more fun seeing unique designs with their various strengths/weaknesses. If the hardware is up to par this time around, I hope the software side of development is far more welcoming to developers than it ever was. Nintendo can never seem to achieve both bullet points simultaneously, haha.
 

LordOfChaos

Member
It was posted in one of the old Wii U threads from 2012, I think.

I believe it's fairly accurate, in terms of a raw performance comparison.

I don't... the PS3 has 3x the texel rate of the Wii U? It should be something like 13 Gtexels vs 8.8, far from 3x.

I see lots more inconsistencies that I can't be arsed with right now.
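For what it's worth, the texel-rate check above can be reproduced from commonly cited figures. The TMU counts (24 for the PS3's RSX, 16 for the Wii U GPU) and the 550MHz clocks are unofficial numbers, so treat them as assumptions:

```python
# Texel rate = texture units * clock. The 24 TMU / 16 TMU counts and 550MHz
# clocks below are commonly cited figures, not official specs.
def texel_rate_gtexels(tmus, clock_ghz):
    return tmus * clock_ghz

ps3_rsx = texel_rate_gtexels(24, 0.55)  # ~13.2 Gtexels/s
wii_u = texel_rate_gtexels(16, 0.55)    # ~8.8 Gtexels/s
print(round(ps3_rsx / wii_u, 2))  # 1.5 -- nowhere near the chart's 3x
```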
 
I always chuckle when I see the title of this thread. It's pretty funny that in 2016 the use of x86 architecture is so worthy of mention that it makes it into the title.
 