
Reddit [verified] User shares NX info: x86 Architecture, Second screen support etc.

Status
Not open for further replies.

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
This is a common misconception. Nintendo actually produce very good hardware for what they are targeting.

It's not like they were aiming to produce a powerhouse and accidentally made a WiiU. A Wii-backwards compatible, Xbox 360 level, low wattage box is exactly what they set out for.

Their decision to target that spec may not have been competent, but that is another debate.
Exactly. WiiU was pretty much a state-of-the-art MCM for the given TDP and timeframe.
 

Proelite

Member
What about the rumored specs from 2014? :)

250W system. $599 USD base. Slightly smaller than Xbox One in size.

Still struggles with ports because of the lack of RAM and bandwidth. That CPU and GPU are going to be bottlenecked to starvation.

Sounds legit. :D
 
There are no Android dev kits because computers can usually emulate them pretty well, or you test on the actual hardware. You dev for last year's version to work on next year's version and then patch it up once it's out. At least that's what some smaller devs do.
But I wonder if they could use SDKs to start porting over engines and all that.
I imagine the portable is the base model for NX so maybe that's what they're doing...at least for now. Would imply the handheld is out first, tho
 

AlStrong

Member
50W TDP peak, 35W for the SoC.

2 x 4-core Puma modules, 1.8-2.0 GHz
512:40:16 GCN 1.2 GPU at 800 MHz; ~800 GFLOPS. Might be 10 CUs / ~1 TFLOP if they clock the CPU lower.
32MB eDRAM, 256-512 GB/s
8GB of DDR4 on a 128-bit bus, 50 GB/s

This is the ABSOLUTE minimum that I think you can expect from NX, if Nintendo decides to cheap out again.

They could actually go cheaper on the hypothetical EDRAM you've got there since GCN1.2 has bandwidth compression.

6GB of ram would actually cost more in the long run since you'll be using a weird bus configuration. It'll make the mobo messier than necessary.

LPDDR4 can come in 12Gbit/24Gbit configurations with x32/x64 width, but that's more of an FYI than a suggestion.
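For what it's worth, the GPU numbers in that hypothetical spec do check out arithmetically. A quick sanity check (the 64-shaders-per-CU and 2-FLOPs-per-cycle figures are standard for GCN; the spec itself is still just a rumor):

```python
# Back-of-envelope GFLOPS check for the quoted hypothetical NX spec.
# GCN: 64 shaders per CU; each shader does 2 FLOPs per cycle (fused multiply-add).

def gcn_gflops(shaders: int, clock_mhz: int) -> float:
    """Peak single-precision GFLOPS for a GCN GPU."""
    return shaders * 2 * clock_mhz / 1000

# 512:40:16 config at 800 MHz -> the "~800 GFLOPS" figure
print(gcn_gflops(512, 800))        # 819.2, rounded down to ~800 in the post

# 10 CUs = 640 shaders at 800 MHz -> the "~1 TFLOP" figure
print(gcn_gflops(10 * 64, 800))    # 1024.0, i.e. ~1 TFLOP
```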
 
That camera being so zoomed out, and the grass everywhere, is probably covering up some rather nasty textures, lol, if the game awards footage is anything to go by.
The game awards footage will be from a build that's around 2 years old by the time the game comes out.
 

Malus

Member
The game awards footage will be from a build that's around 2 years old by the time the game comes out.

Xenoblade X looked pretty damn good in its reveal footage 2 1/2 years before it came out, save for a lake with weird edges on it. I do hope that Zelda is much improved though.
 

Pokemaniac

Member
I'm not talking about the company. They're a multi-billion dollar company, no shit they can come up with a powerful machine.

Nintendo hardware is not technologically competent. I never said "Nintendo hardware team is not technologically competent."

For what Nintendo is targeting, they make very good hardware. They're just targeting a bit low.

Why would the RAM allocation need to differ between the two? Why arbitrarily take more RAM for the OS that could be used by games just because it's a home console instead of the handheld? This is assuming there's feature parity between the two.

The main reason to change the allocation is to avoid hamstringing multitasking apps on the console. Most of those are going to require a bit more space to run just due to higher quality assets alone. At least one of them is likely to stream videos, which benefits from having a larger buffer space at higher resolutions. Keeping the same allocation could get a bit tight, so there's a decent chance it could get a bit of a bump to make sure everything has some breathing room.
 
Also, it remains to be seen how the Wii U version will actually stack up to this. If it holds up, the NX version will be truly spectacular.

Yes, because while EVERY Zelda game has technically exceeded its first showing, this will be the one that doesn't, and the one Nintendo released a bullshot video for, for reasons.
 
D

Deleted member 465307

Unconfirmed Member
The game awards footage will be from a build that's around 2 years old by the time the game comes out.

Every time I realize this, my mind explodes a bit.

If Zelda Wii U has any major technical weaknesses, I think it's going to be its resolution. Hopefully they've optimized it, but I'm pretty sure a lot of what we've seen in footage (not that we've seen much) has been sub-720p. I'm hoping it hits 720p for release. I expect resolution to be what an NX port addresses the most.
 
For what Nintendo is targeting, they make very good hardware. They're just targeting a bit low.



The main reason to change the allocation is to avoid hamstringing multitasking apps on the console. Most of those are going to require a bit more space to run just due to higher quality assets alone. At least one of them is likely to stream videos, which benefits from having a larger buffer space at higher resolutions. Keeping the same allocation could get a bit tight, so there's a decent chance it could get a bit of a bump to make sure everything has some breathing room.
That makes sense, but I was originally responding to the assumption that it'll use 3GB+ for OS. That just didn't make sense to me.
 

maxcriden

Member
Nintendo has plenty of games now or in the future that they should know how well things perform. Port talk gets me going as you can see. There's so much there to that rumor then how badly nintendo will still screw up their online platform most likely.

2.5 Tflops and above on those architectures is a good pipe dream it's just a bit shy of where the price, size, and tdp need to be. I like this pessimist thing it's my ideal outside of the ram. I'm with others and to drop hints as I have in the past.

I'm not sure I understand either of these parts, can you clarify? Thank you.

They actually made that real time footage a couple of years ago though.

Believe it or not, that was actually just over a year ago.
 

Rodin

Member
Xenoblade X looked pretty damn good in its reveal footage 2 1/2 years before it came out, save for a lake with weird edges on it. I do hope that Zelda is much improved though.
It looked far, far worse than the final version.

Last year Aonuma said that Zelda U already looked better than the reveal (they didn't show anything though, except that 10-second teaser at a different time of day); now imagine how that could look on a 1-2 TFLOPS NX.
Believe it or not, that was actually just over a year ago.
It was E3 2014, so it's very close to 2 years (especially when you consider that they didn't make that footage the night before E3).
 

RootCause

Member
To be fair, whether or not a game ever releases has nothing to do with whether or not a dev team put the work in to make it.

That shot was from a running build, so yes, they made that.
Naw! That would be anything but fair, since it isn't from an actual product available for purchase. It's no different from EA, Guerrilla, Ubi or Gearbox showing in-game footage from an unreleased project. I'll gladly change my stance once they have a consumer product.
 
Xenoblade X looked pretty damn good in its reveal footage 2 1/2 years before it came out, save for a lake with weird edges on it. I do hope that Zelda is much improved though.
The next time we see it, it'll probably be the NX version (after two additional years of development, and not a weird offscreen recording), so I imagine it'll look better, lol.
 

beril

Member
50W TDP peak, 35W for the SoC.

2 x 4-core Puma modules, 1.8-2.0 GHz
512:40:16 GCN 1.2 GPU at 800 MHz; ~800 GFLOPS. Might be 10 CUs / ~1 TFLOP if they clock the CPU lower.
32MB eDRAM, 256-512 GB/s
8GB of DDR4 on a 128-bit bus, 50 GB/s

This is the ABSOLUTE minimum that I think you can expect from NX, if Nintendo decides to cheap out again.

It'd be pretty weird for them not to use more Edram than Wii U if they go with a similar memory architecture. That's an area where Nintendo usually excels.

32MB on Wii U is very generous for the power of the system, as no one expects graphically demanding games to run at 1080p on it, but on the NX it would become a bottleneck like on Xbox One. The 6MB on 3DS is also extremely generous considering the screen resolution. In Gunman Clive 1 I was able to fit nearly every texture in the game and much of the model data into it, as well as the 2x2 FSAA framebuffer.
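As a rough illustration of how a 2x2 FSAA framebuffer fits in that 6MB, here's a back-of-envelope estimate. The resolution and pixel formats are assumptions for illustration, not Gunman Clive's actual settings:

```python
# Rough estimate of how much of the 3DS's 6MB VRAM a 2x2-supersampled
# framebuffer occupies. Assumed: top screen at 400x240 rendered at 2x2
# (800x480), 24-bit color plus 24-bit depth; real formats may differ.

VRAM_MB = 6
width, height = 400 * 2, 240 * 2        # 2x2 FSAA render target
color_bytes, depth_bytes = 3, 3         # assumed 24-bit color + 24-bit depth

fb_mb = width * height * (color_bytes + depth_bytes) / (1024 ** 2)
print(f"framebuffer: {fb_mb:.2f} MB, leaving {VRAM_MB - fb_mb:.2f} MB for assets")
# framebuffer: 2.20 MB, leaving 3.80 MB for assets
```

Under those assumptions, well over half the VRAM is still free for textures and model data, which is consistent with the post.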
 
But they should launch a console with a traditional controller that has enough power, and, like Wii Fit, Move and Kinect, sell their surprise separately. That way nobody has to use it, third parties or console owners.

I don't want to be harsh, but this is really stupid and guarantees that nobody, including first parties, will use said special controller. It needs to come with the system so that every customer has it or nobody will develop games for it.

What they need to do is include their special controller AND a normal controller with the system. That way if a dev wants to use the special controller for a game, they can do so knowing that their game will still actually sell because everyone buying the system will have the controller it's designed for, but if devs want to make a normal game, same thing, everyone will already have a standard controller.

Quick example: LittleBigPlanet 2 had optional Move support. What do you think would have happened if it was ONLY Move support, when the PS3 didn't come with a Move controller? Nobody would have bought LBP2. Or they would have had to make an expensive bundle that came with a Move to ensure that everyone buying the game could actually play it. And with that same game, Move support was 100% optional, so how many people actually even cared at all? I would imagine the answer is "barely anyone". Now, if Sony had released a PS3 that came with a DS3 as well as a Move wand? Completely different story. There's a reason they put the colored tracking lights in the DS4.
 

Malus

Member
The next time we see it, it'll probably be the NX version (after two additional years of development, and not a weird offscreen recording), so I imagine it'll look better, lol.

I don't think it being offscreen footage made much if any difference. Here's hoping that it does get a nice NX bump.

It looked far, far worse than the final version.

I don't see it. You could probably point out minor differences, but the scene at the beginning where the main character is running through Primordia looks almost exactly the same as the end product, save for the aforementioned lake with weird edges.
 
True. But Nintendo is really able to get the most out of everything with some games. Like ND.
But yeah, I have to agree. R&C turned out to be really jaggy.
It's funny because when I made a claim about that a few months ago from playing the demo very early people thought I was trippin
This. Mario Kart 8 is a far better comparison. Much better art direction than 3D World. It's night and day. Those sorts of visuals in a 3D Mario game at 1080p60fps.....lawd jesus.
Yep, as great as 3D World looks, it's a bit too clean; the art style means it doesn't have to use many textures, and it gives off a sterile look compared to games like Mario Kart 8.
Well, Ratchet and Clank have also been putting in more graphical stuff as they've given up on 60fps for weird reasons.

Mario is much more simple and clean, even Mario Galaxy with its really good art style is simple and clean. On Dolphin, Mario Galaxy looks crazy good. Though because of the clean art style I could see them adding in a bit more bells and whistles with effects.

Mario Kart 8 is a good example too. It's a different art style but still maintains a simple and clean look while adding in some bells and whistles.
I think an art style change will make it look much better. Leave the clean look alone and opt for more textures and assets to give Mario a new look.
 

maxcriden

Member
But I wonder if they could use SDKs to start porting over engines and all that.
I imagine the portable is the base model for NX so maybe that's what they're doing...at least for now. Would imply the handheld is out first, tho

To you and to other tech-savvy folks in this thread: wouldn't that significantly constrain the console, to have the handheld be the base model? Also, if the HH is out first and is being used as the base model for development, that wouldn't seem to sync up with what LCGeek has heard about console HW capabilities, in the sense that his dev friends would, I would think, be more likely to be aware of the HH specs and info than if they wanted to develop for the console and the HH was the base model for development. Unless I'm completely misunderstanding your post. ☺

It looked far, far worse than the final version.

Last year Aonuma said that Zelda U already looked better than the reveal (they didn't show anything though, except that 10-second teaser at a different time of day); now imagine how that could look on a 1-2 TFLOPS NX.

It was E3 2014, so it's very close to 2 years (especially when you consider that they didn't make that footage the night before E3).

Unless I misunderstood, we were talking about the TGA footage from December 2014 and not the E3 footage. Maybe I did misunderstand, though.

Edit: I see now where my confusion came from. I didn't realize you were the same poster RootCause was replying to, and at the same time I forgot the E3 reveal and TGA were both real-time, since the former is pre-recorded.

Naw! That would be everything but fair, since it isn't from an actual product available for purchase. It's not different than EA, Guerrilla, Ubi or Gearbox showing in game footage from an unreleased project. I'll gladly change my stance once they have a consumer product.

The difference between those companies and Nintendo is that they have a reputation for releasing bullshot footage, and Nintendo, as I understand it, generally does not, particularly in regards to footage.
 

LCGeek

formerly sane
To you and to other tech-savvy folks in this thread: wouldn't that significantly constrain the console, to have the handheld be the base model? Also, if the HH is out first and is being used as the base model for development, that wouldn't seem to sync up with what LCGeek has heard about console HW capabilities, in the sense that his dev friends would, I would think, be more likely to be aware of the HH specs and info than if they wanted to develop for the console and the HH was the base model for development. Unless I'm completely misunderstanding your post. ☺

Not really. People already showed the CPU can exist, and for mobile. They don't need a mega-strong GPU at all for a handheld at a much lower res spec.

As for the abilities, I'm only referencing things if they are doing ports that want to hit a certain point, such as a PS4 1080p/30fps performance peak. How they hit that performance is the discussion; we will know if they can do so soon enough.

Pinky, is your avatar from UHF?
 

Pinky

Banned
It looked far, far worse than the final version.

Last year Aonuma said that Zelda U already looked better than the reveal (they didn't show anything though, except that 10-second teaser at a different time of day); now imagine how that could look on a 1-2 TFLOPS NX.

It was E3 2014, so it's very close to 2 years (especially when you consider that they didn't make that footage the night before E3).

arousal.gif
 

maxcriden

Member
Not really. People already showed the CPU can exist, and for mobile. They don't need a mega-strong GPU at all for a handheld at a much lower res spec.

As for the abilities, I'm only referencing things if they are doing ports that want to hit a certain point, such as a PS4 1080p/30fps port. How they hit that performance is the discussion; we will know if they can do so soon enough.

Ok, I think I follow and I stand corrected then. Still, though, if your friends are developing console games and the HH is the base development unit, wouldn't you be much likelier to have HH spec info also? In the sense that any dev being offered one would also have access to the other, should they want to make a game cross-platform.
 

geordiemp

Member
The difference between those companies and Nintendo is that they have a reputation for releasing bullshot footage, and Nintendo, as I understand it, generally does not, particularly in regards to footage.

Nope, it was discussed on 3Dforums many moons ago (I am not looking it up again); their best estimate was that it was 1080p downsampled from 1440p.

What do you think it's from? It's more than what PS4 crunches out.

1440p is a Nintendo press release (sounds better than bullshot, LOL), but if Ubi do that it's because they are nasty....

They all do it, deal with it. Or believe it was native on Wii U; bookmark this so when it comes out, we can discuss......and crow will be eaten.

Maybe it was running on NX PC target hardware....who knows. This is what I believe anyway...
 

Instro

Member
Yes, because while EVERY Zelda game has technically exceeded its first showing, this will be the one that doesn't, and the one Nintendo released a bullshot video for, for reasons.

I don't recall SS looking significantly different, same with TP for that matter.

The game really looked nothing like that in the demo they showed.
 

Malus

Member
I don't recall SS looking significantly different, same with TP for that matter.

The game really looked nothing like that in the demo they showed.

I think it looked the same, they just showed off more of the rough spots in actual gameplay as opposed to the cinematic camera of the reveal.
 

Zoon

Member
What do you guys think if the base NX console was close to the handheld in terms of power (maybe with a better CPU) and required the SCDs for additional power?
 

LCGeek

formerly sane
Ok, I think I follow and I stand corrected then. Still, though, if your friends are developing console games and the HH is the base development unit, wouldn't you be much likelier to have HH spec info also? In the sense that any dev being offered one would also have access to the other, should they want to make a game cross-platform.

One is a friend. Others are peers I met and still talk to, to be more clear. They could have the info; I just didn't ask, since it was literally mentioned as a short confirmation. I don't have a decent set of files, as this is nothing like my other leaks; it's more like N64, where it was super tight-lipped. I've already been warned here and other places not to get myself in trouble by sharing the actual details the NDA is made for.

Info like that says a lot, which is why Nintendo locks people down and heavily compartmentalizes things. They are like a crossbreed between a CIA agent and a ninja.

Zelda N64 and GC were both exceeded. People drum up the GC demo, but Nintendo has shown, in real time or in other demos, that their hardware can do far more. The Wii U demo can't be compared to anything else; we don't have a real-time Zelda game to do so.
 

Mr Swine

Banned
Could Nintendo release a handheld that is basically a Vita with a bit better and more modern CPU, GPU and more ram at a $99 price tag? or would it just make more sense to make one that is a Wii U/PS360 power level at $199
 

OryoN

Member
I thought the Wii U's memory solution was actually very efficient, and one of the reasons you get such good-looking and well-performing games out of 176 GFLOPS.

Oh, it's definitely efficient, don't get me wrong; the system definitely punches well above its weight, especially for the tiny size of the chip. But it's definitely a step back in terms of the memory hierarchy compared to what GameCube brought to the table. The on-die memory is there, but gone is the blazing-fast main memory. The "slow" RAM has been upgraded to average-speed RAM, but that cache-like main memory is a much bigger loss than the small gain from "A-RAM" to DDR3. That little piece of hardware was an elegant beast.
 

Peterc

Member
I don't want to be harsh, but this is really stupid and guarantees that nobody, including first parties, will use said special controller. It needs to come with the system so that every customer has it or nobody will develop games for it.

What they need to do is include their special controller AND a normal controller with the system. That way if a dev wants to use the special controller for a game, they can do so knowing that their game will still actually sell because everyone buying the system will have the controller it's designed for, but if devs want to make a normal game, same thing, everyone will already have a standard controller.

Quick example: LittleBigPlanet 2 had optional Move support. What do you think would have happened if it was ONLY Move support, when the PS3 didn't come with a Move controller? Nobody would have bought LBP2. Or they would have had to make an expensive bundle that came with a Move to ensure that everyone buying the game could actually play it. And with that same game, Move support was 100% optional, so how many people actually even cared at all? I would imagine the answer is "barely anyone". Now, if Sony had released a PS3 that came with a DS3 as well as a Move wand? Completely different story. There's a reason they put the colored tracking lights in the DS4.

Agree, but what about psvr?
 
To you and to other tech-savvy folks in this thread: wouldn't that significantly constrain the console, to have the handheld be the base model? Also, if the HH is out first and is being used as the base model for development, that wouldn't seem to sync up with what LCGeek has heard about console HW capabilities, in the sense that his dev friends would, I would think, be more likely to be aware of the HH specs and info than if they wanted to develop for the console and the HH was the base model for development. Unless I'm completely misunderstanding your post. ☺
Oh, I'm not tech savvy in the slightest. I don't know anything about this jaguar, polaris, AMD, ARM, x86 stuff outside of some very minor aspects
(x86 means porting from PC is easier, ARM is friendlier with portables)
The handheld being the base model would hold back NX titles, for sure, but it's probably for the best. A company can invest more money into a title if the install base is going to be noticeably bigger, so we should see more stylized but ambitious handheld titles on the console.
I think it'll probably be possible to make an average 60fps Wii U game run at 30fps on the handheld while the console could run the same game at 1080p 60fps without needing a ton of additional work. If you make them with the mindset that they're going to be made for two different devices, that could help make the console games look nicer as well.
Additionally, if a title is too much for the handheld to handle, they can make it console exclusive.
An example would be DKCTF: 30fps without fur effects (outside of cutscenes) on the NX handheld, and 1080p 60fps on the console with added effects like particles, fur, etc.
Best case scenario would be Super Smash 4 which I think was them experimenting with the idea.
Of course, this is all speculation based on our interpretation on what Iwata said about the direction of their hardware.
 

ozfunghi

Member
Could Nintendo release a handheld that is basically a Vita with a bit better and more modern CPU, GPU and more ram at a $99 price tag? or would it just make more sense to make one that is a Wii U/PS360 power level at $199

I think, with the entire idea of getting many games to be cross-platform (between console and handheld), they would best go for something that scales down rather easily from their home console. It might be an extra draw for both devs and gamers: being able to release a game not only on the home console, but on the (likely more popular) handheld as well, without having to waste too many resources on the (downported) handheld version.
 
Agree, but what about psvr?

Honestly, I dunno. I mean, with controllers and peripherals that were released later in a system's life, we have a precedent and history to look at. But PSVR is something else entirely. As EviLore says at the top of the forum, it's almost like it's a new platform, not just an add-on.

But if I had to speculate, I would say that PSVR will share a similar, though probably better, fate than things like PS Move and the Wii Balance Board. It's not something that comes with the system, so the install base isn't there for devs to just churn out games for it, and apparently adding VR to a normal game isn't as easy as just flipping a switch (i.e., Far Cry 4 wouldn't be able to just put on the headset and boop, it's ready to go). It's also VERY expensive compared to something like a controller. I think that it will be more popular than the Move/Kinect/Balance Board/etc., but... as much as I'm looking forward to it, I think that the only way VR will ever gain mass adoption on consoles is if it's packed in from the start, and I don't know how or when that would become cost-effective. Maybe if the PS5 was just a spec-bumped PS4/PS4K that they could sell for like $200, and then you have the PSVR 2.0 as a modest revision of the one coming out soon, you could sell the two together as the PS5VR for $499; that would be the best way to do it. Every game would work with VR, everyone that bought a PS5 would have the helmet, maybe developers would opt to include a "normal mode" in their games for people who don't want to use VR, but at least developers would be able to design games for it with absolute certainty that everyone owning a PS5 and buying their game will be able to play it.

But again, the only way, from what we've seen in the past, to get customers and developers alike to adopt these add-ons is to actually include them with either the system itself, to make sure 100% of the userbase has access to it and thus can play the games designed for it, or to bundle the add-on with the game that uses it to ensure everyone that buys the game can actually play it as it was designed (like the Expansion Pak for the N64 being bundled with DK64, or Wii MotionPlus being bundled with Wii Sports Resort and the Wii Remote Plus with FlingSmash). And that second option isn't really available to all developers, so the best bet once more is to pack it with the system.
 

ozfunghi

Member
I still can't believe we have a thread on NeoGAF for "Reddit rumors".

The only reason why it's still going is because of the information from LCGeek and some reputable posters (Thraktor, Blu...) providing some context to go along with it. Like I said earlier, we should have posted a new thread about that, instead of this insane Reddit crap.
 
The only reason why it's still going is because of the information from LCGeek and some reputable posters (Thraktor, Blu...) providing some context to go along with it. Like I said earlier, we should have posted a new thread about that, instead of this insane Reddit crap.

Can we please do that? This thread is just validating the inner egos of those Reddit "leakers". We should also not allow this Reddit crap in that other thread. That way, we can keep an intelligent discussion going.
 

Thraktor

Member
I truly believe GameCube was the "perfect" hardware solution for Nintendo. They didn't brute force anything, but the real world performance was so incredible. I think one key component that made that all possible was the Mosys 1T-SRAM(the CPU & GPU were no slouch either). I remember Factor 5 was praising how amazingly fast it was, saying that they were caught by surprise, because they were already feeding the system huge amounts of data, yet, the more they fed it, it just kept blazing through it all. The access speed was just incredibly fast, like cache, some said.

I'd like to see Nintendo pursue a similar solution again. Embedded memory on the GPU for the framebuffer and other stuff, a sizeable, super-fast system memory for graphical/low-latency tasks, and an additional large pool of slower memory to handle slower/higher-latency stuff (streaming, sound, etc). This strategy allowed them to create affordable yet very high-performance hardware. I know they were worried about the cost of such exotic memory going forward, but I'm sure it could be cheaper now, and they could probably get by with less total memory than current consoles (e.g.: 64MB embedded, 1GB fast mem1?, 4GB slow mem2).

It's not like those 8GB in the current gen significantly boosted their performance; in theory they're a bottleneck, since the disc would take a long time filling just a fraction of that. Apple's iPhone gets by with a seemingly meager amount of system memory compared to Android devices, because what it has is tuned for performance rather than brute force. I'm not sure why Nintendo abandoned GameCube's exotic memory solution when it produced such great results for decent cost and a lot less silicon. I'm as curious as you about whether such a design is feasible this time around.

Wii U is very much a continuation of the GameCube philosophy when it comes to memory. There's a very small, extremely fast 3MB eDRAM pool, a quite small, very fast 32MB eDRAM pool, and then a large and slow 2GB DDR3 pool. The fact that they got away with a mere 12.8 GB/s of bandwidth for their largest pool shows how effectively they could utilise that 32MB in particular.
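That 12.8 GB/s figure for the DDR3 pool follows directly from the commonly reported bus configuration (a 64-bit bus of DDR3-1600):

```python
# Peak memory bandwidth = bus width (bytes) * effective transfer rate (MT/s).
# DDR3-1600 on a 64-bit bus is the commonly reported Wii U main-memory setup.

def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return (bus_bits / 8) * mts / 1000

print(bandwidth_gbs(64, 1600))   # 12.8 GB/s
```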

The problem with continuing along that road with the NX is that they no longer have the options they had before. More specifically:

50W TDP peak, 35W for the SoC.

2 x 4-core Puma modules, 1.8-2.0 GHz
512:40:16 GCN 1.2 GPU at 800 MHz; ~800 GFLOPS. Might be 10 CUs / ~1 TFLOP if they clock the CPU lower.
32MB eDRAM, 256-512 GB/s
8GB of DDR4 on a 128-bit bus, 50 GB/s

This is the ABSOLUTE minimum that I think you can expect from NX, if Nintendo decides to cheap out again.

The bolded eDRAM line is where the problem lies: nobody seems to be offering eDRAM on processes below 40nm (outside Intel and IBM). This means that, for a small, high-bandwidth pool of memory, their options are reduced to SRAM or HBM.

To illustrate why SRAM is unsuitable for a framebuffer at 28nm, just look at Xbox One and PS4. You've got two consoles with a roughly similar cost released at the same time, but one chooses a single pool of GDDR5 and the other split pools of SRAM and DDR3. MS hoped that using a small on-die pool of memory for the framebuffer, like Wii U or Xbox 360, would give them the best of both worlds, by providing the GPU the bandwidth it needs while cheap DDR3 allows them a large 8GB of main memory.

The results are now obvious. SRAM is big (it takes up far more die space than eDRAM) and therefore very expensive. They could only accommodate 32MB of it on the SoC, and even then, with a larger SoC than Sony, there was a lot less room left for the GPU. So, they ended up with an embedded pool that isn't large enough for a console targeting 1080p, and a GPU that's almost 30% less powerful than it would otherwise have been. Meanwhile, Sony upgraded PS4's memory to 8GB at the last minute, leaving MS without even an overall capacity advantage. Nintendo would have exactly the same problems if they tried to take the SRAM approach to split pools. There's no getting around the cost and the die area implications.
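The same simple bandwidth formula (bus width in bytes times effective transfer rate) shows the gap being described, using the publicly known PS4 and Xbox One main-memory configurations:

```python
# Peak memory bandwidth = bus width (bytes) * effective transfer rate (MT/s).

def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return (bus_bits / 8) * mts / 1000

print(bandwidth_gbs(256, 5500))  # PS4, 256-bit GDDR5 @ 5.5 GT/s: 176.0 GB/s
print(bandwidth_gbs(256, 2133))  # XB1, 256-bit DDR3-2133:        ~68.3 GB/s
```

Xbox One's on-die ESRAM exists precisely to bridge that gap, which is why its size and die cost mattered so much.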

HBM is more of an unknown. It's obviously expensive, but on a per-MB basis much cheaper than SRAM. In theory a single 1GB stack of HBM1 would provide both the capacity (obviously) and the bandwidth necessary for a console competitive with PS4 when combined with some quantity of DDR3/4. That said, a large part of the cost of HBM is surely the packaging (similar to the reason Wii U's MCM is as expensive as it is). That packaging cost isn't any different from HBM1 to HBM2, and won't be all that much more for two or four stacks of memory than it would be for one. So, for all we know it may be 4GB or bust when it comes to using HBM.

I think the question comes down to how much total RAM Nintendo wants to go with. If it's 8GB or less, then I can't imagine a HBM+DDR3/4 approach being cheaper than GDDR5(X), or even LPDDR4, and either of the latter should provide enough bandwidth for a GCN 1.2 GPU. If they decide they need 12GB or more, then perhaps a small HBM pool plus a DDR3/4 pool might be the cheaper way to give themselves both the bandwidth and capacity they need.

An alternative, of course, is to replace the embedded memory pool with a large victim cache which acts as an L3 for both the CPU and GPU (like Apple uses on many of their SoCs). It doesn't need to be large enough to hold the entire framebuffer to significantly reduce main-memory bandwidth requirements, but its effectiveness depends largely on how the GPU accesses the framebuffer (which depends both on the hardware and the way programmers use it). The PowerVR GPUs used in Apple's chips are designed specifically to conserve bandwidth by using a tile-based rendering system to maximise the efficiency of a cache system like Apple uses. AMD's GCN is designed for desktop environments with high-bandwidth GDDR5, though, so it might require a bit of effort on the part of engine programmers to get good use out of any L3 cache.
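To make the tile-based idea concrete, here's a toy sketch (not any real GPU's implementation) of the access-pattern difference: tiled traversal fully processes one small, cache-sized block at a time instead of striding across full rows, which is what lets a modest L3 absorb most framebuffer traffic. The sizes and workload are arbitrary:

```python
# Toy illustration of tile-based framebuffer traversal vs. row-major order.
# Both visit every pixel exactly once; tiling only reorders the accesses so
# each TILE x TILE block stays hot in cache while it's being worked on.

W, H, TILE = 64, 32, 8
framebuffer = [[(x * y) % 255 for x in range(W)] for y in range(H)]

# Row-major order: each row strides across the full width.
row_major = sum(framebuffer[y][x] for y in range(H) for x in range(W))

# Tiled order: fully process one TILE x TILE block before moving on.
tiled = 0
for ty in range(0, H, TILE):
    for tx in range(0, W, TILE):
        for y in range(ty, ty + TILE):
            for x in range(tx, tx + TILE):
                tiled += framebuffer[y][x]

assert tiled == row_major  # same work, different (cache-friendlier) order
```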
 

ozfunghi

Member
Can we please do that? This thread is just validating the inner egos of those Reddit "leakers". We should also not allow this Reddit crap in that other thread. That way, we can keep an intelligent discussion going.

Sure man, and i have the ideal topic title to generate a lot of buzz:

NX>>>>> X1 >> PS4>>>>>>>>>>>>>>>>>>>>>>Wii U

lol
 