
Rumor: Wii U final specs

Keeping shader units fed has little to do with external bandwidth if you have a 32MB high-bandwidth, low-latency buffer to render into, because you don't necessarily need to store a large amount of data to keep them fed. Over 320 SPs would be far from a waste IMO.

There's also the question of power usage. There's a curve where increasing the SPs increases power draw, so you'd have to reduce the power envelope of the rest of the system to fit within constraints. There's also the concern of heat and whatnot with all components being under the same X-clamp/heat spreader.
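To put rough numbers on that trade-off, here's a minimal sketch in Python; the 40 W envelope, the per-block wattage and the 80-SP block size are all illustrative assumptions, not actual Wii U figures:

# Crude illustration: at a fixed clock and voltage, active shader logic power
# scales roughly with the number of enabled SIMD blocks, so every extra block
# eats into a fixed system power budget.

ENVELOPE_W = 40.0          # assumed total system power budget
OTHER_COMPONENTS_W = 22.0  # assumed CPU, RAM, disc drive, I/O, PSU losses
PER_SP_BLOCK_W = 1.5       # assumed watts per 80-SP SIMD block at a given clock

def gpu_power(sp_count, per_block_w=PER_SP_BLOCK_W):
    """Very crude GPU power estimate: linear in the number of 80-SP SIMD blocks."""
    blocks = sp_count / 80
    return blocks * per_block_w

for sps in (320, 400, 480, 640):
    gpu_w = gpu_power(sps)
    headroom = ENVELOPE_W - OTHER_COMPONENTS_W - gpu_w
    print(f"{sps} SPs -> ~{gpu_w:.1f} W of shader logic, {headroom:+.1f} W left in a {ENVELOPE_W:.0f} W envelope")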
 

Donnie

Member
There's also the question of power usage. There's a curve where increasing the SPs increases power draw, so you'd have to reduce the power envelope of the rest of the system to fit within constraints. There's also the concern of heat and whatnot with all components being under the same X-clamp/heat spreader.

Yeah, increasing ROPs and texture units would increase power usage as well though.
 
The funniest thing about this whole situation is that the whole "well you can buy 40 gigs of DDR on newegg for cheap" crowd was closer to the mark. That being said, I am really at odds trying to figure out how they are supposedly selling this thing at a loss.
 
The funniest thing about this whole situation is that the whole "well you can buy 40 gigs of DDR on newegg for cheap" crowd was closer to the mark. That being said, I am really at odds trying to figure out how they are supposedly selling this thing at a loss.
Is it really that hard? Have you seen the controller? Nintendo doesn't get 100% of that $299 or $349.
 

Donnie

Member
But it's much cheaper in power consumption and space than 160+ SPs like you suggest.

But also much less useful, especially if you're already at 16 ROPs and 32 texture units and plan most games at 720p. I mean, how much space would you want to spend adding more ROPs or texture units if your setup was 320 SPs, 16 ROPs and 32 texture units? If you can spend more space/power and have a choice, I think you'd go with more shader units.

The reason I'm mentioning 16 ROPs and 32 texture units is because I was basing this all on the 4770. What I was saying is that, taking into account the die size and what we think is on there as far as eDRAM, the area likely left for the GPU should allow for more than 320 SPs if you compare it to the die size of a 4770.
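A quick back-of-envelope along those lines, using the commonly cited RV740 figures (~826M transistors in ~137mm^2 at 40nm) and the 512M transistor count discussed in this thread; the linear density scaling is an assumption:

# Crude density scaling: if RV740 packs ~826M transistors into ~137 mm^2 at 40nm,
# an RV730-class design (~512M transistors) redone natively for 40nm would land
# well under 100 mm^2 before any eDRAM is added.

RV740_TRANSISTORS_M = 826.0   # commonly cited figure (assumption)
RV740_AREA_MM2 = 137.0        # commonly cited figure (assumption)
RV730_TRANSISTORS_M = 512.0   # figure used in the post above

density = RV740_TRANSISTORS_M / RV740_AREA_MM2        # ~6 Mtransistors per mm^2
rv730_at_40nm = RV730_TRANSISTORS_M / density         # ~85 mm^2
print(f"RV730-class logic at RV740 density: ~{rv730_at_40nm:.0f} mm^2")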
 

Durante

Member
Is it really that hard? Have you seen the controller?
Well, I'm having a hard time with it honestly.

Here's an android tablet with better specs in almost everything than the Wii U one, except for physical controls and any custom streaming hardware. It also has almost twice the battery capacity and quite a fast CPU. The price is listed as USD 54 to 64.

As for the main system, in the teardowns the PCB looks really neat and simple. It's also small, low-power (and thus no complex cooling requirements) and none of the chips are large or bleeding edge.
 
But also much less useful, especially if you're already at 16 ROPs and 32 texture units and plan most games at 720p. I mean, how much space would you really want to spend adding more ROPs or texture units if your setup was 320 SPs, 16 ROPs and 32 texture units?

I know. I modified that post. The memory access would unilaterally kill the usefulness of adding additional ROPs and TUs

Which is why I think it's, at maximum, a die-shrunk RV730 with eDRAM and NB/SB on chip. That's the maximum for the GPU imo.

And please don't bring up Benghazi.
 
But also much less useful, especially if you're already at 16 ROPs and 32 texture units and plan most games at 720p. I mean, how much space would you really want to spend adding more ROPs or texture units if your setup was 320 SPs, 16 ROPs and 32 texture units?

I don't know a whole lot about ROPs, but I think 8 is more likely than 16. I'm sure it varies, but I've read that most games were bottlenecked more by shader units this generation. ROPs are also bandwidth hogs, and considering that the eDRAM's bandwidth is apparently being relied on for things other than post-processing effects and the framebuffer, there might not be much benefit to including 16 over 8.

But yeah, I'd love more insight into fillrate and how it fared on the 360 with its 8 ROPs.
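For a rough sense of it, peak pixel fillrate is just ROPs times core clock, and Xenos is commonly cited at 500 MHz; a quick sanity check (theoretical peak only, ignoring blending, MSAA and bandwidth stalls):

# Theoretical pixel fillrate vs. what a 720p60 frame actually needs.

def fillrate_gpixels(rops, clock_mhz):
    """Peak pixels written per second, in Gpixels/s (ignores blending, AA, stalls)."""
    return rops * clock_mhz * 1e6 / 1e9

xenos_peak = fillrate_gpixels(rops=8, clock_mhz=500)   # 360: 8 ROPs @ 500 MHz
frame_720p60 = 1280 * 720 * 60 / 1e9                   # ~0.055 Gpixels/s per layer

print(f"Xenos peak fill: ~{xenos_peak:.1f} Gpixels/s")
print(f"720p60 needs ~{frame_720p60:.3f} Gpixels/s per layer of overdraw")
print(f"=> roughly {xenos_peak / frame_720p60:.0f}x headroom in pure fill, before blending/AA")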
 

DCKing

Member
I think the worst the GPU could be is a derivative of the Turks GPU, which has 480 SPUs/24 TMUs/8 ROPs on 40nm with a die size of 118mm^2. Slap the eDRAM onto it and you come just short of the die size reported by Anandtech. Not great, but still respectably more powerful than the 360. Turks, by the way, was still in use by AMD in 2012.

An RV740 derivative is probably still possible, considering the simplifications AMD could make to the UVD unit and the memory controller to reduce die size.
 

Donnie

Member
I know. I modified that post. The memory access would unilaterally kill the usefulness of adding additional ROPs and TUs

Which is why I think it's, at maximum, a die-shrunk RV730 with eDRAM and NB/SB on chip. That's the maximum for the GPU imo.

That's your opinion; I just don't agree on that being the max. Especially when with RV730 you're talking about a GPU with 320 SPs, 8 ROPs and 16 texture units and only 512M transistors, likely well under 100mm^2 if properly designed for 40nm (let's not forget this is a custom chip designed for its given process, not a 55nm chip shrunk to 40nm). Since there's no other R700 GPU on 40nm besides the 4770, and that's going to be too big given the eDRAM, let's look at a lower-end HD5000 on 40nm.

The 5570 has 400 SPs, 8 ROPs and 20 texture units, and it's only 627M transistors and 104mm^2. Something that more than fits into the die size and transistor budget we're seeing for WiiU's GPU.
 

Lonely1

Unconfirmed Member
Well, I'm having a hard time with it honestly.

Here's an android tablet with better specs in almost everything than the Wii U one, except for physical controls and any custom streaming hardware. It also has almost twice the battery capacity and quite a fast CPU. The price is listed as USD 54 to 64.

As for the main system, in the teardowns the PCB looks really neat and simple. It's also small, low-power (and thus no complex cooling requirements) and none of the chips are large or bleeding edge.
In my experience, those tablets have terrible build quality.
 
I know. I modified that post. The memory access would unilaterally kill the usefulness of adding additional ROPs and TUs

Which is why I think it's, at maximum, a die-shrunk RV730 with eDRAM and NB/SB on chip. That's the maximum for the GPU imo.

And please don't bring up Benghazi.

Nintendo could scale the R700 config to its theoretical "low end" max. That would be 400 SPUs, 40 TMUs, and 8 ROPs. AMD lines after R700 started using different core configs, but my inkling is that Nintendo wants a decent number of TMUs so that there are plenty of address buses to the eDRAM (in this case 40), resulting in high-bandwidth, low-latency texture reads.
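To illustrate why TMU count matters for that argument, here's a toy estimate of peak texel fetch demand; the 550 MHz clock and 4 bytes per texel are assumptions for illustration only:

# If texture reads come out of eDRAM, peak demand scales with TMU count:
# in the best case each TMU delivers roughly one filtered texel per clock.

def texture_read_gbps(tmus, clock_mhz, bytes_per_texel=4):
    """Peak texel fetch bandwidth in GB/s (ignores cache hits, so an upper bound)."""
    return tmus * clock_mhz * 1e6 * bytes_per_texel / 1e9

for tmus in (16, 32, 40):
    print(f"{tmus} TMUs @ 550 MHz (assumed clock): ~{texture_read_gbps(tmus, 550):.0f} GB/s peak demand")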
 
That's your opinion; I just don't agree on that being the max. Especially when with RV730 you're talking about a GPU with 320 SPs, 8 ROPs and 16 texture units and only 512M transistors, likely well under 100mm^2 if properly designed for 40nm (let's not forget this is a custom chip designed for its given process, not a 55nm chip shrunk to 40nm). Since there's no other R700 GPU on 40nm besides the 4770, and that's going to be too big given the eDRAM, let's look at a lower-end HD5000 on 40nm.

The 5570 has 400 SPs, 8 ROPs and 20 texture units, and it's only 627M transistors and 104mm^2. Something that more than fits into the die size and transistor budget we're seeing for WiiU's GPU.

The 5570 is not from the R7xx family. DX10.1 was directly stated; the R8xx line was built with DX11 (and high bandwidth) in mind. You'd probably be wishing for the 4650/4670 if that were the case.

Nintendo could scale the R700 config to its theoretical "low end" max. That would be 400 SPUs, 40 TMUs, and 8 ROPs. AMD lines after R700 started using different core configs, but my inkling is that Nintendo wants a decent number of TMUs so that there are plenty of address buses to the eDRAM (in this case 40), resulting in high-bandwidth, low-latency texture reads.

All of that with 157mm^2 split between eDRAM, NB, SB, and an ARM core?
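As a sanity check on that question, here's a toy area budget for a ~157mm^2 die; every per-block figure is a guess for illustration, not a measurement:

# Toy budget for a ~157 mm^2 die at 40nm. All block sizes are assumptions.

DIE_MM2 = 157.0

blocks_mm2 = {
    "32MB eDRAM":            40.0,   # assumed eDRAM macro + redundancy
    "NB/SB + I/O + display": 20.0,   # assumed northbridge/southbridge glue
    "ARM core + misc":        5.0,   # assumed security/IO processor
}

fixed = sum(blocks_mm2.values())
left_for_gpu_logic = DIE_MM2 - fixed
print(f"Fixed blocks: ~{fixed:.0f} mm^2, leaving ~{left_for_gpu_logic:.0f} mm^2 for the shader core")
# Compare that remainder against the RV730/RV740-class logic areas discussed above.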
 

Pociask

Member
The funniest thing about this whole situation is that the whole "well you can buy 40 gigs of DDR on newegg for cheap" crowd was closer to the mark. That being said, I am really at odds trying to figure out how they are supposedly selling this thing at a loss.

Boy, was I wrong about that statement I made on the internet a few months ago, said no one in the history of the internet.

The thing that always puzzles me is how Nintendo spends tons of money on R&D and collaboration with other hardware companies, and then they get... what? The streaming technology is legit, but then what?

I'm not a hardware guy, but I thought the consensus was: the 360 was released in 2005, technology has gotten tons better in the last seven years, and even if Nintendo makes a low-end console, advances in technology mean it will HAVE to be a ton better than the 360.

So... (and I say this knowing we're still missing key pieces of info), what happened? Were the 360 and the PS3 really just beasts for their time? Or are graphics just that low a priority for Nintendo? If it's the priority thing, it's kind of hard to believe - these guys still make video games for a living. They have to attract talented software designers and artists. At least in the West it seems lots of people want to make the shiniest shiny available. And Nintendo itself used to make some consoles that did amazing things - remember when new consoles were released, and PCs took a long time to catch up to what a console could do?

One last thought - I bought Nintendo's reasoning in 2004: HD TVs weren't that widespread, so why build a machine around HD graphics? And they were going blue ocean, I get that too. But now HD TVs ARE widespread. What's the excuse now? It's hard to say they're going blue ocean when it seems they're doing an HD Wii + a tablet + Nintentwitter.
 

Log4Girlz

Member
Boy, was I wrong about that statement I made on the internet a few months ago, said no one in the history of the internet.

The thing that always puzzles me is how Nintendo spends tons of money on R&D and collaboration with other hardware companies, and then they get... what? The streaming technology is legit, but then what?

I'm not a hardware guy, but I thought the consensus was: the 360 was released in 2005, technology has gotten tons better in the last seven years, and even if Nintendo makes a low-end console, advances in technology mean it will HAVE to be a ton better than the 360.

So... (and I say this knowing we're still missing key pieces of info), what happened? Were the 360 and the PS3 really just beasts for their time? Or are graphics just that low a priority for Nintendo? If it's the priority thing, it's kind of hard to believe - these guys still make video games for a living. They have to attract talented software designers and artists. At least in the West it seems lots of people want to make the shiniest shiny available. And Nintendo itself used to make some consoles that did amazing things - remember when new consoles were released, and PCs took a long time to catch up to what a console could do?

One last thought - I bought Nintendo's reasoning in 2004: HD TVs weren't that widespread, so why build a machine around HD graphics? And they were going blue ocean, I get that too. But now HD TVs ARE widespread. What's the excuse now? It's hard to say they're going blue ocean when it seems they're doing an HD Wii + a tablet + Nintentwitter.

The Wii U is remarkably powerful. The graphics it puts out are pretty amazing IMO - but only when you consider it is a 40-watt console. So far third-party ports are suffering and nothing looks anywhere near as good as the HD twins' best. In all likelihood everything under the hood is clocked fairly low to keep the power draw down. If you took these exact same chips and clocked them higher, got this sucker to draw 80 to 100 watts (still very efficient), it would clearly outclass the current gen. But again, it's only drawing 40 watts, and nothing is going to change that.
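That's essentially the standard CMOS dynamic-power relation, P ≈ C·V²·f: pushing the clock usually means pushing voltage too, so power climbs faster than frequency. A minimal sketch with an assumed 40 W baseline and made-up frequency/voltage steps:

# Dynamic power scales with frequency and with the square of voltage.

def relative_power(f_scale, v_scale):
    """Power relative to baseline when frequency and voltage are both scaled."""
    return f_scale * v_scale ** 2

baseline_w = 40.0  # assumed whole-console draw at stock clocks
for f, v in ((1.0, 1.0), (1.3, 1.1), (1.5, 1.15), (1.8, 1.2)):
    print(f"clock x{f:.1f}, voltage x{v:.2f} -> ~{baseline_w * relative_power(f, v):.0f} W")
# With these illustrative numbers, a ~1.5-1.8x clock bump lands in the 80-100 W range.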
 
The Wii U is remarkably powerful. The graphics it puts out are pretty amazing IMO - but only when you consider it is a 40-watt console. So far third-party ports are suffering and nothing looks anywhere near as good as the HD twins' best. In all likelihood everything under the hood is clocked fairly low to keep the power draw down. If you took these exact same chips and clocked them higher, got this sucker to draw 80 to 100 watts (still very efficient), it would clearly outclass the current gen. But again, it's only drawing 40 watts, and nothing is going to change that.

Even a 60-75w power draw would give us a console significantly more future-proofed than what we're getting. Sigh.
 

JoelFB

Vice President, Frozenbyte
Another to add to the list: Trine 2 seems to be missing some effects found on the 360 and PS3 versions.

As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte
 

CronoShot

Member
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte

Good to hear. I will buy this.
 
Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Not sure if you can comment - NDAs and all that - but I've read that devs and publishers will have greater control over pricing and sales with the Wii U eShop. Your comment here suggests it's coming, but wasn't ready in time for the US launch. Would that be accurate?

(...oh, and cheers for posting - GAF can't be the most pleasant of places for someone who worked on a launch title for a new system :) )
 
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte

Thanks for sharing! Tried the game out on PS Plus, and am considering purchasing the DC on eShop.
 

beril

Member
I'm not a hardware guy, but I thought the consensus was: the 360 was released in 2005, technology has gotten tons better in the last seven years, and even if Nintendo makes a low-end console, advances in technology mean it will HAVE to be a ton better than the 360.

So... (and I say this knowing we're still missing key pieces of info), what happened? Were the 360 and the PS3 really just beasts for their time? Or are graphics just that low a priority for Nintendo? If it's the priority thing, it's kind of hard to believe - these guys still make video games for a living. They have to attract talented software designers and artists. At least in the West it seems lots of people want to make the shiniest shiny available. And Nintendo itself used to make some consoles that did amazing things - remember when new consoles were released, and PCs took a long time to catch up to what a console could do?

The original 360 had horrible build quality (there have been some issues with launch WiiUs, but nothing comparable to the RRoD so far). The $300 model had no storage at all (the premium model sat right between the basic and premium WiiU SKUs, but with an HDD rather than flash), it shipped with a wired controller, it had no wifi, it had a standard DVD drive, and it was a noisy, power-hungry beast. The PS3 was $600 and is still hardly any cheaper than the WiiU. Both were likely sold at an even larger loss than the WiiU.
 

capslock

Is jealous of Matlock's emoticon
Whatever glimmer of interest I had in buying this system is fading fast. There seems to be little thought of any future proofing with this console; hell, it's not even past proof!

How did Nintendo release a console with worse specs than 7 year old machines?
 

Log4Girlz

Member
Whatever glimmer of interest I had in buying this system is fading fast. There seems to be little thought of any future proofing with this console; hell, it's not even past proof!

How did Nintendo release a console with worse specs than 7 year old machines?

They focused on having a 40 watt console. In a sense, you get far, far more performance per watt....but you have a lot less watts to go around.
 

disap.ed

Member
The PS3 was $600 and is still hardly any cheaper than the WiiU. Both were likely sold at an even larger loss than the WiiU.

This just isn't true. Today I received a leaflet from an Austrian chain which sells a Slim 2 500GB with GT5: AE, Uncharted 3 and a free game (out of AC3, NfS:MW, FIFA2013 and one more) for 299€.

I'm normally really Nintendo-centric (I didn't have any of the current or last gen consoles besides the Gamecube/Wii), but this is a really interesting offer; the 349€ WiiU premium pack is just too expensive in comparison.
 
Whatever glimmer of interest I had in buying this system is fading fast. There seems to be little thought of any future proofing with this console; hell, it's not even past proof!

How did Nintendo release a console with worse specs than 7 year old machines?

Chill, dude. It should be hacked soon, and probably blown just as wide open as the Wii was in its early days. It'll be worth it; there doesn't seem to be a lot of special protection going on.

I envision a day where I can use my WiiU controller to play VC/Wii games AND turn my TV to something decent.

It also makes me think: why haven't MadCatz or PDP made controllers or portable TVs that do a similar thing for the PS3 and 360, when Wireless HDMI exists?
 

DrWong

Member
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte
Thanks for the post. Trine 2 - with Nano Assault - is an eShop lock for me at launch, and as a European I'm glad to know about the discount.

Also, please, someone provide this guy with the evidence he's requesting - I'm very curious to know how a slightly "superior" version can be perceived as inferior.
 
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte

Wow, thanks for posting. This game is definitely on my short list.
 
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte

Van Owned.
 

JoelFB

Vice President, Frozenbyte
Not sure if you can comment - NDAs and all that - but I've read that devs and publishers will have greater control over pricing and sales with the Wii U eShop. Your comment here suggests it's coming, but wasn't ready in time for the US launch. Would that be accurate?

(...oh, and cheers for posting - GAF can't be the most pleasant of places for someone who worked on a launch title for a new system :) )

Yeah I'm not going to comment on that but what I can say is that in general things are very flexible and developer-friendly on the eShop, and we definitely think it's the best digital storefront on the consoles by far.

And thanks (everyone). :) It's not that scary to post here when you know the game can pull its own weight and most people react to it positively (even if it's nigh impossible to please everyone). We're happy we made the launch and are anxiously awaiting the European launch now. :)
 

Pociask

Member
They focused on having a 40 watt console. In a sense, you get far, far more performance per watt....but you have a lot less watts to go around.

I appreciate the good answers to my question, including this one. Will probably get a Wii U down the line for Nintendo games anyway, but... this is not good.

I really, really don't understand Nintendo on this one. I've still got light bulbs* that draw more power than the Wii U. Heck, if you think about the standard fan+three bulb set up, the Wii U is drawing what, 40/240, about 17% of what most people turn on daily without a thought. I guess that's neat, but... not something I ever wanted in a game console.

*This is off-topic, but I moved into my current place less than a year ago, still had lots of old bulbs, I'm replacing them as they die.

EDIT: Looked up the 360 for comparison. Launch 360s drew 150-180 watts at first, and with revisions the Slim has brought that down to 88 watts. The PS3 went from around 200 to around 100. So the Wii U is using half the power of the revised, more energy-efficient versions of the 360 and PS3, and producing comparable/better visuals. As stated, definitely an achievement.
 

beril

Member
This just isn't true. Today I received a leaflet from an Austrian chain which sells a Slim 2 500GB with GT5: AE, Uncharted 3 and a free game (out of AC3, NfS:MW, FIFA2013 and one more) for 299€.

I'm normally really Nintendo-centric (I didn't have any of the current or last gen consoles besides the Gamecube/Wii), but this is a really interesting offer; the 349€ WiiU premium pack is just too expensive in comparison.

So some stores do special offers for a console for a limited time. I'm sure that will never happen for the WiiU. The fact still remains that there isn't a massive difference in MSRP between a new PS3 and the WiiU. The PS3 models have more storage, but that probably isn't very relevant to most consumers, and they don't have a massive screen on the controller.
 
Yeah I'm not going to comment on that but what I can say is that in general things are very flexible and developer-friendly on the eShop, and we definitely think it's the best digital storefront on the consoles by far.

No worries - I did think I was pushing my luck with that one :)

Good luck with the European launch!
 

Earendil

Member
As part of the development team, I'd very much like to see some evidence too. :) Couldn't quite find anything in the Giant Bomb thread (didn't browse it all).

Trine 2: Director's Cut should be using slightly higher graphics settings on Wii U than Trine 2 on the other consoles - nothing really major but a tiny bit. Also the new Goblin Menace expansion levels simply wouldn't run on the other consoles right now, they're more demanding than the normal campaign levels and we've done a lot of optimizations on Wii U to make them run well. The only possibility I can think of for "effects missing" is that if we've had to optimize the Goblin Menace levels we've accidentally disabled some effects for the main campaign too. But let's see some real evidence first and I can provide the tech answer (or semi-tech, I'm no programmer myself). If some effects are truly missing, we'll fix it in an update. :)

Also I have to apologize to North American gamers - we actually wanted to have a small discount at launch but sadly that wasn't possible just yet. We're going to have it in Europe but of course that's no consolation to NA gamers. Sorry!

Anyhow I hope those who buy the game enjoy it very much! :)


Best,
Joel
Frozenbyte

Thank you for taking the time to come and post this. Trine 2 is one of the first items on my Will-Buy list for when I get a Wii U next year.


Van Owned.

Indeed
 

DieH@rd

Banned
Even a 60-75w power draw would give us a console significantly more future-proofed than what we're getting. Sigh.

A 2/4-module Jaguar, a ~30-40W Radeon 7xxx, and 4-8 gigs of DDR4 could fit in a 75W power envelope and produce kickass visuals. But I'm hoping that MS/Sony will aim for 150+ watts.
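A toy split of such an envelope; every component figure below is a guess for illustration, not a leak or a spec:

# Hypothetical 75 W console budget; all component numbers are assumptions.

budget_w = {
    "2-module Jaguar CPU":    15.0,
    "Radeon 7xxx-class GPU":  35.0,
    "RAM (4-8 GB)":            5.0,
    "Optical drive / HDD":     6.0,
    "I/O, wireless, misc":     4.0,
    "PSU / VRM losses":       10.0,
}

total = sum(budget_w.values())
print(f"Total: ~{total:.0f} W against a 75 W target")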
 

v1oz

Member
I think that, given the barrage of bad news there's been, we should at least be grateful we're getting a modern GPU with actual shaders this time. They could have easily stuck with registers and color combiners.
 

DieH@rd

Banned
I think that, given the barrage of bad news there's been, we should at least be grateful we're getting a modern GPU with actual shaders this time. They could have easily stuck with registers and color combiners.

Well... they literally went to AMD and asked for the slowest and oldest part they could produce. I really don't think they could have gone worse than this [although they somehow managed to put beyond-worst-case-scenario hardware in the 3DS IMHO].
 

Jomjom

Banned
Look on the bright side, weak hardware means that a Wii U version of Dolphin will come sooner rather than later. If they had made this a powerhouse console it would have been 10-20 years before we would be able to emulate it on a PC.
 
Look on the bright side, weak hardware means that a Wii U version of Dolphin will come sooner rather than later. If they had made this a powerhouse console it would have been 10-20 years before we would be able to emulate it on a PC.

We won't be seeing a Wii U emulator for a long time
 