That jpeg thing was unfortunately fake - we don't have specific specs, but here's what we've got:

Ooh, it's blu-ray, and backwards compatible with GameCube AND Wii? That is pretty sweet. Doesn't affect me, since I didn't own the last two nintendo consoles (roommates did), but very slick.


No confirmation of BC with GameCube (it's possible it will be, but I'm guessing no controller ports - we'll see). It's confirmed not to be Blu-ray, but also not DVD - some other high-density optical format. They gave it a name, but I can't recall it. It's looking to be proprietary.

Hm... accusations that the demo video had some footage of PS3 and 360 games...


Accusation nothing. It was confirmed from the horse's mouth - none of the games are ready yet (read: launch is still a year away, minimum), so all that third-party footage was PS360 footage. Which made Kotaku's and...Game Informer's (I think it was) comments of "doesn't look as good as PS360" hilarious - and that commentary is the root of this "oh, it's not that powerful" FUD. Also, there was a FRAPS watermark on them or something.


We won't get specific specs from Nintendo - they don't do that. Which is why we look at IBM/AMD statements and read between the lines. For example, IBM said that it uses a processor which has eDRAM.

http://www.engadget.com/2011/06/07/ibm-puts-watsons-brains-in-nintendo-wii-u/

Nintendo's new console, the Wii U, was finally unveiled to the world today at E3 2011, and we got a glimpse of its graphical prowess at the company's keynote. Details were scarce about the IBM silicon Nintendo's new HD powerhouse was packing, but we did some digging to get a little more info. IBM tells us that within the Wii U there's a 45nm custom chip with "a lot" of embedded DRAM. It's a silicon on insulator design and packs the same processor technology found in Watson, the supercomputer that bested a couple of meatbags on Jeopardy awhile back.


There is only one processor that IBM makes right now with eDRAM - Power 7, the aforementioned processor which was in Watson (and which comes on the 45nm process, so that matches, too). It comes in 4, 6, and 8 core variants right now, with 4MB eDRAM per core. Current guesses have it being a 3 core variant of the processor, with the corresponding 12MB. This isn't really based on much of anything, but if it is the case, it would be very cost-effective, as they'd be able to salvage the 4 core chips that come off the line with a defective core. Etc, etc. So it's not "confirmed," but we can be 99% sure it's based on Power 7 tech at this point.
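
Just to spell the arithmetic out, here's a quick Python scratchpad using only the per-core figure above; the 3 core configuration itself is pure speculation, not a spec:

# Back-of-the-envelope for the eDRAM guess above (speculative, not a spec).
# Power 7-style layout: 4 MB of embedded DRAM per core.
EDRAM_PER_CORE_MB = 4

for cores in (3, 4, 8):  # 3 = the guessed Wii U config; 4 and 8 = shipping Power 7 variants
    print(f"{cores} cores -> {cores * EDRAM_PER_CORE_MB} MB eDRAM")

# Prints:
# 3 cores -> 12 MB eDRAM
# 4 cores -> 16 MB eDRAM
# 8 cores -> 32 MB eDRAM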


On Graphics, here's AMD's statement, which is slightly less informative:

http://www.marketwire.com/press-release/amd-nintendo-join-forces-creating-new-way-enjoy-console-gaming-entertainment-nyse-amd-1523972.htm

Key statement there is "multiple display support." AMD's multiple-screen tech (Eyefinity) came out in the R800 series (AMD cards with 5xxx names, released in 2009), so we're probably looking at that as the minimum baseline. For comparison, the 360 has an R520 variant which came out in 2006, while the PS3 has a variant of the GeForce 7 series, released in 2005/6.
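
To put rough numbers on that comparison, here's a throwaway Python sketch using only the architecture years mentioned above; treating the R800 line as the Wii U's baseline is an inference from the press release, not a confirmed spec:

# Rough GPU-architecture timeline per the years cited above (the Wii U
# entry is the assumed minimum baseline, not confirmed hardware).
gpu_tech_year = {
    "Xbox 360 (R520 variant)": 2006,
    "PS3 (GeForce 7 variant)": 2005,
    "Wii U (R800 / 5xxx baseline?)": 2009,
}

wii_u_year = gpu_tech_year["Wii U (R800 / 5xxx baseline?)"]
for console, year in gpu_tech_year.items():
    gap = wii_u_year - year
    note = f"{gap} years older than the assumed Wii U baseline" if gap else "baseline"
    print(f"{console}: {year} ({note})")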


I fully expect that Sony's and Microsoft's inevitable next machines will leapfrog it when they come out, likely resulting in a gap like PS2 to Xbox, give or take. But the Wii U (God, I hate that name) can reasonably be assumed to be a generation beyond the PS3/360.
