Digital Foundry: Wii U

#1squatch22Posted 2/5/2013 5:28:54 AM
http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Despite its general release two months ago, Nintendo's Wii U console remains something of a technological mystery. We quickly gained a good idea of the make-up of the IBM tri-core CPU, but the system's apparent strengths are in its graphics hardware, and in that regard we had little or no idea of the composition of the Radeon core. Indeed, it's safe to say that we knew much more about the graphics processors in the next-generation Xbox and PlayStation. Until now.

Detailed polysilicon die photography of the Wii U's GPU has now been released, showing the hardware make-up at the nano level and resolving most of the outstanding mysteries. However, the story of how these photos came into existence is a fascinating tale in itself: community forum NeoGAF noticed that Chipworks was selling Wii U reverse-engineering photography on its website, with shots of the principal silicon being offered at $200 a pop. Seeking to draw a line under the controversy surrounding the Nintendo hardware, a collection was started to buy the photos.

There was just one problem. The shots were simply higher quality versions of what had already been revealed on sites like Anandtech - good for getting an idea of the amount of silicon used and the make-up of the overall design, but without the ultra-magnification required to provide answers, and therefore no further use in unearthing the secrets of the Wii U hardware. At this point, Chipworks itself became aware of the community money-raising effort, and decided to help out by providing the required shot - for free. It's a remarkably generous gesture bearing in mind that the cost of carrying out this work is, as Chipworks' Rob Williamson told us, "non-trivial".

So, what does the new shot below actually tell us? Well, first of all, let's be clear about how we draw our conclusions. Graphics cores work principally by spreading work in parallel over a vast array of processors. On the die shot, this manifests as the same mini-blocks of transistors "copied and pasted" next to one another. We know that the Wii U hardware is based on AMD's RV770 line of processors - essentially the Radeon HD 4xxx cards - so we have some point of comparison with existing photography of equivalent AMD hardware.

Chipworks' shot is still being analysed, but the core fundamentals are now seemingly beyond doubt. The Wii U GPU core features 320 stream processors married up with 16 texture mapping units and featuring 8 ROPs. After the Wii U's initial reveal at E3 2011, our take on the hardware was more reserved than most. "We reckon it probably has more in common with the Radeon HD 4650/4670 as opposed to anything more exotic," we said at the time. "The 320 stream processors on those chips would have more than enough power to support 360 and PS3 level visuals, especially in a closed-box system."

It was ballpark speculation at the time based on what we had eyeballed at the event, but the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. 1080p resolution is 2.25x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.
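The shader-power and resolution figures quoted above can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch, not an official spec: the 2-FLOPs-per-cycle (multiply-add) counting convention and the ~240 GFLOPS figure for Xenos are commonly cited estimates, and real-world performance depends on far more than peak shader throughput.

```python
# Back-of-the-envelope check of the article's figures.
# Assumptions (common estimates, not confirmed specs):
#   - each stream processor performs 2 FLOPs/cycle (one fused multiply-add)
#   - Xbox 360's Xenos GPU peaks at roughly 240 GFLOPS

wiiu_sps = 320          # stream processors, per the die shot
wiiu_clock_ghz = 0.550  # 550MHz core clock
wiiu_gflops = wiiu_sps * 2 * wiiu_clock_ghz   # peak shader throughput

xenos_gflops = 240      # widely quoted Xenos estimate

print(f"Wii U peak: {wiiu_gflops:.0f} GFLOPS")          # 352 GFLOPS
print(f"vs Xenos: {wiiu_gflops / xenos_gflops:.2f}x")   # ~1.47x, i.e. "1.5 times" sounds about right

# Pixel-count ratio of 1080p vs 720p
print(f"1080p/720p pixels: {(1920 * 1080) / (1280 * 720)}x")  # 2.25x
```

With eight ROPs shouldering 2.25x the pixel output at 1080p, the article's scepticism about complex 3D titles reaching that resolution follows directly from these numbers.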
---
Dog posted: I know I have bias...
http://img.gamefaqs.net/screens/1/9/b/gfs_75574_2_16.jpg
#2squatch22(Topic Creator)Posted 2/5/2013 5:29:13 AM
All of which may lead some to wonder quite why many of the Wii U ports disappoint - especially Black Ops 2, which appears to have been derived from the Xbox 360 version, running more slowly even at the same 880x720 sub-hd resolution. The answer comes from a mixture of known and unknown variables.

The obvious suspect would be the Wii U's 1.2GHz CPU, a tri-core piece of hardware re-architected from the Wii's Broadway chip, in turn a tweaked, overclocked version of the GameCube's Gekko processor. In many of our Wii U Face-Offs we've seen substantial performance dips on CPU-specific tasks. However, there are still plenty of unknowns to factor in too - specifically the bandwidth levels from the main RAM and the exact nature of the GPU's interface to its 32MB of onboard eDRAM. While the general capabilities of the Wii U hardware are now beyond doubt, discussion will continue about how the principal processing elements and the memory are interfaced together, and Nintendo's platform-exclusive titles should give us some indication of what this core is capable of when developers are targeting it directly.

However, while we now have our most important answers, the die-shot also throws up a few more mysteries too - specifically, what is the nature of the second and third banks of RAM up on the top-left, and bearing in mind how little of the chip is taken up by the ALUs and TMUs, what else is taking up the rest of the space? Here we can only speculate, but away from other essential GPU elements such as the ROPs and the command processor, we'd put good money on the Wii U equivalent to the Wii's ARM 'Starlet' security core being a part of this hardware, along with an audio DSP. We wouldn't be surprised at all if there's a hardware video encoder in there too for compressing the framebuffer for transmission to the GamePad LCD display. The additional banks of memory could well be there for Wii compatibility, and could account for the 1MB texture and 2MB framebuffer. Indeed, the entire Wii GPU could be on there, to ensure full backwards compatibility.

While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.
---
Dog posted: I know I have bias...
http://img.gamefaqs.net/screens/1/9/b/gfs_75574_2_16.jpg
#3Enigma149Posted 2/5/2013 5:52:16 AM
the GCN hardware in Durango and Orbis is in a completely different league


What exactly does GCN stand for here?
---
3DS:4897-5935-1924; NNID: CrimsonEnigma; PSN: CrimsonEnigma (not currently in use)
'If you think a system will make you look mature, you ain't mature' -squatch
#4parkourboybryanPosted 2/5/2013 5:55:20 AM
So the GPU is approximately 1.5x more powerful, but unknown factors are preventing them from knowing much else. That's what I got from reading it.
---
One of the proud people that still plays Mirror's Edge to this day.
Not changing my sig until DICE officially announces Mirror's Edge 2
#5FenderMasterPosted 2/5/2013 6:37:22 AM
parkourboybryan posted...
So the GPU is approximately 1.5x more powerful, but unknown factors are preventing them from knowing much else. That's what I got from reading it.


darkjedilink, shinobi, linkfan et al, where are you? WiiU needs some irrational defending!
---
http://www.jvp-boston.org
#6icarus231Posted 2/5/2013 6:40:33 AM
parkourboybryan posted...
So the GPU is approximately 1.5x more powerful, but unknown factors are preventing them from knowing much else. That's what I got from reading it.


From what I can gather the 1.5x is still only a guess at this point. The GPU is custom and not exactly what anyone expected and essentially they need to reverse engineer it to figure out how good it really is.
#7darkjedilinkPosted 2/5/2013 6:45:25 AM
These guys are the same people that said that it wouldn't likely be as powerful as it is, solely based on the casing size.

They're just trying to back up their flawed claim. This is proven by them claiming the 6670 derivatives in the Nextbox and PS4 'aren't in the same league' as the Wii U chip.
---
Gaming is like a pair of boobs - Sony and Microsoft fight over whos boobs look more realistic, while Nintendo is about having fun with them - Walkiethrougie
#8FenderMasterPosted 2/5/2013 6:49:15 AM
darkjedilink posted...
These guys are the same people that said that it wouldn't likely be as powerful as it is, solely based on the casing size.

They're just trying to back up their flawed claim. This is proven by them claiming the 6670 derivatives in the Nextbox and PS4 'aren't in the same league' as the Wii U chip.


lol right on cue!
---
http://www.jvp-boston.org
#9godplaysSNESPosted 2/5/2013 6:57:57 AM
darkjedilink posted...
These guys are the same people that said that it wouldn't likely be as powerful as it is, solely based on the casing size.

They're just trying to back up their flawed claim. This is proven by them claiming the 6670 derivatives in the Nextbox and PS4 'aren't in the same league' as the Wii U chip.


Except that all points to the Nextbox and PS4 using AMD's latest architecture, GCN
---
Super Mario Kart is the single best Mario Kart ever!
#10PigfartsPosted 2/5/2013 7:00:55 AM
Try hard. Even Nintendo said the sales were lower than expected, which, unless you are totally blind and deaf, you can take to mean poor.
---
If you don't like the smell of pigfarts, stay out of the pigpen.