Digital Foundry: Wii U

#211 aether17 Posted 2/5/2013 4:02:07 PM
jmichaelbp posted...
I've been looking at the NeoGAF threads, so it has more than just the 32MB of eDRAM? It has something like 2-4MB more, plus 1MB of SRAM?


It does, probably used to emulate the high-speed RAM in the Wii for Wii games. It doesn't seem like it COULDN'T be used for Wii U games, though.

jmichaelbp posted...


Because it's fun ignoring those things.


That's the thing: it has newer features than what the R700 series contains (I'm trying to find a link where Arkam of NeoGAF clearly stated that it had some features newer than Shader Model 4.0 and DX 10.1), so comparing it to Xenos based on SPUs, ROPs, etc. is irrelevant, because Latte would be more efficient in every way. It has a proper tessellation unit, something Xenos has to "fake" due to having a limited tessellator (if it has one at all), and it has compute shaders, something Xenos would take a performance hit on if it attempted them. It has a bunch of other features Xenos would need to "fake" as well, and all of this "faking" costs more power than doing it natively. Latte will do these things natively, and that has to be accounted for in the "multiplier". 1.5x is too low for those SPUs, ROPs, and the features we know are in the Wii U. The fact that DF ignored this proves just how idiotic they can be.

Dynheart posted...


I've been one of the guys trying to figure this thing out. I give up. Why? There are too many unknown variables when it comes to a custom card/chip, let alone a card/chip that's customized to the hilt like this one.

We can theorize all day and still be wrong. It could be weaker than we guessed. Or, like in the GameCube's case, stronger than we ever expected it to perform.

Unless Nintendo comes out with the specs, which they won't, we are left with more questions than answers from this GPU die photo.

Not to mention, it's been stated that the GPU may work, in an odd way, with the CPU. So more pics are on the way. Is the other half of the clues (it's not like we can figure out the GPU now, anyway) related to the CPU?

We have only started. I highly doubt DF figured it out in mere minutes/hours, just in time to write up an article on it.

I've tried to figure it out, but it's too customized to conclude anything based on off-the-shelf products.

The only way the GPU would be weaker would be if its SPU count was much lower than 160, because then the newer features it has wouldn't be able to bring it up to at least par. There have been absolutely no credible reports of Latte being weaker at all. If it were simply "on par", Trine 2 would not have been possible (it has improved graphics, newer PhysX, a native 720p resolution (PS360 use a dynamic resolution that only reaches 720p in less demanding areas), and it's streaming a 480p signal to the GamePad on top of that). It apparently doesn't have enough power "left over" to bring the framerate up to 60fps with the graphics they're throwing at it (which would require roughly 2-3x more power on top of the graphics enhancements). A GPU that was simply 1.5x Xenos would not be able to do all that, which is why I believe that 1.5x Xenos multiplier is too low.
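
Some quick napkin math on that resolution point (the sub-720p PS360 figure below is just a hypothetical example, not a measured number):

wiiu_frame = 1280 * 720          # 921,600 pixels, native 720p per the post
ps360_frame = 1024 * 576         # 589,824 pixels, a hypothetical dynamic-res dip on PS360
gamepad_frame = 854 * 480        # 409,920 pixels, the GamePad's screen resolution

print(round(wiiu_frame / ps360_frame, 2))                    # 1.56x the pixels per frame
print(round((wiiu_frame + gamepad_frame) / ps360_frame, 2))  # 2.26x if the GamePad view were a separate render
# Note: the GamePad image is encoded from the main render by dedicated hardware,
# so the second figure overstates the GPU-side cost; it's only here to bound the argument.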

It'd be nice if we could figure out exactly what the numbers are, though; it looks like 320:16:8 is just an educated guess for now.
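
For reference, this is the napkin math behind figures like that (the 320 ALUs are just the guess above and the ~550MHz clock is the reported figure, so treat both as assumptions, not confirmed specs):

shaders = 320                     # the 320:16:8 guess above
clock_hz = 550e6                  # reported GPU clock, assumed here
latte_gflops = shaders * 2 * clock_hz / 1e9    # 2 FLOPs per ALU per clock (MADD)
xenos_gflops = 240                             # commonly cited Xenos peak

print(latte_gflops)                            # 352.0
print(round(latte_gflops / xenos_gflops, 2))   # 1.47 -> where the "1.5x" figure comes from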

It's only been one day, and we have some speculated info that DF (and some trolls) are passing off as fact. This is going to take more than one day to figure out as much as we can. DF are idiots; at least GAF realizes that.

Though I think we finally figured out that the RAM isn't the bottleneck we thought it was, so looking into the GPU was very helpful in that regard.

And yeah, apparently Chipworks will be releasing CPU shots as well. That'll be interesting to see.
---
According to someone, I am a well known Troll.........not sure how.
#212 mini_blight Posted 2/5/2013 4:19:40 PM (edited)
squatch22 posted...
http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Despite its general release two months ago, Nintendo's Wii U console remained something of a technological mystery. We quickly gained a good idea of the make-up of the IBM tri-core CPU, but the system's apparent strengths are in its graphics hardware, and in that regard we had little or no idea of the composition of the Radeon core. Indeed, it's safe to say that we knew much more about the graphics processors in the next-generation Xbox and PlayStation. Until now.


But DF said the Wii U was identical to the PS360 back in November. So all this time they knew absolutely nothing, and all the anti-Nintendo virals were running with that article as proof the Wii U was weaker than PS360.

The best part is next--

However, the story of how these photos came into existence is a fascinating tale in itself: community forum NeoGAF noticed that Chipworks were selling Wii U reverse-engineering photography on its website, with shots of the principal silicon being offered at $200 a pop. Seeking to draw a line under the controversy surrounding the Nintendo hardware, a collection was started to buy the photos.

There was just one problem. The shots were simply higher quality versions of what had already been revealed on sites like Anandtech - good for getting an idea of the amount of silicon used and the make-up of the overall design, but without the ultra-magnification required to provide answers, and therefore no further use in unearthing the secrets of the Wii U hardware. At this point, Chipworks itself became aware of the community money-raising effort, and decided to help out by providing the required shot - for free. It's a remarkably generous gesture bearing in mind that the cost of carrying out this work is, as Chipworks' Rob Williamson told us, "non-trivial".


Free? lol no.

squatch22 posted...

While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.


No, you don't. All you know are rumored specs leaked by the same suspect source. That's all anyone who isn't under NDA knows.

The Orbis specs are fake, btw.
#213 fon1988 Posted 2/5/2013 4:41:34 PM
DaLagga posted...
fon1988 posted...
I like playing games for fun.


Really? Did you enjoy playing Mass Effect on your Wii? What about Fallout 3/NV? How about Skyrim? Or maybe Bioshock? Those sure were fun on the Wii, right? Oh wait, I forgot! The Wii was far too weak to handle those games.

Now, given the specs, it's all but confirmed that the Wii U will be in the same boat once the PS4/x720 launch. The Wii U will be far too weak to handle games designed for those systems, so if you only own a Wii U, you won't be able to play them. That sure sounds like fun, right? Now do you understand why processing power matters?


No, I enjoyed Mass Effect on my 360, as well as Fallout 3/NV, Skyrim, BioShock, Halo, Grand Theft Auto, DMC, and many more.

I also highly enjoyed many games on the PS3.

What I enjoyed more, though, was playing Twilight Princess, Skyward Sword, Mario Galaxy 1 and 2, Brawl, Kirby's Return to Dreamland, Donkey Kong Country Returns, No More Heroes 1 and 2, Super Paper Mario, Metroid Prime Trilogy, and many more on my Wii. A lack of processing power didn't make any of those games less fun than they could have been.

I enjoy playing games on the PS3, my 360, and my Wii/Wii U, and processing power has nothing to do with it.

So as I stated before, I like playing games for fun.
#214 darkjedilink Posted 2/5/2013 4:51:24 PM
Lefty128k posted...
icarus231 posted...
parkourboybryan posted...
So the GPU is approximately 1.5x more powerful, but unknown factors are preventing them from knowing much else. That's what I got from reading it.


From what I can gather, the 1.5x is still only a guess at this point. The GPU is custom and not exactly what anyone expected, and essentially they need to reverse-engineer it to figure out how good it really is.


Unless those custom components are pure genius and/or completely revolutionary, they aren't likely to make this GPU any more powerful than the current educated guesses say it is.


Current "educated" guesses put it a bit farther ahead than DF ever gave it credit for, and much farther ahead than what's in the article in question.
---
Gaming is like a pair of boobs - Sony and Microsoft fight over whos boobs look more realistic, while Nintendo is about having fun with them - Walkiethrougie
#215 Megagunstarman Posted 2/5/2013 4:55:10 PM
aether17 posted...
jmichaelbp posted...
I've been looking at the NeoGAF threads, so it has more than just the 32MB of eDRAM? It has something like 2-4MB more, plus 1MB of SRAM?


It does, probably used to emulate the high-speed RAM in the Wii for Wii games. It doesn't seem like it COULDN'T be used for Wii U games, though.


It is used in Wii U mode.

The 2MB MEM0/EFB (the framebuffer in Wii mode) is used as fast general-purpose RAM in Wii U mode. Dunno about the 1MB SRAM (the Wii texture cache).

Hm, actually, it seems to use at least 2.75MB of MEM0. They might be counting the texture-cache SRAM as part of MEM0 too.


twitter.com/marcan42/status/298922907420200961

twitter.com/marcan42/status/298929740063051776
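
Putting the numbers from this thread together, the on-die pools look roughly like this (sizes are what's been discussed here and in marcan's tweets, not official specs):

mem_pools_mb = {
    "MEM1 eDRAM": 32.0,          # the headline 32MB pool used in Wii U mode
    "MEM0 eDRAM": 2.0,           # Wii-mode framebuffer (EFB), reused as fast RAM in Wii U mode
    "SRAM (texture cache)": 1.0, # Wii-mode texture cache; possibly lumped in with MEM0
}
print(sum(mem_pools_mb.values()))   # 35.0 MB of on-die memory in total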
---
I'm not changing this until a new Jet Set Radio is announced. Started 10/13/11
#216 MrMisanthrope Posted 2/5/2013 5:06:40 PM
fon1988 posted...
DaLagga posted...
fon1988 posted...
I like playing games for fun.


Really? Did you enjoy playing Mass Effect on your Wii? What about Fallout 3/NV? How about Skyrim? Or maybe Bioshock? Those sure were fun on the Wii, right? Oh wait, I forgot! The Wii was far too weak to handle those games.

Now, given the specs, it's all but confirmed that the Wii U will be in the same boat once the PS4/x720 launch. The Wii U will be far too weak to handle games designed for those systems, so if you only own a Wii U, you won't be able to play them. That sure sounds like fun, right? Now do you understand why processing power matters?


No, I enjoyed Mass Effect on my 360, as well as Fallout 3/NV, Skyrim, BioShock, Halo, Grand Theft Auto, DMC, and many more.

I also highly enjoyed many games on the PS3.

What I enjoyed more, though, was playing Twilight Princess, Skyward Sword, Mario Galaxy 1 and 2, Brawl, Kirby's Return to Dreamland, Donkey Kong Country Returns, No More Heroes 1 and 2, Super Paper Mario, Metroid Prime Trilogy, and many more on my Wii. A lack of processing power didn't make any of those games less fun than they could have been.

I enjoy playing games on the PS3, my 360, and my Wii/Wii U, and processing power has nothing to do with it.

So as I stated before, I like playing games for fun.


I really don't understand why this isn't painfully obvious to everyone.
#217 mini_blight Posted 2/5/2013 5:30:15 PM
guttertalk posted...
If it's true that Nintendo provided very little documentation about the hardware, then it's little wonder that we've not seen better-looking and better-performing games on the Wii U: devs haven't been able to program to take full advantage of the hardware.

I expected the Wii U to be somewhere between the technology of the MS/Sony 7th and 8th generations. I'm honestly not sure where the Wii U sits now, because there are still some open questions about the Wii U and because I'm not accepting the 720/PS4 rumors until we have official specs.


Nintendo is secretive for good reason. It has nothing to do with third parties and everything to do with their direct competitors.
#218 aether17 Posted 2/5/2013 5:38:00 PM
mini_blight posted...
guttertalk posted...
If it's true that Nintendo provided very little documentation about the hardware, then it's little wonder that we've not seen better-looking and better-performing games on the Wii U: devs haven't been able to program to take full advantage of the hardware.

I expected the Wii U to be somewhere between the technology of the MS/Sony 7th and 8th generations. I'm honestly not sure where the Wii U sits now, because there are still some open questions about the Wii U and because I'm not accepting the 720/PS4 rumors until we have official specs.


Nintendo is secretive for good reason. It has nothing to do with third parties and everything to do with their direct competitors.


I'm not sure about Nintendo not even providing tech specs; I've heard some devs on NeoGAF talk about clock speeds and such, and even GFLOPS. Some who suggested a 600MHz clock were told "a little too high", and those who suggested 600 GFLOPS were likewise told "a little too high". The devs likely had specs, at least after the final dev kits, that is.
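
Same napkin math as my earlier post, just to show why those hints alone don't pin down the ALU count (the counts below are purely illustrative; the "too high" remarks are second-hand, nothing official):

def peak_gflops(alus, clock_mhz):
    # peak = ALUs x 2 FLOPs per clock (MADD) x clock
    return alus * 2 * clock_mhz / 1000.0

for alus in (160, 320, 480):
    print(alus, peak_gflops(alus, 550))   # 176.0, 352.0 and 528.0 at a 550MHz clock
# All three land under the "600 GFLOPS is a little too high" hint.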
---
According to someone, I am a well known Troll.........not sure how.
#219 DarkZV2Beta Posted 2/5/2013 6:39:31 PM
Enigma149 posted...
the GCN hardware in Durango and Orbis is in a completely different league


What exactly does GCN stand for here?


Graphics Core Next, which is AMD's name for their new, slightly less incompetent shader architecture.
Basically, they finally came out and admitted that regular Radeon cores are garbage, and that more robust GPU cores are a better idea.
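
Rough illustration of the difference (my own simplification, nothing from the article): the VLIW designs of the Radeon 4xxx-6xxx era only hit peak throughput when the compiler can pack several independent operations into each instruction bundle, while GCN issues one op per lane per cycle and doesn't rely on that:

def vliw5_utilization(independent_ops):
    # fraction of a 5-wide VLIW bundle doing useful work in one cycle
    return min(independent_ops, 5) / 5.0

print(vliw5_utilization(5))   # 1.0 -> ideal, fully packed bundle
print(vliw5_utilization(3))   # 0.6 -> more typical shader code; 40% of the ALUs sit idle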
---
AMD CACHING = NOT YET FINISHED
a high end card gets bg3 1080p maxed around 200fps ~The Q on BF3.
#220 ElectricKaibutu Posted 2/5/2013 6:46:13 PM
kissdadookie posted...
FenderMaster posted...
DaLagga posted...
ElectricKaibutu posted...
Wow. Today was the day that someone said "I like playing games for fun" on a forum for games and then got angrily ranted at.


How was that an angry rant? It's been explained countless times as to why processing power matters, yet people still try to dismiss it. What good are games if your system is too weak to run them? Where's the fun in that? Especially when most of the best console games this past generation were multi-platform titles which the Wii obviously couldn't handle.


This. Fanboys love to fall back on the old "who cares about graphics, it's all about the games", which completely misses the point. If the hardware is incapable of running any multiformat games produced in the next 10 years, then that means a whole lot less fun for Wii U-only gamers, or anyone who bought a Wii U expecting to play next-gen third-party games.


What I don't understand is why these fanboys b!tch about multi-plats and then also like to run around trying to make it out like the Nintendo systems can handle those games and it's just that Nintendo doesn't want them. It's dumb. Let's face it, we buy Nintendo platforms for their exclusives, simple as that. If the exclusives are good, the platform is good; however, it's stupid to claim ridiculous things about the hardware's capabilities, especially when they are false. Let's look at all the hilariously bad misinformation fanboys have been spitting out:

1) That the Wii U is at least 2x more powerful than the PS360 (it's not, at all; it's on par).

2) That the Wii U has GPGPU capabilities that are worth a damn (it doesn't; it's a Radeon 4xxx part, the GPGPU capabilities on that series were crude at best, and it wasn't until the 6xxx series that GPGPU functionality became an actual beneficial asset).

3) That the Wii U is 1.5x more powerful than the 360 based on the DF article (HILARIOUS; people clearly are incapable of reading. The Wii U has theoretical RAW SHADER performance roughly 1.5x that of the 360's GPU, but that does NOT make it a 1.5x more powerful system than the 360).

4) That the MCM is an SoC (HILARIOUS; the MCM implementation on the Wii U is essentially Nintendo placing the main processors onto their own daughtercard, which is FAR FAR FAR from combining all your processors into an SoC).

5) That the GPU in the Wii U was at least a 5xxx-series part (clearly not; it's been proven not to be, and I've been saying it's a 4xxx part for months now on here).

Face it, people: we (including myself) bought Nintendo systems to play Nintendo exclusives (like the new Fire Emblem, which is AMAZING), but let's not fool ourselves and then try to fool others into believing that the hardware in the Wii U can ever keep up with the actual next-gen Sony and MS boxes.


Hey guys, chill out. I agree with you. The Wii U isn't close to Orbis and Durango in power. I wish it were, but it's not.

I was just pointing out that the guy who only said "I like playing games for fun" didn't deserve to get blasted.