Digital Foundry: Wii U

#61ShinChuckPosted 2/5/2013 9:42:24 AM
Eoin posted...
Enigma149 posted...
What exactly does GCN stand for here?

Since you already got one nonsense answer, here's the correct one: it stands for "Graphics Core Next", the brand name for AMD's new architecture.

http://www.amd.com/uk/products/technologies/gcn/Pages/gcn-architecture.aspx


Psh, my answer was as spot on as you'll get. Wikipedia backs me up, and Wikipedia never lies. ...right? Right?!
---
GamerTag: ShinChuck, PSN: ShinChuck1
http://homepage.mac.com/shinchuck - Video games, YouTube, Deviantart, Etcetera.
#62kissdadookiePosted 2/5/2013 9:52:24 AM(edited)
PENDRAG0ON posted...
aether17 posted...
I'm sorry, but these idiots are simply taking NeoGAF's speculation and passing it off as fact, despite not completely understanding what everything in there does. 22 logical blocks are unaccounted for, and they draw a conclusion? Ridiculous.


Yeah, Digital Foundry lost a lot of credibility in my eyes thanks to this. About all I'll trust them for going forward is the framerate comparisons.


You people are clueless, aren't you? Those 22 unaccounted-for logical blocks are not going to make a significant difference in GPU performance. The major parts of the GPU have already been identified, and it's a match for a Radeon 4xxx-series GPU, except it's a slightly gimped iteration of the 4xxx series.

Why don't you people go and research the makeup of GPUs? If you had bothered to, you would understand why the 22 unaccounted-for logical blocks really aren't going to change the fact that this is pretty much a Radeon 4xxx-series part.

Essentially, the argument here and the comments about how DF doesn't know their stuff are just stupid excuses and apologism for Nintendo bringing out a next-gen console with basically PS360-generation specs. You people are just going to keep claiming that sites like DF and all the Wii U analysis articles are BS, and that anything negative about the Wii U must be complete BS, unless Nintendo themselves publish a spec sheet (and seriously, that's never going to happen; what company is stupid enough to publicly announce its own deficiencies?).
#63FenderMasterPosted 2/5/2013 9:49:45 AM
DaLagga posted...
ElectricKaibutu posted...
Wow. Today was the day that someone said "I like playing games for fun" on a forum for games and then got angrily ranted at.


How was that an angry rant? It's been explained countless times why processing power matters, yet people still try to dismiss it. What good are games if your system is too weak to run them? Where's the fun in that? Especially when most of the best console games this past generation were multi-platform titles, which the Wii obviously couldn't handle.


This. Fanboys love to fall back on the old "who cares about graphics, it's all about the games" line, which completely misses the point. If the hardware is incapable of running the multiformat games produced over the next ten years, that means a whole lot less fun for Wii U-only gamers, or for anyone who bought a Wii U expecting to play next-gen third-party games.
---
http://www.jvp-boston.org
#64PendragoonPosted 2/5/2013 9:58:51 AM
kissdadookie posted...
PENDRAG0ON posted...
aether17 posted...
I'm sorry, but these idiots are simply taking NeoGAF's speculation and passing it off as fact, despite not completely understanding what everything in there does. 22 logical blocks are unaccounted for, and they draw a conclusion? Ridiculous.


Yeah, Digital Foundry lost a lot of credibility in my eyes thanks to this. About all I'll trust them for going forward is the framerate comparisons.


You people are clueless, aren't you? Those 22 unaccounted-for logical blocks are not going to make a significant difference in GPU performance. The major parts of the GPU have already been identified, and it's a match for a Radeon 4xxx-series GPU, except it's a slightly gimped iteration of the 4xxx series.

Why don't you people go and research the makeup of GPUs? If you had bothered to, you would understand why the 22 unaccounted-for logical blocks really aren't going to change the fact that this is pretty much a Radeon 4xxx-series part.

Essentially, the argument here and the comments about how DF doesn't know their stuff are just stupid excuses and apologism for Nintendo bringing out a next-gen console with basically PS360-generation specs. You people are just going to keep claiming that sites like DF and all the Wii U analysis articles are BS, and that anything negative about the Wii U must be complete BS, unless Nintendo themselves publish a spec sheet (and seriously, that's never going to happen; what company is stupid enough to publicly announce its own deficiencies?).


/facepalm

Overreact much? NeoGAF is also not happy with DF, because DF took their base findings, which were incomplete, and ran with them. That's why people aren't happy with DF right now; what little DF tried to add on their own was actually incorrect. The NeoGAF users doing the breakdown that DF ripped off have stated that comparisons to any off-the-shelf GPU are pointless.

Right now all we can do is wait until the analysis is complete; until then, we are still in the dark.
---
Know Japanese? Post your advice in the topic below!
http://gamefaqs.com/boards/316-gamefaqs-world-japan/63714709
#65kissdadookiePosted 2/5/2013 10:00:48 AM
FenderMaster posted...
DaLagga posted...
ElectricKaibutu posted...
Wow. Today was the day that someone said "I like playing games for fun" on a forum for games and then got angrily ranted at.


How was that an angry rant? It's been explained countless times why processing power matters, yet people still try to dismiss it. What good are games if your system is too weak to run them? Where's the fun in that? Especially when most of the best console games this past generation were multi-platform titles, which the Wii obviously couldn't handle.


This. Fanboys love to fall back on the old "who cares about graphics, it's all about the games" line, which completely misses the point. If the hardware is incapable of running the multiformat games produced over the next ten years, that means a whole lot less fun for Wii U-only gamers, or for anyone who bought a Wii U expecting to play next-gen third-party games.


What I don't understand is why these fanboys b!tch about multi-plats and then also like to run around claiming that the Nintendo systems could handle those games and that Nintendo simply doesn't want them. It's dumb. Let's face it: we buy Nintendo platforms for their exclusives, simple as that. If the exclusives are good, the platform is good. But it's stupid to make ridiculous claims about the hardware's capabilities, especially when they're false. Let's look at the hilariously bad misinformation fanboys have been spitting out:

1) Wii U is at least 2x more powerful than PS360 (it's not, at all; it's on par).

2) The Wii U has GPGPU capabilities that are worth a damn (it doesn't: it's a Radeon 4xxx part, and GPGPU on that series was crude at best; it wasn't until the 6xxx series that GPGPU functionality became an actual beneficial asset).

3) That the Wii U is 1.5x more powerful than the 360, based on the DF article (HILARIOUS; people clearly are incapable of reading. The Wii U has theoretical RAW SHADER performance roughly 1.5x that of the 360's GPU, but that does NOT make it a system 1.5x more powerful than the 360).

4) That the MCM is an SoC (HILARIOUS; the MCM implementation on the Wii U is essentially Nintendo placing the main processors onto their own daughter card, which is FAR FAR FAR from combining all your processors into an SoC).

5) That the GPU in the Wii U was at least a 5xxx-series part (clearly not; it's been proven not to be, and I've been saying it's a 4xxx part for months now on here).

Face it, people: we (myself included) bought Nintendo systems to play Nintendo exclusives (like the new Fire Emblem, which is AMAZING). But let's not fool ourselves, and then try to fool others, into believing that the hardware in the Wii U can ever keep up with the actual next-gen Sony and MS boxes.
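For what it's worth, the "1.5x raw shader" figure in point 3 comes from simple peak-FLOPS arithmetic. A sketch using the ALU counts and clocks that were circulating on forums at the time (all speculative, not confirmed by Nintendo or Microsoft):

```python
# Peak shader throughput in GFLOPS: ALUs x 2 ops per cycle (multiply-add) x clock (GHz).
# The ALU counts and clocks below are forum speculation, not confirmed specs.
def peak_gflops(alus, clock_ghz):
    return alus * 2 * clock_ghz

xenos = peak_gflops(240, 0.5)   # Xbox 360 "Xenos" estimate: 240 GFLOPS
latte = peak_gflops(320, 0.55)  # Wii U "Latte", speculative 320-ALU figure: ~352 GFLOPS
print(latte / xenos)            # ~1.47, the source of the "1.5x" claim
```

As the post says, this is raw shader throughput only; it says nothing about CPU, bandwidth, or whole-system performance.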
#66ElectricMolePosted 2/5/2013 10:01:14 AM
Enigma149 posted...
the GCN hardware in Durango and Orbis is in a completely different league


What exactly does GCN stand for here?


http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx

28nm; the Radeon 7000 series is built on it.

Not much of a leap from what's in the Wii U, IMO.
---
NinNetID: ElectricMole, PSN: kronekodow7188
#67aether17Posted 2/5/2013 10:01:29 AM
kissdadookie posted...


You people are clueless aren't you? Those 22 logical unaccounted for blocks is not going to make a dent of a difference in GPU performance in any significant way. The major parts of the GPU has been identified already and it's a match for a series 4xxx Radeon GPU except it's a slightly gimped iteration of the series 4xxx.

Why don't you people go and research the makeup of GPUs? If you had bothered to research this, you will understand how the 22 logical blocks that aren't accounted for really is not going to change the fact that this is pretty much a Radeon series 4xxx part.


Clueless? No. I've seen the 4870 die shot many times, and the Latte structure is pretty different from it, hence it's customized. Major parts have been identified, but is that all for sure? We are speculating and assuming that all of the R700 configurations still apply, and we can't even be sure of that, which is why the posters on NeoGAF clearly state that it's "speculation". All of this holds only IF it wasn't configured in a different manner. I'm not a fantroll; I don't expect PS4/720 performance, but taking speculation and passing it off as "fact" (like DF has done) is ridiculous.

Again, I DO NOT expect PS4/720 performance, and I never did.
---
According to someone, I am a well known Troll.........not sure how.
#68silverbulltPosted 2/5/2013 10:04:42 AM
Good, we finally all agree that power defines a generation. I thought you fanboys were never gonna give up.
---
XBOX360 - It only does everything.
Nintendo FIX the Wii U FFS: http://youtu.be/gl5_qODv-gQ
#69kissdadookiePosted 2/5/2013 10:06:31 AM
PENDRAG0ON posted...
kissdadookie posted...
PENDRAG0ON posted...
aether17 posted...
I'm sorry, but these idiots are simply taking NeoGAF's speculation and passing it off as fact, despite not completely understanding what everything in there does. 22 logical blocks are unaccounted for, and they draw a conclusion? Ridiculous.


Yeah, Digital Foundry lost a lot of credibility in my eyes thanks to this. About all I'll trust them for going forward is the framerate comparisons.


You people are clueless, aren't you? Those 22 unaccounted-for logical blocks are not going to make a significant difference in GPU performance. The major parts of the GPU have already been identified, and it's a match for a Radeon 4xxx-series GPU, except it's a slightly gimped iteration of the 4xxx series.

Why don't you people go and research the makeup of GPUs? If you had bothered to, you would understand why the 22 unaccounted-for logical blocks really aren't going to change the fact that this is pretty much a Radeon 4xxx-series part.

Essentially, the argument here and the comments about how DF doesn't know their stuff are just stupid excuses and apologism for Nintendo bringing out a next-gen console with basically PS360-generation specs. You people are just going to keep claiming that sites like DF and all the Wii U analysis articles are BS, and that anything negative about the Wii U must be complete BS, unless Nintendo themselves publish a spec sheet (and seriously, that's never going to happen; what company is stupid enough to publicly announce its own deficiencies?).


/facepalm

Overreact much? NeoGAF is also not happy with DF, because DF took their base findings, which were incomplete, and ran with them. That's why people aren't happy with DF right now; what little DF tried to add on their own was actually incorrect. The NeoGAF users doing the breakdown that DF ripped off have stated that comparisons to any off-the-shelf GPU are pointless.

Right now all we can do is wait until the analysis is complete; until then, we are still in the dark.


NeoGAF, if you haven't figured it out yet, is TERRIBLE at tech analysis. Remember their claims about the 3DS GPU? HILARIOUSLY wrong. PROVEN wrong. Yet fanboys are STILL running with the VERY wrong analysis that originated from NeoGAF.

Here's the thing about NeoGAF: most of them take basic principles learned from the internet and then pile a lot of assumptions on top, basically fabricating complete nonsense masquerading as fact. Take, for instance, the Chipworks breakdown of the Wii U GPU. Yes, there's a lot of extra silicon there that is unaccounted for, but all we need to know is the main GPU. Chipworks outlined the main GPU components, giving us the complete picture that this is 100% a Radeon 4xxx GPU part. That's all we need to know; the extra silicon could do a hundred million amazing things, but it's not going to change the fact that GPU performance is STILL going to be at Radeon 4xxx level (with some features actually cut out on the Wii U GPU, making it slightly gimped).

There's a difference between wanting to know how everything works and wanting to know performance levels. For the purpose of knowing the level of performance, the DF article is spot on, and that's all DF set out to accomplish: establishing the Wii U's level of performance. Simple as that. The only question remaining now is whether the CPU is the bottleneck, or whether it's also the memory bandwidth.

Seems like you have facepalmed yourself there, buddy. So sad. So, so sad.
#70PendragoonPosted 2/5/2013 10:08:06 AM
Megagunstarman posted...
To keep this topic current, there's this from the GAF thread:

http://www.neogaf.com/forum/showpost.php?p=47337979&postcount=1082

Jim Morrison, Chipworks
Been reading some of the comments on your thread and have a few of my own to use as you wish.

1. This GPU is custom.
2. If it were based on an ATI/AMD or Radeon-like design, the chip would carry die marks to reflect that. Everybody has to recognize the licensing. It has none; only the Renesas name, which is a former unit of NEC.
3. This chip is fabricated in an advanced 40 nm CMOS process at TSMC and is not low tech.
4. For reference's sake, the Apple A6 is fabricated in a 32 nm CMOS process and is also designed from scratch. Its manufacturing cost, in volumes of 100k or more, is about $26-$30 a pop. Over 16 months it degrades to about $15 each.
a. The Wii U only represents something like 30M units per annum vs. the iPhone, which is more like 100M units per annum. Put things in perspective.
5. This Wii U GPU costs about $20-$40 more than that, making it a very expensive piece of kit. Combine that with the IBM CPU and the flash chip, all on the same package, and this whole thing is closer to $100 apiece when you add it all up.
6. The Wii U main processor package is a very impressive piece of hardware when all is said and done.

Trust me on this. It may not have water cooling and heat sinks the size of a brownie, but it's one slick piece of silicon. eDRAM is not cheap to make. That is why not everybody does it. 'Cause it's so damn expensive.


This needs to be posted here too.
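Taking the quote's figures at face value, the package-cost arithmetic stacks up roughly like this (a sketch only: the A6 baseline and the $20-$40 GPU premium are Chipworks' estimates, while the combined CPU-plus-flash figure is an illustrative guess, not something the quote states):

```python
# Rough cost arithmetic from the Chipworks comments above, as (low, high) ranges.
a6_cost = (26, 30)                             # Apple A6 at volumes of 100k+
gpu_cost = (a6_cost[0] + 20, a6_cost[1] + 40)  # GPU is "about $20-$40" more: $46-$70
cpu_flash_guess = 30                           # hypothetical IBM CPU + flash estimate
package = (gpu_cost[0] + cpu_flash_guess,
           gpu_cost[1] + cpu_flash_guess)
print(package)                                 # (76, 100): "closer to $100 a piece"
```

The upper bound lands at the quote's "closer to $100" figure; the guessed CPU/flash number only illustrates how the total was likely reached.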
---
Know Japanese? Post your advice in the topic below!
http://gamefaqs.com/boards/316-gamefaqs-world-japan/63714709