Digital Foundry: Wii U

#71 kissdadookie Posted 2/5/2013 10:12:40 AM
aether17 posted...
kissdadookie posted...


You people are clueless, aren't you? Those 22 unaccounted-for logical blocks are not going to make a dent of a difference in GPU performance in any significant way. The major parts of the GPU have already been identified, and they're a match for a Radeon 4xxx series GPU, except it's a slightly gimped iteration of the 4xxx series.

Why don't you people go and research the makeup of GPUs? If you had bothered to research this, you would understand why the 22 logical blocks that aren't accounted for really aren't going to change the fact that this is pretty much a Radeon 4xxx series part.


Clueless? No. I've seen the 4870 die shot many times; the Latte structure is pretty different from that, hence it's customized. Major parts have been identified, but is it all for sure? We are speculating and assuming that all of the R700 configurations still apply, and we can't even be sure of that, which is why the board on NeoGAF clearly states that it's "speculation." All of this holds only IF it wasn't configured in a different manner. I'm not a fantroll, and I don't expect PS4/720 performance, but taking speculation and passing it off as "fact" (like DF has done) is ridiculous.

Again, I DO NOT expect PS4/720 performance, and I never did.


Again, you also understand the laws of physics, right? Based on what was identified on the Wii U GPU and its actual dimensions, it's essentially a customized Radeon 4xxx GPU. Customized or not, it's not exceeding the performance of the Radeon 4xxx series. All we are gauging here is the performance level of the hardware; the details of how it all works really aren't all that important if all we want to know is the performance level of the machine. That's why all we ever needed to identify were the major components of the GPU. The Wii U can do a million and twenty-seven amazing things that weren't in the original Radeon 4xxx series, but the bottom line is that, graphics performance-wise, we are still looking at Radeon 4xxx series level of performance.

It's like the 3DS: the PICA200 GPU has a LOT of bells and whistles, but that still doesn't change the fact that it sits at a level not much above the PSP. It's maybe 1.5x the performance of the PSP with the added bells and whistles of some shader support, but the thing still lags behind even a PS2 (yet people claimed that the 3DS is more powerful than the Wii U; hilarious). Why is it that, with all the bells and whistles of the PICA200, the games are essentially just a tad below PS2 level at best? Because the BASE GPU is only capable of so much. No amount of bells and whistles and customization is going to get around the base GPU's performance.
#72 The_Hyphenator Posted 2/5/2013 10:14:22 AM
kissdadookie posted...
NeoGAF, if you haven't figured it out yet, is TERRIBLE at tech analysis. Remember their claims about the 3DS GPU? HILARIOUSLY wrong. PROVEN wrong. Yet fanboys are STILL running with the VERY wrong analysis that originated from NeoGAF.


Considering that DF apparently can't even keep track of which card they're claiming the Wii U's GPU is based on (they claim it's an RV770 GPU, then state that it's based on a Radeon HD 4670, which uses a completely different GPU), I think I'll be listening to NeoGAF over DF.

Besides, I've seen far more people trolling Nintendo cite NeoGAF as a source than people supporting them. It's hardly a haven for "fanboys."
#73 silverbullt Posted 2/5/2013 10:14:28 AM
PENDRAG0ON posted...
Megagunstarman posted...
To keep this topic current, there's this from the GAF thread:

http://www.neogaf.com/forum/showpost.php?p=47337979&postcount=1082

Jim Morrison, Chipworks
Been reading some of the comments on your thread and have a few of my own to use as you wish.

1. This GPU is custom.
2. If it was based on ATI/AMD or a Radeon-like design, the chip would carry die marks to reflect that. Everybody has to recognize the licensing. It has none, only the Renesas name, which is a former unit of NEC.
3. This chip is fabricated in a 40 nm advanced CMOS process at TSMC and is not low tech.
4. For reference's sake, the Apple A6 is fabricated in a 32 nm CMOS process and is also designed from scratch. Its manufacturing cost, in volumes of 100k or more, is about $26-$30 a pop. Over 16 months it degrades to about $15 each.
a. The Wii U only represents something like 30M units per annum vs. the iPhone, which is more like 100M units per annum. Put things in perspective.
5. This Wii U GPU costs about $20-$40 more than that, making it a very expensive piece of kit. Combine that with the IBM CPU and the flash chip, all on the same package, and this whole thing is closer to $100 apiece when you add it all up.
6. The Wii U main processor package is a very impressive piece of hardware when it's all said and done.

Trust me on this. It may not have water cooling and heat sinks the size of a brownie, but it's one slick piece of silicon. eDRAM is not cheap to make. That is why not everybody does it: because it's so damn expensive.


This needs to be posted here too.


~$150 (or less) for a logic board IS CHEAP for this type of electronics. You get what you pay for, though. It's a highly customized piece of silicon, and that has its ups and downs, third-party support being one of the downs given next-gen development philosophies.
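
To put the quoted cost figures in concrete terms, here is a rough back-of-the-envelope sketch (Python) that does nothing but arithmetic on the numbers from the Chipworks comment above; the implied CPU-plus-flash remainder is inferred from those quoted ranges, not a published figure.

    # Rough cost sketch using only the figures quoted in the Chipworks comment above.
    # The A6 reference price and the "+$20-$40" premium come from that quote; the
    # CPU/flash remainder is simply what's left over, not a known number.

    A6_COST = (26, 30)        # quoted A6 manufacturing cost at 100k+ volume, USD
    GPU_PREMIUM = (20, 40)    # quoted premium of the Wii U GPU over the A6, USD
    PACKAGE_TOTAL = 100       # quoted rough total for the whole MCM package, USD

    gpu_low = A6_COST[0] + GPU_PREMIUM[0]    # ~$46
    gpu_high = A6_COST[1] + GPU_PREMIUM[1]   # ~$70

    print(f"Estimated GPU cost: ${gpu_low}-${gpu_high}")
    print(f"Implied CPU + flash remainder: ${PACKAGE_TOTAL - gpu_high}-${PACKAGE_TOTAL - gpu_low}")

On those numbers the GPU alone lands somewhere around $46-$70, which is consistent with the "closer to $100" figure once the CPU and flash are added to the package.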
---
XBOX360 - It only does everything.
Nintendo FIX the Wii U FFS: http://youtu.be/gl5_qODv-gQ
#74 kissdadookie Posted 2/5/2013 10:21:41 AM
The_Hyphenator posted...
kissdadookie posted...
NeoGAF, if you haven't figured it out yet, is TERRIBLE at tech analysis. Remember their claims about the 3DS GPU? HILARIOUSLY wrong. PROVEN wrong. Yet fanboys are STILL running with the VERY wrong analysis that originated from NeoGAF.


Considering that DF apparently can't even keep track of which card they're claiming the Wii U's GPU is based on (they claim it's an RV770 GPU, then state that it's based on a Radeon HD 4670, which uses a completely different GPU), I think I'll be listening to NeoGAF over DF.

Besides, I've seen far more people trolling Nintendo cite NeoGAF as a source than people supporting them. It's hardly a haven for "fanboys."


Actually, it doesn't make a difference between the RV770 and the RV700. Why? Because from a performance standpoint, they are pretty much a match. The only difference is that one is fabricated using the 45nm process, but performance-wise they are essentially the same parts.

It's like how the PS3 and 360 both started using the 45nm process for their processors; it didn't change the processors, just the size of their packaging.

So no, DF was not wrong at all or throwing out inaccurate information. DF's analysis is always about performance; that's it. So if they use RV770 and RV700 interchangeably, it makes NO difference in the context they're using them in.
#75 aether17 Posted 2/5/2013 10:23:58 AM
kissdadookie posted...


Again, you also understand the laws of physics, right? Based on what was identified on the Wii U GPU and its actual dimensions, it's essentially a customized Radeon 4xxx GPU. Customized or not, it's not exceeding the performance of the Radeon 4xxx series. All we are gauging here is the performance level of the hardware; the details of how it all works really aren't all that important if all we want to know is the performance level of the machine. That's why all we ever needed to identify were the major components of the GPU. The Wii U can do a million and twenty-seven amazing things that weren't in the original Radeon 4xxx series, but the bottom line is that, graphics performance-wise, we are still looking at Radeon 4xxx series level of performance.

It's like the 3DS: the PICA200 GPU has a LOT of bells and whistles, but that still doesn't change the fact that it sits at a level not much above the PSP. It's maybe 1.5x the performance of the PSP with the added bells and whistles of some shader support, but the thing still lags behind even a PS2 (yet people claimed that the 3DS is more powerful than the Wii U; hilarious). Why is it that, with all the bells and whistles of the PICA200, the games are essentially just a tad below PS2 level at best? Because the BASE GPU is only capable of so much. No amount of bells and whistles and customization is going to get around the base GPU's performance.


I don't know what the laws of physics have to do with this (TDP?). Anyway, they added several components newer than whatever the 4xxx series was capable of (newer than DX10.1); if it were simply a mid-range 4xxx GPU, it wouldn't have those extras. DF is assuming 4670 performance based on assumptions about shader count, etc. It could be more, for all we know, if the core configurations weren't changed.
---
According to someone, I am a well known Troll.........not sure how.
#76 The_Hyphenator Posted 2/5/2013 10:29:29 AM
kissdadookie posted...
The_Hyphenator posted...
kissdadookie posted...
NeoGAF, if you haven't figured it out yet, is TERRIBLE at tech analysis. Remember their claims about the 3DS GPU? HILARIOUSLY wrong. PROVEN wrong. Yet fanboys are STILL running with the VERY wrong analysis that originated from NeoGAF.


Considering that DF apparently can't even keep track of which card they're claiming the Wii U's GPU is based on (they claim it's an RV770 GPU, then state that it's based on a Radeon HD 4670, which uses a completely different GPU), I think I'll be listening to NeoGAF over DF.

Besides, I've seen far more people trolling Nintendo cite NeoGAF as a source than people supporting them. It's hardly a haven for "fanboys."


Actually, it doesn't make a difference between the RV770 and the RV700. Why? Because from a performance standpoint, they are pretty much a match. The only difference is that one is fabricated using the 45nm process, but performance-wise they are essentially the same parts.

It's like how the PS3 and 360 both started using the 45nm process for their processors; it didn't change the processors, just the size of their packaging.

So no, DF was not wrong at all or throwing out inaccurate information. DF's analysis is always about performance; that's it. So if they use RV770 and RV700 interchangeably, it makes NO difference in the context they're using them in.


Um, yeah, no. The RV770 is a MUCH better chip:

http://www.gpureview.com/ati-rv770-chip-151.html
http://www.gpureview.com/ati-rv730-chip-156.html

Oh, and FYI, both the 770 and the 730 were manufactured using a 55nm process, not a 45nm process and certainly not the 40nm that has been confirmed for the Wii U for a while now. So you're wrong on that point, too.
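
For anyone who wants to see the gap in raw numbers rather than spec-sheet links, here is a quick sketch using the public reference-board figures for the HD 4870 (RV770) and HD 4670 (RV730); the peak-FLOPS formula (shaders x 2 ops per clock x core clock) is only a rough theoretical measure and says nothing about the Wii U itself.

    # Peak single-precision throughput = stream processors x 2 FLOPs (MAD) x core clock.
    # Shader counts and clocks are the public reference-board figures for the
    # HD 4870 (RV770) and HD 4670 (RV730), not Wii U measurements.

    chips = {
        "RV770 (HD 4870)": {"shaders": 800, "clock_ghz": 0.750},
        "RV730 (HD 4670)": {"shaders": 320, "clock_ghz": 0.750},
    }

    for name, spec in chips.items():
        gflops = spec["shaders"] * 2 * spec["clock_ghz"]
        print(f"{name}: ~{gflops:.0f} GFLOPS peak")

That works out to roughly 1200 GFLOPS for the RV770 versus about 480 GFLOPS for the RV730, around a 2.5x gap in raw shader throughput before memory bandwidth even enters the picture.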
#77 icarus231 Posted 2/5/2013 10:41:53 AM
kissdadookie posted...
PENDRAG0ON posted...
kissdadookie posted...
PENDRAG0ON posted...
aether17 posted...
I'm sorry, but these idiots are simply taking the speculation from NeoGAF and passing it off as fact, despite not completely understanding what everything in there does. 22 logical blocks are unaccounted for, and they draw a conclusion? Ridiculous.


Yeah, Digital Foundry lost a lot of credibility in my eyes thanks to this. About all I will trust them for going forward is the frame-rate comparisons.


You people are clueless, aren't you? Those 22 unaccounted-for logical blocks are not going to make a dent of a difference in GPU performance in any significant way. The major parts of the GPU have already been identified, and they're a match for a Radeon 4xxx series GPU, except it's a slightly gimped iteration of the 4xxx series.

Why don't you people go and research the makeup of GPUs? If you had bothered to research this, you would understand why the 22 logical blocks that aren't accounted for really aren't going to change the fact that this is pretty much a Radeon 4xxx series part.

Essentially, the argument here and the comments about how DF doesn't know their stuff are just stupid excuses and apologizing for Nintendo bringing out a next-gen console with basically PS360-generation specs. You people are just going to keep going on and on about how sites like DF and all the analysis articles on the Wii U are BS, and you're just going to claim that anything negative about the Wii U must be complete BS unless Nintendo themselves puts out a public spec sheet (and seriously, that's never going to happen; what company is stupid enough to publicly announce its deficiencies?).


Overreact much? NeoGAF is also not happy with DF, because DF took their base findings, which were incomplete, and ran with them. This is why people aren't happy with them right now; what little they tried to add on their own was incorrect. The NeoGAF users doing the breakdown that DF ripped off have stated that comparisons to any off-the-shelf GPU are pointless.

Right now all we can do is wait until the analysis is complete; until then we are still in the dark.


NeoGAF, if you haven't figured it out yet, is TERRIBLE at tech analysis. Remember their claims about the 3DS GPU? HILARIOUSLY wrong. PROVEN wrong. Yet fanboys are STILL running with the VERY wrong analysis that originated from NeoGAF.

Here's the thing about NeoGAF: most of them take basic principles learned from the internet and then add a lot of assumptions to them, basically fabricating complete nonsense masquerading as fact. Take, for instance, the Chipworks breakdown of the Wii U GPU. Yes, there's a lot of extra silicon there that is unaccounted for, but all we need to know is the main GPU. Chipworks outlined the main GPU components, giving us the complete picture that this is 100% a Radeon 4xxx GPU part. That's all we need to know; the extra silicon could do a hundred million amazing things, but it's not going to change the fact that the GPU performance is STILL going to be Radeon 4xxx level (with some features actually cut out on the Wii U GPU, making it slightly gimped).

There's a difference between wanting to know how everything works and wanting to know performance levels. For the purpose of knowing the level of performance, the DF article is spot on, and that's all DF was out to accomplish: establish the level of performance of the Wii U. Simple as that. The only question remaining now is whether the CPU is the bottleneck or whether it's also the memory bandwidth.


It's nice to see all these expert computer engineers on here.
#78 Pendragoon Posted 2/5/2013 10:43:10 AM
Chipworks has stated that the GPU is 100% custom and is not based on any existing AMD design. It has no relation to the 4000 series. All this talk of RV770 vs. RV730 vs. RV700 is pretty much pointless.
---
Know Japanese? Post your advice in the topic below!
http://gamefaqs.com/boards/316-gamefaqs-world-japan/63714709
#79 GoombaX Posted 2/5/2013 10:46:20 AM
Icecreamdunwich posted...
From: darkjedilink | #040
The_Hyphenator posted...
darkjedilink posted...
I chose the second link because the 7670 is a rebadged 6670 with GDDR5. As you can see, the Wii U GPU's stock form just plain outclasses the GPUs in Durango and Orbis, regardless of GCN architecture.

Also, that first comparison link that you say is supposed to be a 6670 is actually a 6770.


Ah my bad. I meant to say 6770, since that's what I recalled hearing that Microsoft was supposed to be using. If it's the 6670, that's even worse.


Documents leaked in a court battle between AMD and nVidia, involving some industrial espionage, confirm that the PS4 is getting a 7670 mated to a quad-core AMD A8 APU (which is weird, since the only reason to pair a GPU with an APU is to tie them together, no AMD APU can tie with a 7670, and even if it could, DirectX would be required to do so), and that Durango is getting a 6670 with a quad-core IBM PowerPC derivative (basically, a quad-core version of the Wii U's CPU).

Nobody wants to believe it, though, because it puts those consoles right around the Wii U's power and capability level, meaning the Wii U won't be a repeat of the Wii power-wise.


Anyone want to screenshot this for future use? It's simply too funny not to.

Lol, you would be a fool to think it would be much more powerful, especially since Sony reps have said the PS4 won't be much more powerful than the PS3.
But whatevs, troll on, kid.
---
Pro tip : Thats not how I would do it
#80 The_Hyphenator Posted 2/5/2013 10:46:40 AM
PENDRAG0ON posted...
Chipworks has stated that the GPU is 100% custom and is not based on any existing AMD design. It has no relation to the 4000 series. All this talk of R770 vs R730 vs R700 is pretty much pointless.


Yeah, I know. It's just impressive how much DF screwed up their own analysis, even assuming that the GPU was an AMD chip.

Sadly, a lot more people are going to read that article than NeoGAF's breakdown. Which means we can expect to see a lot more ill-informed haters online in the near future...