8GB DDR3 + 32MB eSRAM > 8GB GDDR5

#101IemanderPosted 9/25/2013 8:56:18 AM(edited)
jubrany posted...
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

Goossens lays down the bottom line:

"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s," he says. "To get over that you need to have a mix of the reads and the writes but when you are going to look at the things that are typically in the ESRAM, such as your render targets and your depth buffers, intrinsically they have a lot of read-modified writes going on in the blends and the depth buffer updates. Those are the natural things to stick in the ESRAM and the natural things to take advantage of the concurrent read/writes."


Sounds to me like the average bandwidth of the 32MB of eSRAM will likely not exceed 176GB/s, and achieving anywhere near the theoretical peak of 204GB/s will require significant optimization (i.e. time). (Not to mention the combined average bandwidth of the DDR3 + eSRAM setup won't come close to 176GB/s, since the DDR3 is only 68GB/s.)
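As a toy illustration of those numbers (my own model, not Microsoft's): each direction caps at 109GB/s, and only a mix of reads and writes can climb toward the quoted 204GB/s peak.

```python
ESRAM_ONE_WAY = 109.0  # GB/s cap for pure reads or pure writes
ESRAM_PEAK = 204.0     # GB/s, Microsoft's quoted concurrent peak

def esram_bandwidth(read_fraction):
    """Toy model: the minority direction overlaps with the majority
    direction 'for free', but the combined figure caps at 204 GB/s
    (not 218) because real access patterns can't keep both ports
    busy every cycle."""
    write_fraction = 1.0 - read_fraction
    major = max(read_fraction, write_fraction)
    minor = min(read_fraction, write_fraction)
    combined = ESRAM_ONE_WAY * (1.0 + minor / major)
    return min(combined, ESRAM_PEAK)

print(esram_bandwidth(1.0))  # pure reads: 109.0
print(esram_bandwidth(0.5))  # balanced read-modify-write mix: 204.0
```

Which is exactly the architect's point: only workloads like blends and depth-buffer updates, with their natural read-modify-write mix, get anywhere near the peak.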


No matter how they spin it, it does not seem like a justifiable argument. I don't see PS4 costing much more in terms of electricity or manufacturing cost, but it certainly will be easier to develop for, while giving about the same (if not better) performance.

Microsoft didn't want to fork out the cash for GDDR5 and they didn't think Sony was going to either. If they had a time machine, you can bet your ass they would have gone with GDDR5 in the Xbox One.


Seriously man... don't get in on this argument. The Xbox One really is much, much slower. There is simply no way 32MB of RAM running at high speed can give the same performance as 8GB of high-speed RAM.

The biggest advantage of doing this is no advantage for us. It's more cost-effective for Microsoft, giving them a higher profit margin. That's it. If it weren't for that, they would've gone the same route as Sony, as it's simply much more powerful.

This is also just one part of the Xbox One's performance issues; it's also got a much slower GPU and, rumor has it, a much slower CPU.

Even if developers create a game specifically for the XOne and don't do any optimizations for the PS4 whatsoever, it's still going to run much faster on the PS4. The PS4 is all-round better in every possible specification; no XOne game can ever be made to run worse on the PS4 unless deliberate framerate caps are used or the installed drivers/OpenGL API are deliberately misused.
---
PC: i7-3820 | GTX 680 | 16GB DDR3 ---- Laptop: Dell Precision M6700 | i7-3740QM | Quadro K3000M | 16GB DDR3
360 | PS3 | WII | DS | 3DS | Vita
#102HalecticPosted 9/25/2013 1:45:51 PM
georgewduff1 posted...
Are people really talking about this again....

PS4 is going with one big pool of GDDR5 with 176GB/s of max bandwidth...with high latency...but GDDR5's ability to read/write simultaneously compensates to a certain degree...
GDDR5 is optimized for max data bandwidth, but it is not stable enough to be used as CPU RAM...
Gonna find out how Sony is going to deal with this...

XboxOne is going with standard DDR3 2133MHz...with a 68GB/s max bandwidth...latency-proof...CPU-proof...but not ideal for the GPU, where you want as much bandwidth as possible...So they backed it with eSRAM...4 pieces of 8MB super-fast RAM...it handles the small-footprint portion and leaves the big texture portion to the DDR3...but with the upcoming DX11.2...
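For what it's worth, the 68GB/s figure falls straight out of arithmetic (a sketch, assuming the Xbox One's 256-bit DDR3 memory bus):

```python
def ddr_peak_gb_s(transfer_rate_mt_s, bus_width_bits):
    """Peak bandwidth = transfers per second x bytes moved per transfer."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits // 8) / 1e9

# DDR3-2133 (2133 MT/s) on a 256-bit (32-byte) bus:
print(round(ddr_peak_gb_s(2133, 256), 1))  # 68.3 GB/s
```

That matches the 68GB/s quoted all over this thread, so at least that number isn't in dispute.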


Textures are the biggest RAM consumers by a large margin compared to everything else that needs to be stored in RAM.

With partially resident textures, or tiled resources as Microsoft likes to call them, the texture is split up into smaller tiles, allowing you to load only the tiles needed at a particular detail level. So for our stretch of road, it's not necessary to load all the detail you'd need at 1 foot away for the portion of the road that might be 50 feet from the player's camera.

Without going into all the technical details, the benefits of removing these limitations are impressive enough that they allow developers to store texture data that previously took up 3GB of RAM in only 16MB of RAM.

For the X1 this is particularly important, since using this technique the 32MB of eSRAM could theoretically store up to 6GB worth of tiled textures, going by those numbers. Couple the eSRAM's ultra-fast bandwidth with tiled texture streaming middleware like Granite, and the eSRAM just became orders of magnitude more important for your next-gen gaming.
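A rough sketch of the tiling idea (toy numbers; the 64KB tile size is the real D3D 11.2 figure, but `mip_for_distance` and the tile counts here are hypothetical, not anyone's engine):

```python
import math

TILE_BYTES = 64 * 1024  # Direct3D 11.2 tiled resources use 64 KB tiles

def mip_for_distance(distance_ft):
    """Toy heuristic: each doubling of distance halves the texture
    resolution needed, i.e. bumps the mip level up by one."""
    return max(0, int(math.log2(max(distance_ft, 1.0))))

def resident_tiles(tiles_at_mip0, distance_ft):
    """Each mip level has a quarter of the tiles of the level below it."""
    mip = mip_for_distance(distance_ft)
    return max(1, tiles_at_mip0 >> (2 * mip))

# The road example: the same 4096-tile (256 MB fully resident) surface
# needs only a few coarse-mip tiles when viewed from 50 feet away.
print(resident_tiles(4096, 1))   # 4096 tiles (256 MB) at 1 ft
print(resident_tiles(4096, 50))  # 4 tiles (256 KB) at 50 ft
```

The same shrink factor is what makes the "3GB down to 16MB" demo numbers plausible: residency tracks what's on screen, not what exists in the virtual texture.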




GDDR5 will not work for running windows 8, which the Xbox One uses. Since the PS4 does not use windows, it is much less of a problem.
---
Digimon World Dawn FC 000111545835
#103SonyPonyTonyPosted 9/25/2013 2:49:59 PM
XBOXvsPC posted...
MetroidFan9999 posted...
XBOXvsPC posted...
The design is similar to the 360. And when you look at its specs and what it can do, it's quite surprising.


Not really, when theoreticals are still involved. Why would you build a system these days with more "theoreticals" than necessary? And why would you say removing a theoretical component put "performance/power/etc" potential in an "uncomfortable place"?

How in God's name would GDDR5 over this design prove to be "uncomfortable" when it comes to performance? It is COMPLETELY backwards logic.


Sony hasn't announced their CPU speed yet...I wonder why


CPU speed, the actual number of servers they have...just like Sony wanting to stop reporting total console units sold last gen, to lessen the damage of how badly they were getting beaten by MS and Nintendo. When Sony knows they're inferior, they just go silent :(
#104XBOXvsPC(Topic Creator)Posted 9/25/2013 3:21:18 PM
SonyPonyTony posted...
XBOXvsPC posted...
MetroidFan9999 posted...
XBOXvsPC posted...
The design is similar to the 360. And when you look at its specs and what it can do, it's quite surprising.


Not really, when theoreticals are still involved. Why would you build a system these days with more "theoreticals" than necessary? And why would you say removing a theoretical component put "performance/power/etc" potential in an "uncomfortable place"?

How in God's name would GDDR5 over this design prove to be "uncomfortable" when it comes to performance? It is COMPLETELY backwards logic.


Sony hasn't announced their CPU speed yet...I wonder why


CPU speed, the actual number of servers they have...just like Sony wanting to stop reporting total console units sold last gen, to lessen the damage of how badly they were getting beaten by MS and Nintendo. When Sony knows they're inferior, they just go silent :(


Yep, MS already reversed their policies, so that's it. Sony doesn't have much to offer and nothing else to troll about. *enjoy the silence*
#10582xenoPosted 9/25/2013 3:40:30 PM
Halectic posted...
GDDR5 will not work for running windows 8, which the Xbox One uses. Since the PS4 does not use windows, it is much less of a problem.


.....Ummm......I think the only person who would claim this as the reason the X1 doesn't use GDDR5 is a Microsoft PR guy.

This is the most ridiculous thing I've heard in a while, lol.
---
Shwing
#10695_EclipsePosted 9/25/2013 7:13:14 PM
82xeno posted...
Halectic posted...
GDDR5 will not work for running windows 8, which the Xbox One uses. Since the PS4 does not use windows, it is much less of a problem.


.....Ummm......I think the only person who would claim this as the reason the X1 doesn't use GDDR5 is a Microsoft PR guy.

This is the most ridiculous thing I've heard in a while, lol.


Fanboys see other fanboys say things like that, and they are like "that sounds good" and they run with it.

That's why so many of them don't seem to understand that 32MB doesn't magically make the other 8000+MB run faster.
---
Favorite game to date - Xenogears \/-/-/-/\
If you believe used games are the industries problem, then you sir or ma'am are a fool.
#107assassin10133Posted 9/25/2013 7:44:47 PM
95_Eclipse posted...
82xeno posted...
Halectic posted...
GDDR5 will not work for running windows 8, which the Xbox One uses. Since the PS4 does not use windows, it is much less of a problem.


.....Ummm......I think the only person who would claim this as the reason the X1 doesn't use GDDR5 is a Microsoft PR guy.

This is the most ridiculous thing I've heard in a while, lol.


Fanboys see other fanboys say things like that, and they are like "that sounds good" and they run with it.

That's why so many of them don't seem to understand that 32MB doesn't magically make the other 8000+MB run faster.


No, the 32MB doesn't make the other 8000+MB run faster, but it doesn't take much common sense to realize that game code will never need all 8000+MB accessed in the same read/write cycle. If the data a game touches at any given moment fits in 32MB, then yes, it will make a difference, although I don't know how feasible or difficult that will be. But the PS4 setup is much more developer-friendly and makes it easier to use that bandwidth, and I imagine the performance results will be more consistent as well, due to the much simpler design.
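To put numbers on that working-set idea (my arithmetic, using standard 1080p buffer sizes, not anything Microsoft has published): the render targets a GPU hammers every frame are small enough to fit.

```python
def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mb(1920, 1080, 4)  # 32-bit RGBA back buffer
depth = buffer_mb(1920, 1080, 4)  # 32-bit depth/stencil buffer

# Both ~7.9 MB each: a 1080p color + depth working set fits in
# 32 MB of eSRAM with headroom, even though total RAM is 8 GB.
print(round(color + depth, 1))  # 15.8
```

Of course, the hard part is that everything else (extra render targets, G-buffers for deferred renderers) competes for the same 32MB, which is exactly the developer-effort problem being debated here.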
#108VanderZooPosted 9/25/2013 8:07:22 PM
The 8GB of unified GDDR5 is better for gaming. It's really just that simple lol.
---
Man of Steel - 26.06.13
"You will give the people of Earth an ideal to strive towards"
#109georgewduff1Posted 9/25/2013 8:18:14 PM
assassin10133 posted...
95_Eclipse posted...
82xeno posted...
Halectic posted...
GDDR5 will not work for running windows 8, which the Xbox One uses. Since the PS4 does not use windows, it is much less of a problem.


.....Ummm......I think the only person who would claim this as the reason the X1 doesn't use GDDR5 is a Microsoft PR guy.

This is the most ridiculous thing I've heard in a while, lol.


Fanboys see other fanboys say things like that, and they are like "that sounds good" and they run with it.

That's why so many of them don't seem to understand that 32MB doesn't magically make the other 8000+MB run faster.


No, the 32MB doesn't make the other 8000+MB run faster, but it doesn't take much common sense to realize that game code will never need all 8000+MB accessed in the same read/write cycle. If the data a game touches at any given moment fits in 32MB, then yes, it will make a difference, although I don't know how feasible or difficult that will be. But the PS4 setup is much more developer-friendly and makes it easier to use that bandwidth, and I imagine the performance results will be more consistent as well, due to the much simpler design.


We will see...when you're online in a PS4 multiplayer game and multitasking with the camera...running the OS in the background...and GPU-computing surround sound...

I clearly see an advantage in 8GB of CPU-friendly 2133MHz DDR3 with 32MB of eSRAM and 8GB of flash memory linked to the southbridge for speed...plus its own audio CPU...
---
XboxOne and my gaming addiction are perfect match :) GT: george w duff
#110Cowboy082288Posted 9/25/2013 10:16:06 PM
Great debate guys, real insightful stuff here. Everyone has represented their side well. Some people got maybe a little too emotional about it, but that's ok, I like enthusiasm.

All that being said, I think we can conclude. Ford is probably better than Chevy.
---
PSN/XBL/Steam - cowboyoni