Why did MS use DDR3 RAM?

#41 TonyKojima Posted 12/30/2013 2:42:53 PM
LOL A measly 32MB of ESRAM?? What is this, 2002?
---
Though the XBOX 360 is good in theory, its hardware limitations say otherwise - Hideo Kojima
PSN - Guncrazy56
#42 TBONE_OG Posted 12/30/2013 2:46:45 PM
Ellesarien posted...
Same %^&^...EVERY freaking day...


Laughed at this post, then closed the topic.
---
Always O.G.
#43 DerekLoffin Posted 12/30/2013 2:49:52 PM
The rumor is this:

Back when both consoles were in the design phase, GDDR5 was still only being produced in densities that made 2GB the reasonable ceiling. MS needed a lot more than that, so early on they went the DDR3 route, knowing they could get it in the quantities they wanted. Both console makers then designed their APUs around this choice, with MS adding the ESRAM, at the cost of part of the GPU, to mitigate the bandwidth loss of DDR3.

However, things went sideways when the chip makers rolled out GDDR5 chips not one but two sizes bigger, allowing Sony to switch to 8GB as well. By that point, MS was already locked into its plans (both chips have the memory controller built in, so changing it late in development isn't really an option), so they were stuck with DDR3. So essentially, Sony got lucky.
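A rough way to see what "two sizes bigger" means: with the chip count fixed by the bus layout, total capacity scales directly with per-chip density. Here is a minimal sketch, assuming a 16-chip arrangement like the one the PS4 is reported to use (the densities are illustrative):

```python
# Total memory = number of chips * per-chip density.
# With the chip count fixed by the bus layout, capacity only grows
# when denser GDDR5 chips become available.

def total_capacity_gb(num_chips: int, chip_density_gbit: int) -> float:
    return num_chips * chip_density_gbit / 8  # 8 Gbit = 1 GB

for density_gbit in (1, 2, 4):  # each step is one density "size" bigger
    print(f"{density_gbit} Gbit chips -> {total_capacity_gb(16, density_gbit)} GB")
# 1 Gbit -> 2.0 GB, 2 Gbit -> 4.0 GB, 4 Gbit -> 8.0 GB
```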

Now, a couple of things. Number one, DDR3 does NOT have lower latency. Its latency is lower when measured in clock cycles, but because GDDR5 is clocked higher, GDDR5 actually ends up with the lower absolute latency. Two, and this one I've yet to see confirmed, since GDDR5 is DDR3 under the hood (the main difference being the memory interface), it is possible that the PS4's CPU memory controller treats it as such, but again I have never seen that confirmed.
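To make the cycles-vs-nanoseconds point concrete, here is a minimal sketch of the conversion; the cycle counts and clocks below are illustrative assumptions, not the consoles' actual timings:

```python
# Latency quoted in clock cycles only becomes comparable between memory
# types once it is converted into nanoseconds at each part's clock speed.

def latency_ns(cycles: int, clock_mhz: float) -> float:
    """Absolute latency in nanoseconds for a given cycle count and clock."""
    return cycles / clock_mhz * 1_000

# Illustrative, assumed figures -- not official console timings:
print(latency_ns(14, 1066))   # DDR3-2133-class part, 14 cycles @ 1066 MHz -> ~13.1 ns
print(latency_ns(18, 1375))   # GDDR5-class part, 18 cycles @ 1375 MHz     -> ~13.1 ns
```

The lower cycle count looks better on paper, but the faster clock shrinks each cycle, which is the point being made above.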

As for the ESRAM, it is meant to mitigate that difference. Make no mistake, it is needed: DDR3 alone is just not fast enough to cover the bandwidth needs of the GPU, even a weaker one. It's debatable whether GDDR5 would be overkill for that GPU, but DDR3 on its own is definitely not enough. MS wouldn't have spent that much of the APU's transistor budget on ESRAM if it wasn't needed.
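For context on the size of the gap the ESRAM is papering over, here's a back-of-the-envelope sketch of peak main-memory bandwidth using the commonly cited bus widths and transfer rates (treat the results as approximations):

```python
# Peak bandwidth (GB/s) = transfer rate (MT/s) * bus width (bits) / 8 / 1000.

def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    return transfer_rate_mts * bus_width_bits / 8 / 1000

# Commonly cited figures for the two consoles' main memory pools:
print(peak_bandwidth_gbs(2133, 256))  # XBOne DDR3-2133, 256-bit bus   -> ~68 GB/s
print(peak_bandwidth_gbs(5500, 256))  # PS4 GDDR5 @ 5500 MT/s, 256-bit -> ~176 GB/s
# The 32MB ESRAM sits alongside the DDR3 pool to help close that gap for the GPU.
```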
---
I am power made flesh, feel how weak you truly are. --Akuma
#44 kissdadookie Posted 12/30/2013 2:53:59 PM
DerekLoffin posted...
The rumor is this:
*snip*


That's the rumour. The reality is more likely the following:

MS basically built a PC, whilst Sony aimed at building a gaming-specific system. If you look at how the XBOne is designed, you can easily tell it was put together much the way a traditional PC would be: instead of prioritizing everything for the video card, they prioritized the box to handle general tasks better. It's very likely that MS looked at the memory more as general system memory rather than looking at memory the way Sony did. What Sony made with the PS4 is basically a really nice video card with a computer hanging off of it.
#45 Maximoom Posted 12/30/2013 2:59:51 PM
ILikeGamesMan posted...
Sony(c)(TM)(r) - Home of the teraflops!*

* Games sold separately **

** No games available at this time.


Good one, should have used a small, barely visible font.
#46 Bellum_Sacrum Posted 12/30/2013 3:11:57 PM
It was an intelligent choice, like calling it the Xbox One, making Kinect mandatory, and that end-of-May reveal that was all about TV & sports partnerships.
---
"Now go ahead and leap ignorantly to the defense of wealthy game companies who don't know or care about you."
#47 Ryzeki Posted 12/30/2013 3:17:00 PM
krystyla posted...
ZenGamer64 posted...
Why not the more powerful DDR5 like Sony used in the PS4?


It's GDDR5, which is a version of DDR3, and has higher bandwidth but is worse at multitasking afaik


Not worse at all. It goes from about the same for small memory accesses to faster for huge I/O operations.

TC, they went with DDR3 because it's cheaper and provides a good compromise between performance and price.
---
Core i7 4700MQ | | 16GB DDR3L || 128GB SSD + 1TB || GTX780M OC
#48 2ndAtomisk Posted 12/30/2013 3:18:00 PM
So that ZenGamer64 would create this topic.
---
I'm gonna cut out your eyes and piss in the ****ing sockets! ~ Kaine
#49 WorldStarHH Posted 12/30/2013 3:19:18 PM
shootsmack posted...
DDR3 delivers the best possible television experience.
#50 SythisTaru Posted 12/30/2013 3:21:08 PM
WorldStarHH posted...
shootsmack posted...
DDR3 delivers the best possible television experience.


I'm using DDRG4 and it gives a smoother television signal, Sir.