For people who can't comprehend the differences between Ram
Check out my PC, 360, PS3, movie, and old-school game videos http://www.youtube.com/user/EpicConspiracy?feature=mhee
XBOXvsPC posted..."PS4 has 256 32MB blocks running at 176 GB/s." - that's not how it works kid.
Sorry man, but it really falls on deaf ears. All the Sony Trolls are busy stroking themselves to this topic right now, like sheep.
Official Voice of the Xbox One Community
XBOXvsPC posted..."PS4 has 256 32MB blocks running at 176 GB/s." - that's not how it works kid.
I don't know what you know about sheep...but hooves make for terrible hand jobs.
Instead of spewing insults, why not prove TC wrong? All I see here from the defensive side is "lol no... stupid TC."
I don't conform to social convention
The eSRAM garbage is just MS being cheap.
They'd have made it better for developers and gamers alike if they had simply left it out and gone with GDDR5 instead. Now developers will have a tougher learning curve with the Xbone, while the console will still be weaker than the PS4 anyway.
Throwback Mode 4 Lyfe
Well, I'm sure you know more than the people who actually created the thing, so yes - we will defer to your superior knowledge that it is indeed a slow piece of ****, utterly inferior to the mighty PS4 in every conceivable way.
It's laughable to expect a poster on GF to know more about the console specs than the devs who actually work on them, let alone a fanboy. Thanks for the chuckle anyway.
The people who created the thing have little credibility because obviously they're not going to admit that their creation falls short of expectations. They're going to prop their device up and hype it. Basic PR.
Unless they give concrete specs and explanations that show why Xbone can compete with PS4, it's safe to take their vague "just wait and see" claims with a grain of salt.
Throwback Mode 4 Lyfe
I still can't get over the fact that people don't understand that the 32 MB is all that runs at that speed. The magical crop fairy apparently makes the rest speed up.
Favorite game to date - Xenogears \/-/-/-/\
If you believe used games are the industry's problem, then you, sir or ma'am, are a fool.
The eSRAM garbage is just MS being cheap.
This argument just doesn't make any sense. If anything has been demonstrated this next generation, it is that Microsoft has been anything but cheap. They have spent billions developing the console itself, hundreds of millions redesigning the Kinect, a billion (so far) on exclusives, and 8.64 billion on cloud computing, which is 90% of their annual R&D budget (of course that also goes to non-Azure things like Microsoft Office, but it's obvious that a vast majority of it is going straight to the 300,000 dedicated servers and the companies they recently bought up). They've likely spent hundreds of millions more on partnerships like the one with EA.
The notion that they decided to spend billions on cloud computing to streamline online games, but that going with GDDR5 was just too expensive is simply absurd.
Furthermore, the notion that cloud computing is just blast processing by Microsoft is equally absurd. They're not dropping billions in R&D just to use it as a PR selling point.
They purposely chose the eSRAM in conjunction with the DDR3 RAM because the experts who work for them see an advantage in it. They're not going to pinch pennies on something as trivial as RAM while investing billions into cloud computing unless they were absolutely sure that the RAM was perfectly sufficient.
Honestly, this whole tech-specs argument is a waste of time. The games are going to look and play virtually identically. If Microsoft wanted to have larger numbers than Sony, they could easily have done that; Microsoft could buy and sell Sony twice over.

Perhaps after this generation of console wars people will finally realize that we're not arguing about the number of bits anymore, and that tech specs alone have never had a decisive effect on the graphical quality of a game. Go take a look at comparisons of 360 and PS3 multiplats and you'll quickly notice that the graphical distinction between them is nearly nonexistent. Travel back to the 80s and compare the NES with the Atari 7800 and you'll see the same non-debate playing out; the Atari 7800 even used a processor from the same 6502 family as the NES.

The tech specs don't matter. What matters is getting talented people working on your game on the solid console you've built. Both the PS4 and Xbox One are solid consoles. The question becomes whether they can go above and beyond simple token efforts.
"I refuse to prove that I exist" says God. "For proof denies faith, and without faith I am nothing..."
You are making the opposite mistake of the people who want to simply add the eSRAM speed to the main RAM's. You are forgetting utilization. Let's take an extreme case and say there was a very simple game that only needed 32 MB to run. In that case everything could be contained in the eSRAM, and the eSRAM speed would equal the running speed of the system. However, as you increase the complexity and quality of the game, you need to use more and more main memory in addition to the eSRAM. The more you do this, the lower the "effective" memory rate will go.
So you might be saying, "Ah ha! Well, I want a complex, high-quality game, so I'll be using all of the memory, so your numbers still stand." Well, no. You don't use all of memory for every frame. More than likely, most of a level is out of sight at any given time, so it is not needed at that time. So the actual rate will vary from game to game based on its design.
However, the benefit for the PS4 is that it will just work, without any special optimization needed to make sure that the amount of memory needed at any given time isn't so large that it brings the effective rate of memory down to the 68 GB/s of the main memory. That is probably why there was a rumor that an initial Call of Duty Ghosts port to the XB1 ran at 15 FPS while the PS4 port ran at 90 FPS. The XB1 has to be optimized so that it doesn't need too much main memory at any one time. That optimization might require anything from using lower-quality or fewer textures to lowering draw distances. So it is possible that the memory limitation of the XB1 could impact game quality.
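The "effective rate" idea above can be sketched with simple arithmetic. Here's a minimal Python sketch, assuming the 68 GB/s DDR3 figure from this thread and a ~204 GB/s eSRAM figure (an assumed number; reported eSRAM bandwidth figures vary), and treating traffic as a strictly serial split between the two pools. This is a toy model, not how a real memory controller behaves:

```python
def effective_bandwidth(esram_fraction, esram_gbps=204.0, ddr3_gbps=68.0):
    """Weighted-harmonic-mean estimate of blended memory bandwidth.

    esram_fraction: share of per-frame traffic served by the 32 MB eSRAM.
    Assumes traffic is perfectly split and strictly serial (toy model).
    """
    return 1.0 / (esram_fraction / esram_gbps + (1 - esram_fraction) / ddr3_gbps)

# A tiny game fitting entirely in eSRAM sees the full eSRAM rate:
print(round(effective_bandwidth(1.0), 1))   # 204.0
# A game pulling most of its data from DDR3 is dominated by the slower pool:
print(round(effective_bandwidth(0.25), 1))  # 81.6
```

Note how quickly the blend falls toward the slower pool: even with 25% of traffic hitting eSRAM, the effective rate is only about 81.6 GB/s, which matches the point above that bigger games drag the effective rate down toward the 68 GB/s of main memory.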
MS's Azure cloud computing system was not made for the XB1. It is a general-purpose system that already exists and that MS is repurposing for the XB1. MS is clearly overhyping the idea, because all it amounts to is dedicated servers. I was playing on Battlefield 1942 dedicated servers way back in 2002, so the concept isn't new.
MS made a compromise in choosing eSRAM. They needed the low latency for the Kinect lookup database, so they went with the small but fast eSRAM along with the larger and cheaper DDR3. MS also needed a guaranteed large amount of RAM to do their fast app switching and couldn't gamble on GDDR5 prices dropping enough to allow 8 GB. You'll notice that neither of those reasons relates to traditional gaming. You simply have to accept that MS sacrificed some of the XB1's performance so that it could be more of a multimedia device.
Excellent post TC. These XBots don't realize that MS chose a tiny amount of super-fast RAM to run Kinect, not for gaming.
Math > trolls.
PS3 / Vita / 2600K@4.7Ghz water cooled by Corsair H80, Dual GTX580s in SLi, 8 Gigs Corsair RAM, 120Hz screen. Vsync? I don't need no Vsync! PSN: Liquidpain
oh wth! no response?
Yup, totally agree: math is better. I would have loved for someone else to repeat and explain; I actually did hope for an answer. I saw simple arithmetic and attempted to duplicate the results; I'm just like that, I guess, checking for myself. But seriously, trolls on either side: it's simple arithmetic. I tried a couple of different ways to get similar numbers, so I'm apparently using different information in my math. Anyone could have pulled up the calculator for a second. The reason I did is that 255% total memory is significant, so I checked for myself. I also agree it might mean nothing real-world, but no matter what, you can't take billions per second for a minute and get a million of anything, so I just wanted to know where I went wrong. And again, it's simple arithmetic; it's easy to prove him wrong if he is, so throw some numbers up. Show how the total system is leveraged to nearly balance out.
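For anyone who wants to redo the arithmetic the post above asks for, here is one way to lay it out in Python. The 176 GB/s and 68 GB/s figures are the ones quoted in this thread; the 60 FPS frame budget is an assumed example, and this is back-of-the-envelope arithmetic, not a real workload model:

```python
GB = 1000 ** 3  # decimal gigabytes, as bandwidth specs use
MB = 1000 ** 2

ps4_bps = 176 * GB     # GDDR5 bandwidth quoted in the thread
xb1_bps = 68 * GB      # DDR3 bandwidth quoted in the thread
esram_bytes = 32 * MB  # eSRAM capacity
fps = 60               # assumed frame-rate budget

# Bytes each pool can move in a single 60 FPS frame:
ps4_per_frame = ps4_bps / fps  # about 2.93 GB per frame
xb1_per_frame = xb1_bps / fps  # about 1.13 GB per frame

# How many times per second a 32 MB block could be fully re-read
# at 176 GB/s (pure arithmetic, ignoring latency and contention):
refills_per_sec = ps4_bps / esram_bytes
print(round(refills_per_sec))  # 5500
```

Numbers like these are why "billions per second" claims are easy to sanity-check on a calculator: the per-frame and per-second totals fall out of straight division.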