Is the GPU on XB1 really that bad?

#41method115Posted 4/22/2014 8:43:20 AM
BoneRevolution posted...
BoneRevolution posted...
GPS are inaccurate, I still prefer a map


On a serious note and as far as I'm concerned, the Xbox One is actually the only system to have dedicated video ram: 32MB ESRAM (204 GB/s)


Yep, and all that's done for MS is give them a bunch of games running below 1080p.
---
PSN: method114
XBL: method115
#42BoneRevolutionPosted 4/22/2014 8:47:15 AM
method115 posted...
BoneRevolution posted...
BoneRevolution posted...
GPS are inaccurate, I still prefer a map


On a serious note and as far as I'm concerned, the Xbox One is actually the only system to have dedicated video ram: 32MB ESRAM (204 GB/s)


Yep, and all that's done for MS is give them a bunch of games running below 1080p.


Look on the bright side. Xbox One games usually look better than the competition, regardless of resolution.
#43EnclavePosted 4/22/2014 9:54:28 AM(edited)
xHughJasx posted...
Consoles are used so, so, so differently than PCs to make games. Look at the last generation: we have amazing-looking games that utilize less than HALF of ONE gig of RAM. The PS4 and Xbox One resolutions will even out as time goes on, without a doubt.


Yes, PCs need more RAM due to multitasking, but that doesn't change what I said. More and more games are being designed around 4-6 GB of RAM. The games you see with these kinds of requirements are generally open-world games.

Now, speaking about your dedicated GPU RAM for PCs? Again, my brother's GPU does not use GDDR5 dedicated to his card. Mine does. We can both run Tomb Raider on ultra, no problem. This GDDR5 BS gets so overblown. I'm not saying it isn't better. But once Xbox devs figure out how to best utilize the system, it won't be a problem. The Xbox One is leaps and bounds more powerful than the 360, so this 720p business won't last.


I'm not speaking of dedicated RAM for PCs; I'm pointing out your mistake in talking only about a PC's onboard RAM and ignoring the fact that video cards have dedicated memory that they also utilise. Now yes, a number of cards still use DDR3 for their dedicated memory; however, more and more cards use GDDR5 instead, as prices come down and it's just all-around superior to DDR3 for graphics processing.

The GDDR5 thing does indeed get overblown, but you'll note I didn't overblow it in my post, so I don't know why you're reacting this way. Additionally, you'd have a point about it not being a problem if those very same devs weren't also getting more and more used to working with the PS4 hardware and pushing more and more out of it. It's a fact that the Xbone is not as powerful as the PS4. Games designed for the PS4 will always have higher resolutions or higher frame rates than the Xbone versions; it's just the nature of how hardware works. You cannot optimise so much that the difference in power between the two systems vanishes.

Honestly, I don't know why I'm typing so much. I simply pointed out that you completely ignored a whole slew of memory that is in PCs and thus made an inherently flawed point. You then went on to attack my post, going on about things I never mentioned (likely in an attempt to cover up your mistake).

I know Sony boys don't like hearing that and I'm sorry. It's the truth.


Sorry to tell you, but I'm not a Sony fanboy. Funny how people always accuse me of being a fanboy of Sony or Microsoft or Nintendo or PCs when I point out that they're incorrect about something. So many idiots have this idea that if you're the least bit critical of something then you're obviously a fanboy of the competition. They do this, of course, to try to discredit the person they're "debating" without actually admitting they're wrong or addressing valid points. It's really a very dishonest way to carry yourself in a discussion.
---
The commercial says that Church isn't for perfect people, I guess that's why I'm an atheist.
#443HPPosted 4/22/2014 10:09:01 AM
People seem to think that only the Xbox One's games will look better with time.
#45Webmaster4531Posted 4/22/2014 11:07:40 AM
Enclave posted...
xHughJasx posted...
Consoles are used so, so, so differently than PCs to make games. Look at the last generation: we have amazing-looking games that utilize less than HALF of ONE gig of RAM. The PS4 and Xbox One resolutions will even out as time goes on, without a doubt.

Yes, PCs need more RAM due to multitasking, but that doesn't change what I said. More and more games are being designed around 4-6 GB of RAM. The games you see with these kinds of requirements are generally open-world games.

xHughJasx posted...
Now, speaking about your dedicated GPU RAM for PCs? Again, my brother's GPU does not use GDDR5 dedicated to his card. Mine does. We can both run Tomb Raider on ultra, no problem. This GDDR5 BS gets so overblown. I'm not saying it isn't better. But once Xbox devs figure out how to best utilize the system, it won't be a problem. The Xbox One is leaps and bounds more powerful than the 360, so this 720p business won't last.

I'm not speaking of dedicated RAM for PCs; I'm pointing out your mistake in talking only about a PC's onboard RAM and ignoring the fact that video cards have dedicated memory that they also utilise. Now yes, a number of cards still use DDR3 for their dedicated memory; however, more and more cards use GDDR5 instead, as prices come down and it's just all-around superior to DDR3 for graphics processing.

The GDDR5 thing does indeed get overblown, but you'll note I didn't overblow it in my post, so I don't know why you're reacting this way. Additionally, you'd have a point about it not being a problem if those very same devs weren't also getting more and more used to working with the PS4 hardware and pushing more and more out of it. It's a fact that the Xbone is not as powerful as the PS4. Games designed for the PS4 will always have higher resolutions or higher frame rates than the Xbone versions; it's just the nature of how hardware works. You cannot optimise so much that the difference in power between the two systems vanishes.

Honestly, I don't know why I'm typing so much. I simply pointed out that you completely ignored a whole slew of memory that is in PCs and thus made an inherently flawed point. You then went on to attack my post, going on about things I never mentioned (likely in an attempt to cover up your mistake).

xHughJasx posted...
I know Sony boys don't like hearing that and I'm sorry. It's the truth.

Sorry to tell you, but I'm not a Sony fanboy. Funny how people always accuse me of being a fanboy of Sony or Microsoft or Nintendo or PCs when I point out that they're incorrect about something. So many idiots have this idea that if you're the least bit critical of something then you're obviously a fanboy of the competition. They do this, of course, to try to discredit the person they're "debating" without actually admitting they're wrong or addressing valid points. It's really a very dishonest way to carry yourself in a discussion.


He also completely forgot how the old consoles were holding PC gaming back. Of course Tomb Raider plays fine on a graphics card with onboard DDR3.
---
Ad Hominem.
#46RyzekiPosted 4/22/2014 12:40:03 PM
richboy900 posted...
Yeah, it is. It's only like, what, a $100 GPU? It's a very low-spec card. Isn't it something like a 7870?

In contrast, the 360 had a cutting-edge GPU worth around $600 when it was new. PCs didn't get an equivalent until the following year. It was even more powerful than the one in the PS3.


Nah, it was a modified version: lower specs, but still pretty good. Especially castrated memory bandwidth, though, which is partly why it had so many low-resolution games.
---
Core i7 4700MQ | | 16GB DDR3L || 128GB SSD + 1TB || GTX780M OC
#4782xenoPosted 4/22/2014 12:42:58 PM
BoneRevolution posted...
Look on the bright side. Xbox One games usually looks better than the competition, regardless of resolution.


....hmmm, sounds legit.
---
Shwing
#48Talk2DaHandPosted 4/22/2014 12:47:04 PM
xHughJasx posted...
pigboy posted...
It is.

BBQMoosehead posted...
I hear people keep saying that XB1 has a weak GPS and no dedicated video ram. Is the hardware for it really that bad?


It is.

The Xbone GPU is about 50% weaker in raw power than the PS4's, going by the pure numbers. Plus, the Xbone also had to reserve 3-4% of GPU power for the Kinect on top of its already weaker GPU. Big M had to reduce that reserve from 10% after complaints from many developers, because the Xbone is losing ground fast.

Another huge issue is that the PS4's memory, GDDR5, is so much faster and easier to use than the Xbone's slow DDR3 memory + 32 MB ESRAM cache buffer architecture. It makes game development so much more complicated.

It's why almost all multi-console games look so much better on PS4 than on the Xbone.


Slow DDR3? Give me a break. Even the best-looking PC games out there only use 4 GB of DDR3.

The Xbox One is plenty powerful. Give devs some time and the games will look amazing. Just look at the difference between the 360's first games and its more recent ones. Everyone needs to stop being hardcore fanboys and look at the facts.

Stop comparing systems to other systems. My brother's PC is at least 100% worse than mine, but we can both run games on very high.


this soooo much
---
http://www.northernsun.com/images/thumb/2214.jpg
#49SoulTrapperPosted 4/22/2014 12:49:51 PM
The biggest issue with the Xbox isn't the GPU; it's the ESRAM.

The main bottleneck, and the one that can be worked around, is the ESRAM design.
All 3D hardware except the PS1 and the Nintendo DS needs at least three "bitmaps" stored in memory (they're actually a lot like a Windows bitmap, just without the header).
Those are the backbuffer, the frontbuffer, and the Z-buffer.

The backbuffer is where the GPU draws the current frame, the frontbuffer is a copy of the last backbuffer that is fed to the circuitry that sends the picture to the screen, and the Z-buffer is a map of the depth of every pixel, used to cheaply sort triangles, allow geometry intersection, etc.

In the case of the Xbone, at least the backbuffer and the Z-buffer must be stored in the ESRAM, because the regular memory is just too slow to fill that role effectively. And a 1080p HDR backbuffer requires about 16 MB of memory; add the 8 MB used by the Z-buffer, and suddenly you only have 8 MB of fast memory left for all other rendering purposes. That's quite short when you're using stuff like 4096x4096 textures and millions of polygons.
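A quick back-of-the-envelope check of those numbers (assuming FP16 RGBA at 8 bytes per pixel for the HDR backbuffer and a 32-bit Z-buffer; actual formats vary by engine):

```python
# Rough check of the buffer-size arithmetic above (assumed formats,
# not official figures for any particular game).
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

backbuffer = WIDTH * HEIGHT * 8   # 64-bit HDR color, 8 bytes/pixel
zbuffer    = WIDTH * HEIGHT * 4   # 32-bit depth, 4 bytes/pixel
esram      = 32 * MB              # total ESRAM on the Xbox One APU

print(f"backbuffer: {backbuffer / MB:.1f} MB")               # ~15.8 MB
print(f"zbuffer:    {zbuffer / MB:.1f} MB")                  # ~7.9 MB
print(f"left over:  {(esram - backbuffer - zbuffer) / MB:.1f} MB")  # ~8.3 MB
```

So the two mandatory buffers alone eat roughly 24 of the 32 MB, matching the "only 8 MB left" figure in the post.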

The easiest cop-out is of course setting a lower resolution to leave more ESRAM space for the game, but you can also do things like rendering half of the screen and then the other half, not using HDR, going Z-bufferless, etc.
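The same arithmetic shows how much ESRAM each resolution drop frees up (assuming an 8-byte-per-pixel HDR backbuffer and a 4-byte Z-buffer, matching the sizes quoted above):

```python
# How much ESRAM the two mandatory buffers take at common resolutions,
# and how much of the 32 MB is left for everything else (same assumed
# pixel formats as the sizes quoted above).
MB = 1024 * 1024

for name, (w, h) in {"1080p": (1920, 1080),
                     "900p":  (1600, 900),
                     "720p":  (1280, 720)}.items():
    buffers = w * h * (8 + 4)      # backbuffer + Z-buffer, bytes
    free = 32 * MB - buffers       # ESRAM left for other render work
    print(f"{name}: buffers {buffers / MB:5.1f} MB, free {free / MB:5.1f} MB")
```

Dropping from 1080p to 720p more than doubles the free ESRAM, which is exactly why sub-1080p resolutions are the path of least resistance.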

Meanwhile on the PS4, no juggling is needed, since all of its memory is the fast memory.

They're the same underlying memory technology, though GDDR is designed for very high bandwidth at the expense of latency, while DDR3 aims for low latency at the expense of bandwidth.

The thing is, DDR3 is not good for storing graphics assets and game data, which work best out of GDDR memory because of the high bandwidth.

To compensate, Microsoft used ESRAM, which offers the very high bandwidth that GDDR5 is known for. However, there's only 32 MB of it, which simply isn't enough, and no amount of polished SDKs, better drivers, or lower-level access courtesy of Microsoft is going to fix the fundamental fact that what is physically on the chip isn't enough to match what the PS4 has.

The Xbox One, even with its weaker GPU, might have been able to compete with the PS4 and render games at 1080p IF the APU had 48 or even 64 MB of ESRAM.

But it doesn't, and it never will, which will be its Achilles' heel as the generation goes on.


There's one other MAJOR problem with the XB1.

The Data Move Engines that MS incorporated into the APU, which sit between the DDR3 and the ESRAM.

Whenever data needs to be transferred from DDR3 to ESRAM, it has to go through these move engines, which eat up 8 GB/s of bandwidth.

So the DDR3 in the XB1, which was sitting at 68 GB/s, is suddenly doing 60 GB/s, and the 172 GB/s you were getting from the ESRAM is now 164 GB/s.
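Spelled out as arithmetic (these are the post's own figures, including the 8 GB/s move-engine overhead, not official specs):

```python
# Effective bandwidth after the Data Move Engine overhead, using the
# figures quoted in this post (not verified against official specs).
ddr3_peak  = 68    # GB/s, Xbox One DDR3 pool
esram_peak = 172   # GB/s, the post's real-world ESRAM figure
move_cost  = 8     # GB/s consumed by the Data Move Engines

print(ddr3_peak - move_cost)    # 60 GB/s effective DDR3
print(esram_peak - move_cost)   # 164 GB/s effective ESRAM
```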

This basically creates latency during data transfers between the two memory pools, and when you're rendering something with a lot of physics and player input, that latency is an absolute nightmare.

The PS4 doesn't have these latency issues because it has only ONE memory pool, not two; furthermore, it doesn't need to rely on additional circuitry to move data around.

The whole data move engine idea is brilliant from the perspective of a set-top box that has to decrypt loads of media, interface with cable, and feed the Kinect camera.

But it's a colossal waste for gaming.
#5082xenoPosted 4/22/2014 12:50:33 PM
Talk2DaHand posted...
xHughJasx posted...
pigboy posted...
It is.

BBQMoosehead posted...
I hear people keep saying that XB1 has a weak GPS and no dedicated video ram. Is the hardware for it really that bad?


It is.

The Xbone GPU is about 50% weaker in raw power than the PS4's, going by the pure numbers. Plus, the Xbone also had to reserve 3-4% of GPU power for the Kinect on top of its already weaker GPU. Big M had to reduce that reserve from 10% after complaints from many developers, because the Xbone is losing ground fast.

Another huge issue is that the PS4's memory, GDDR5, is so much faster and easier to use than the Xbone's slow DDR3 memory + 32 MB ESRAM cache buffer architecture. It makes game development so much more complicated.

It's why almost all multi-console games look so much better on PS4 than on the Xbone.


Slow DDR3? Give me a break. Even the best-looking PC games out there only use 4 GB of DDR3.

The Xbox One is plenty powerful. Give devs some time and the games will look amazing. Just look at the difference between the 360's first games and its more recent ones. Everyone needs to stop being hardcore fanboys and look at the facts.

Stop comparing systems to other systems. My brother's PC is at least 100% worse than mine, but we can both run games on very high.


this soooo much


not this, sooo much

DDR3 is fine for CPU tasks, when you're talking about computers and the like. It's why the RAM sticks in PCs are usually DDR3; they're pretty much CPU-dedicated.

Get to the GPU, though, and it requires faster access to memory for assets, as well as for deferred buffers.

So no, DDR3 alone is pretty bad for gaming these days, which is the entire reason MS shoved an ESRAM chip into the machine: to compensate for the slow bandwidth on the GPU end. Suffice to say, that ESRAM is the main bottleneck for the X1 due to its size.
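To get a feel for the kind of bandwidth those deferred buffers alone can demand, here's a rough, purely illustrative estimate (a hypothetical 4-target G-buffer at 8 bytes per pixel, written once and read once per frame at 1080p/60; real engines differ widely):

```python
# Illustrative estimate of G-buffer memory traffic for deferred rendering.
# All numbers here are assumptions for the sake of the example, not
# measurements from any real engine.
GB = 1000 ** 3
pixels = 1920 * 1080
gbuffer_bytes = pixels * 4 * 8     # 4 render targets, 8 bytes/pixel each
per_frame = gbuffer_bytes * 2      # one write pass + one read pass
per_second = per_frame * 60        # 60 frames per second

print(f"{per_second / GB:.1f} GB/s just for G-buffer traffic")  # ~8.0 GB/s
```

Even this one subsystem, under modest assumptions, chews through several GB/s, and it's only a fraction of total GPU traffic (textures, geometry, post-processing), which is why high-bandwidth memory matters so much on the GPU side.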
---
Shwing