Is the GPU on XB1 really that bad?

#61 Spetsnaz420 Posted 4/23/2014 12:21:50 AM
DojoMax posted...
Garage_Man posted...
pigboy posted...
it is.

BBQMoosehead posted...
I hear people keep saying that the XB1 has a weak GPU and no dedicated video RAM. Is the hardware really that bad?


It is.

The Xbone GPU is roughly 50% weaker than the PS4's in raw power, going by the pure "numbers". On top of that, the Xbone also has to reserve 3-4% of that already weaker GPU for Kinect. Big M had to cut the reserve down from 10% after complaints from many developers, because the Xbone is losing ground fast.
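(If you want to check that "pure numbers" claim yourself, the figure usually quoted is peak shader throughput. A rough sketch below, using the commonly reported compute-unit counts and clock speeds; the 10% Kinect/OS reserve is the figure mentioned above, and the exact gap it prints is approximate:)

def tflops(compute_units, clock_ghz):
    # Peak single-precision throughput: 64 shaders per CU, 2 ops per clock.
    return compute_units * 64 * 2 * clock_ghz / 1000

ps4 = tflops(18, 0.800)        # ~1.84 TFLOPS
xb1 = tflops(12, 0.853)        # ~1.31 TFLOPS
print(round(ps4, 2), round(xb1, 2))
print(round(xb1 * 0.90, 2))    # ~1.18 TFLOPS left under a 10% reserve
print(round(ps4 / xb1, 2))     # PS4 has roughly 1.4x the raw throughput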

Another huge issue is that the PS4's memory, GDDR5, is much faster and easier to use than the Xbone's slow DDR3 plus 32MB eSRAM buffer architecture, which makes game development much more complicated.

That's why almost all multi-platform games look so much better on PS4 than on the Xbone.


I think it's gonna be really interesting to see what the PS4 can do once they unlock the power of GDDR5. It's still VERY new for a processor to use GDDR over standard DDR. AMD says there are huge gains to be had... I mean, look at HSA and Mantle and what they can do... We could see the PS4 trounce the competition... then again, maybe not.


HAHAHAAHA


Are you laughing because what he said is absurd? Or because it mirrors the kind of reassurance this board is rife with?
---
Searching for my long lost twin, is this you?
http://www.gamefaqs.com/boards/user.php?=2426382
#62 AxleHeadX Posted 4/23/2014 7:14:17 AM
Ultimately the problem is that Microsoft cheaped out on both RAM and GPU. The PS4 is using GDDR5 while the Xbox One is using DDR3; PC GPUs have been using GDDR5 for years now. The Xbox One will eventually push better graphics once developers get the hang of it, but the PS4 will always have the advantage power-wise. All I can say is enjoy the console you own, and if you're lucky enough to own both, enjoy both. They are video games; surely you can find something more constructive to bash and debate.
#63 pigboy Posted 4/24/2014 1:00:29 AM
SoulTrapper posted...
The biggest issue with the xbox isn't the GPU, it's the eSRAM.

The main bottleneck, and the one that can actually be worked around, is the eSRAM design.
All 3D hardware except the PS1 and the Nintendo DS needs at least three "bitmaps" stored in memory (they're actually a lot like a Windows bitmap, just without the header).
Those are the back buffer, the front buffer, and the Z-buffer.

The back buffer is where the GPU draws the current frame, the front buffer is a copy of the last back buffer that is fed to the circuitry that sends the picture to the screen, and the Z-buffer is a map of the depth of every pixel, used to cheaply sort triangles, allow geometry intersection, etc.

In the case of the Xbone, at least the back buffer and the Z-buffer must be stored in the eSRAM, because the regular memory is just too slow to fill that role effectively. The problem is that a 1080p HDR back buffer requires about 16 MB of memory, plus the 8 MB used by the Z-buffer, and suddenly you only have 8 MB of fast memory left for everything else, which is very tight when you're using things like 4096x4096 textures and millions of polygons.

The easiest cop-out is of course dropping the resolution to leave more eSRAM space for the game, but you can also do things like rendering one half of the screen and then the other, skipping HDR, going Z-bufferless, etc.
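(A quick way to sanity-check those numbers; this sketch assumes an FP16 RGBA back buffer and a 32-bit depth buffer, which is what the 16 MB and 8 MB figures above imply, and it also shows what dropping to 900p buys back:)

MB = 1024 * 1024

def buffer_mb(width, height, bytes_per_pixel):
    # Size of one render target in megabytes.
    return width * height * bytes_per_pixel / MB

for w, h in [(1920, 1080), (1600, 900)]:
    back = buffer_mb(w, h, 8)   # FP16 RGBA (HDR) back buffer
    z = buffer_mb(w, h, 4)      # 32-bit depth buffer
    left = 32 - back - z        # what remains of the 32 MB of eSRAM
    print(w, h, round(back, 1), round(z, 1), round(left, 1))

# 1080p: ~15.8 MB back buffer + ~7.9 MB Z-buffer, leaving ~8.3 MB
# 900p:  ~11.0 MB back buffer + ~5.5 MB Z-buffer, leaving ~15.5 MB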

Meanwhile on PS4, no juggling around is needed as all the memory is the fast memory.

They're based on the same underlying memory technology, though GDDR is designed for very high bandwidth at the expense of latency, whilst DDR3 aims for lower latency at the expense of bandwidth.

The thing is, DDR3 is not well suited to feeding the GPU with graphics assets and game data; that works best out of GDDR memory because of its high bandwidth.

To compensate for that, Microsoft used eSRAM, which offers the very high bandwidth that GDDR5 is known for. However, there's only 32MB of it, which simply isn't enough, and no amount of polished SDKs, better drivers, or lower-level access courtesy of Microsoft is going to fix the fundamental fact that what is physically on the chip isn't enough to match what the PS4 has.

The Xbox One, even with its weaker GPU, might have been able to compete with the PS4 and render games at 1080p IF the APU had 48 or even 64MB of eSRAM.

But it doesn't, and it never will, which will be its Achilles' heel as the generation wears on.


There's one other MAJOR problem with the XB1.

The Data Move Engines that MS incorporated into the APU, which sit between the DDR3 and the eSRAM.

Whenever data needs to be transferred from DDR3 to eSRAM, it has to go through these move engines, which eats up 8GB/s of bandwidth.

So the DDR3 in the XB1, which was sitting at 68GB/s, is suddenly doing 60GB/s, and the 172GB/s you were getting out of the eSRAM is now 164GB/s.
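(The bookkeeping, using the figures quoted above; the 8GB/s move-engine cost per pool is the assumption being illustrated, and the PS4's commonly quoted 176GB/s is included for comparison:)

# Effective bandwidth once the Data Move Engines take their cut.
XB1_DDR3 = 68     # GB/s peak, main memory
XB1_ESRAM = 172   # GB/s, figure quoted in the post above
DME_COST = 8      # GB/s assumed lost to shuffling data between pools
PS4_GDDR5 = 176   # GB/s, single unified pool, nothing to shuffle

print("XB1 DDR3 :", XB1_DDR3 - DME_COST, "GB/s")    # 60
print("XB1 eSRAM:", XB1_ESRAM - DME_COST, "GB/s")   # 164
print("PS4 GDDR5:", PS4_GDDR5, "GB/s")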

That basically creates latency on every data transfer between the two memory pools, and when you're rendering stuff with a lot of physics and player input, that latency is an absolute nightmare.

The PS4 doesn't have these latency issues because there's only ONE memory pool, not two, and furthermore, the PS4 doesn't need to rely on extra circuitry to move data around.

The data move engines are brilliant from the perspective of a set-top box that has to decrypt loads of media, interface with cable TV, and feed the Kinect camera.

But it's a colossal waste for gaming.


Nice!

I was going to respond to some of the incorrect concepts some xbots posted, but your post explains the problem better than I could.
#64 chubbychaser Posted 4/25/2014 9:26:40 PM
pigboy posted...
Nice!

I was going to respond to some of the incorrect concepts some xbots posted, but your post explains the problem better than I could.

---
aaaHAA!! X800 XT PE
#65 Cowboy082288 Posted 4/25/2014 9:52:48 PM
shamfuru posted...
DojoMax posted...
Garage_Man posted...
I think it's gonna be really interesting to see what the PS4 can do once they unlock the power of GDDR5. It's still VERY new for a processor to use GDDR over standard DDR. AMD says there are huge gains to be had... I mean, look at HSA and Mantle and what they can do... We could see the PS4 trounce the competition... then again, maybe not.


HAHAHAAHA


I know right?

I have had "gaming grade," 5-core RAM in my PC for years. This isn't some arcane mystery unearthed from the fires of Mordor.


LOL, when you read a post like Garage_Man's you realize just how little some of the posters here know. Also, whoever said "dedicated memory"... uh, no. These consoles have one pool of RAM for everything.

Look, the PS4 is about equivalent to a $150 video card and the X1 to a $100 card. Sure, over the years they will really push these consoles to the max and get more out of them than the equivalent video cards would manage. But don't kid yourself, they will never close the gap with even a $1,000 PC that can be built today.

Both of these consoles will be stuck at 1080p for the next 8 years. It may not matter much now, but in 4 years the tech in these consoles is going to start to look pretty outdated. This is the weakest console gen ever; their CPUs are already crap.

That being said, I think anyone who buys either console will get plenty of enjoyment out of it. They are pretty cool devices. People just need to be realistic about what they are buying.
---
PSN/XBL/Steam/iOS - cowboyoni
#66 SaltyBotz Posted 4/25/2014 10:23:57 PM
GPUs shmeePUs. Once MS gets the cloud up, it's going to be OVER for the PS4!! Then nobody will be able to say games on PS4 look better. The cloud is going to lead the gaming of the future and MS is already one step ahead of Sony.
#67 xcmon3yx2 Posted 4/26/2014 1:02:36 PM
jimm120 posted...
I think of it like PC video cards

7890
7870
are TOP tier cards (last year....)

Then a drop off to the
7850
.
7930

The 70 and 90 are the "real" cards that are actually powerful. The 50 card is an ok card but less than half the power of the "real cards". The 30 is just a bargain card.

The PS4 has a card like the 7950...still weakish but cheap and can run stuff.
The One has a card like the 7830...lower model than the PS4.


Moral of the story? Both cards suck. Just that we had to compromise for a slightly lower end card for Kinect.


The 7890 doesn't exist, and the 7870 was never a top-tier card; it was a budget/mid-tier card. Second, the Xbone doesn't even have a discrete GPU; it has a CPU/GPU hybrid called an APU.
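(For a rough sense of where the consoles actually land against those desktop cards, here's the usual shader-throughput math, using commonly reported shader counts and reference clocks; it's only approximate since memory bandwidth and real clocks differ:)

def tflops(shaders, clock_ghz):
    # Peak single-precision throughput: shaders * clock * 2 ops per cycle.
    return shaders * clock_ghz * 2 / 1000

print("XB1 GPU :", round(tflops(768, 0.853), 2))   # ~1.31
print("PS4 GPU :", round(tflops(1152, 0.800), 2))  # ~1.84
print("HD 7790 :", round(tflops(896, 1.000), 2))   # ~1.79
print("HD 7850 :", round(tflops(1024, 0.860), 2))  # ~1.76
print("HD 7870 :", round(tflops(1280, 1.000), 2))  # ~2.56
print("HD 7950 :", round(tflops(1792, 0.800), 2))  # ~2.87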
---
http://www.youtube.com/user/xcmon3yx777,(3DS FC: 5257-9927-9011), (Steam: xcmon3yx2), (XBL: HakudoshiV77360), (WoW: xcmon3yx2#1204)
#68 MichelleRhee Posted 4/26/2014 1:10:05 PM
Garage_Man posted...
I think it's gonna be really interesting to see what the PS4 can do once they unlock the power of GDDR5. It's still VERY new for a processor to use GDDR over standard DDR. AMD says there are huge gains to be had... I mean, look at HSA and Mantle and what they can do... We could see the PS4 trounce the competition... then again, maybe not.
#69 AenimaGenesis Posted 4/26/2014 1:10:49 PM
jimm120 posted...
Moral of the story? Both cards suck. Just that we had to compromise for a slightly lower end card for Kinect.


Sounds to me like you're trying to marginalise the difference between the two and lump them into one pigeonhole.

Yes, the PS4's GPU is no GTX 780 Ti, but it's far and away more powerful than the Xbox One's.

If the PS4's GPU "sucks", then "sucks" isn't strong enough for the Xbox One's; that would make the Xbox One's a shambolic abomination of a joke.
#70 chubbychaser Posted 4/27/2014 3:20:17 PM
SaltyBotz posted...
GPUs shmeePUs. Once MS gets the cloud up, it's going to be OVER for the PS4!! Then nobody will be able to say games on PS4 look better. The cloud is going to lead the gaming of the future and MS is already one step ahead of Sony.


LoL. The delusion is strong in this one.
---
aaaHAA!! X800 XT PE