If it's true that the DX12 update will double the XB1's graphics and performance

#111SoulTrapperPosted 4/28/2014 7:25:54 AM
Sin_Angelus_ posted...

Cherry picking, a true sign of desperation. I guess you missed the part where he said "It’s especially helpful because the memory is readily available for any purpose and unit: the CPU, the GPU, textures, render targets, etc. It really smoothes out the optimization process."


I didn't miss it, it's just that they aren't using it for that.

That's the potential he thinks it could be used for, not what they're actually doing with it.

SoulTrapper posted...

Ok. So that makes your post on ESRAM Sony-fan BS. I like how instead of actually responding to what was written in the article you just write it off. Another sign of desperation.


It's MS themselves, so they aren't going to say their own design choice causes issues.

It also goes perfectly in line with my post on eSRAM.
His post explains what it is and does; my post explains its limitations.

But, by all means continue ignoring facts and cling to your articles that don't actually disagree with anything I posted.



Which is still just as credible, if not more, than your opinion. But again, nice try dismissing my sources with no real basis or counter-argument.


Again, that entire post earlier in this topic, to which you yet again failed to reply, is the counter argument to all of it.



And the articles I posted explain why that "problem" will diminish as devs get used to the architecture, like they did with the 360, and are able to utilize other tools like DX12 and the cloud to help balance things out. It doesn't really matter whether you think they will work or not; the proof says otherwise. Your opinion matters very little in the grand scheme of things, especially when people who are working on these consoles are saying pretty much the opposite. It's fine to share your opinion on here, but be honest with yourself at the very least: you're not a developer, and you can't see into the future.


Now try and read the "essay", you'll notice that getting used to the architecture is not a solution to the eSRAM being too small.

Please, don't bring your drivel about the cloud and DX12 into this, people will only think of you as a fool.

Nobody is saying the opposite of what my post said, not even the die hard fanboys here are disagreeing with it or posting any counter arguments.
And those guys have much, much more knowledge on the subject than you.


Nope, it's called citing multiple sources; you should try it sometime. It's how intelligent people argue.


Intelligent people have at least an idea of what they're arguing about, you clearly have no clue.
Citing sources, even though you're well aware there are just as many sources saying the opposite, isn't going to make facts go away.

But if you want sources, here you go:

http://www.edge-online.com/news/power-struggle-the-real-differences-between-ps4-and-xbox-one-performance/

“Xbox One is weaker and it’s a pain to use its ESRAM,” concluded one developer.

http://gearnuke.com/microsoft-surprised-xbox-ones-esram-backlash-calls-esram-evolution-xbox-360s-edram/

Microsoft’s hardware architecture team manager Nick Baker said that they were surprised by the backlash that started regarding Xbox One’s ESRAM and it being difficult to work with

http://attackofthefanboy.com/news/xbox-esram-preventing-1080p-games/

Part of the problem is that it’s just a little bit too small to output 1080p within that size.

http://www.redgamingtech.com/xbox-one-esram-720p-why-its-causing-a-resolution-bottleneck-analysis/
#112SoulTrapperPosted 4/28/2014 7:35:06 AM
Sin_Angelus_ posted...

You can continue to regurgitate that childishness all you want, you'll still be wrong. You posted your opinion on the ESRAM stuff- I posted proof from devs. That makes you wrong, and look all the more foolish by lashing out with name calling.


It's not an opinion, those are simple facts.

Go ahead and try to refute any of it.

And there's no "several people" here, unless you want to be grouped with the other shills here who just agree with anything negative-


The other topic had 3 different people pointing out exactly why you were wrong.

You remember the topic, right? The one where you mysteriously disappeared when I posted the stuff about the ESRAM?

In case you forgot, it's right here:

http://www.gamefaqs.com/boards/691088-xbox-one/68906623?page=15

there's just you, making petty insults and starting pointless debates on the board of a system you don't even own, anytime someone says something semi-positive. That's what makes it look like you're taking things personally, and that's why your opinion on matters like this will continue to look biased and be largely ignored. Every topic you enter is a confrontation. You don't come here for discussion, you come here to start arguments, like a petulant child with something to prove. Grow up, learn to argue without insulting, and maybe people will take you more seriously. In the meantime, I'm done with your redundant arguments and lack of sources. You can call it a cop out if you want, but your opinion is so insignificant to me at this point you'd be wasting your energy to type that.


Pointing out that it's idiotic to repeat the same stuff that has already been proven wrong is an insult?

The only reason I need to argue is because people like you can't accept the truth.
You just keep clinging on to buzzwords like "The Cloud" and "DX12", hoping that they'll magically improve the weaker hardware of the xbox one.

This isn't going to happen.
If you chose the Xbox One because you were expecting the best-looking games, you chose wrong.
MS even said this themselves months before release:

http://www.computerandvideogames.com/408068/xbox-one-does-not-target-the-highest-end-graphics-says-ms-engineer/
#113KOOGARPosted 4/28/2014 7:38:23 AM
No, it will triple the power!!!
---
Don't read this - it hurts your eyes.
#114macmahon187Posted 4/28/2014 8:15:59 AM
[This message was deleted at the request of the original poster]
#115N3xtG3nGam3rPosted 4/28/2014 9:34:23 PM
Viet0ne posted...
DX12, nVidia DX11 Optimized Drivers, and Mantle all do the same thing. Optimize CPU overhead to reduce CPU bottleneck and allow the GPU to run at a higher efficiency.

We already know the peak performance of both the GPU in the Xbox One and the PS4. The question then becomes how much of the performance from the GPUs is either console getting. You will never get 100% of the performance from the GPU due to the additional overhead associated with the hardware between the components and the software layer that interacts with the application. Even if you were to run a game that has its own software layer to interact with the hardware and bypass the OS, you still have the overhead from accessing the GPU from the PCI Express interface through the motherboard.

The only possible way to double the existing performance of the Xbox One is if the Xbox One is currently getting less than half the performance from the GPU. This would mean existing games are only able to achieve ~45% or lower of the theoretical performance of the Xbox One GPU due to the bottleneck. DX12 would then optimize away the bottleneck and double the performance achieved.

Based on the games already released, we already know Xbox One games are getting more than half the theoretical performance from the GPU, since they perform nearly identically to a similar-spec PC. Even if you factor in future console optimizations, that only accounts for a small improvement in performance.
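The quoted arithmetic can be sanity-checked in a couple of lines (the 45% and 90% figures are the poster's rough estimates, not measurements):

```python
# If a software update "doubles" delivered performance, the hardware must
# currently be running at no more than 50% of its theoretical peak,
# because the doubled figure cannot exceed 100%.
def max_current_utilization(claimed_speedup):
    """Highest current GPU utilization compatible with a claimed speedup."""
    return 1.0 / claimed_speedup

doubling_cap = max_current_utilization(2.0)  # 0.5 -> games would need to be
                                             # at <= 50% of peak today

# Conversely, if released games already hit ~90% of peak (roughly matching
# a similar-spec PC), the most any overhead reduction could add is ~11%.
current = 0.90
headroom = 1.0 / current - 1.0

print(f"doubling requires <= {doubling_cap:.0%} utilization today")
print(f"at {current:.0%} utilization, max possible speedup is ~{headroom:.0%}")
```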


Excellent post. Miss the days when we would discuss the PS3/360 back at their launch.

I agree pretty much with everything you're stating. The main thing that needs to be taken into consideration is how different the hardware inside the X1 is compared to the PS4. The unified memory and the GDDR5 being easy to work with are yielding immediate results in PS4 games. The Xbox One, not so much.

You have eSRAM, which is on the GPU and which MS themselves have said doesn't have to be (nor can it be) accessed by the CPU, meaning nothing needs to be run from the GPU to the eSRAM, to main RAM, to the CPU, and back. The eSRAM also allows read/write/copy simultaneously, which is a much bigger deal than people are letting on. There are four "move engines" that compress/decompress data and can do a few other things as well without the CPU needing to do a thing.

There are a few other things making the architecture more complex as well, I can't think of them off the top of my head, but you get the idea.

While most don't want to hear it, the cloud plays a role in the system's complexity as well.

With these complexities, I personally believe DX12 will help developers to use the hardware the way Microsoft designed it to, which should in simpler terms: "narrow the gap".

TL;DR - The design of the PS4's hardware is allowing its power to be accessed and harnessed fairly quickly, while the Xbox One (which has the capability to produce equal results, even this early on) and its more complex hardware is making power extraction more difficult.
---
ASUS p8h61-M (Rev 3.0) | Intel CORE i3 2100 | 8GB Dual-Channel DDr3 | 500GB HDD | 600w PSU | nVidia GTX 770 4GB GDDr5
#116PsyEdPosted 4/28/2014 10:04:49 PM
XB1 has a weaker GPU and slower RAM... and the OS itself is a mess.

PS4 is simple and to the point. I'm sure there will be a DX12 hybrid on PS4... which will also let devs get more out of the PS4 than they do now.

PS4, at the end of the day, will be the definitive platform for all multi-platform console games.
---
i7 3770k | GTX 780 Ti | 32GB DDR3 | 512GB Samsung SSD Pro | ASUS Sabertooth Z77
#117Sin_Angelus_Posted 4/28/2014 11:03:20 PM(edited)
TrueBlue91 posted...
Damn, Sin Angelus is a clueless tool.


I'm sorry if you're upset that I disagreed with your boyfriend, but that doesn't make me clueless.

Soul Trapper, I'm going to address a few things you said and then I truly am done. First of all, those links you're posting are from last year, which is pretty significant considering the whole point of this discussion is that devs will get better at utilizing ESRAM with time.

That article I posted from the Murdered: Soul Suspect dev is more significant than anything you've posted, since it's from a few weeks ago and he addresses exactly what you're claiming is a big issue for X1- he states they had no issues with ESRAM, and both games will run at 1080p. That's proof that a) devs are getting better with it, and b) the gap is already starting to close.

Lastly, stop referencing an old topic where the only people that agreed with you were well-known trolls who mostly just re-quoted and circle-jerked each other. I seriously doubt you want to be lumped in with that crowd.

N3xtG3nGam3r posted...
TL;DR - The design of the PS4s hardware is allowing it's power to be accessed and harnessed fairly quickly, while the Xbox (which has the capabilities to produce equal results as of this early on) and it's more complex hardware is making power extraction more difficult.


Why is this so hard for some people to accept? That the X1 just MIGHT improve as time goes on? Honestly, calm down, people; you can still enjoy your PS4 even if it loses some of its slight graphical advantage.
#118SoulTrapperPosted 4/29/2014 6:13:12 AM
Sin_Angelus_ posted...
TrueBlue91 posted...
Damn, Sin Angelus is a clueless tool.


I'm sorry if you're upset that I disagreed with your boyfriend, but that doesn't make me clueless.

Soul Trapper, I'm going to address a few things you said and then I truly am done. First of all, those links you're posting are from last year, which is pretty significant considering the whole point of this discussion is that devs will get better at utilizing ESRAM with time.

That article I posted from the Murdered: Soul Suspect dev is more significant than anything you've posted, since it's from a few weeks ago and he addresses exactly what you're claiming is a big issue for X1- he states they had no issues with ESRAM, and both games will run at 1080p. That's proof that a) devs are getting better with it, and b) the gap is already starting to close.


What makes you clueless is all the drivel about the cloud and the complete failure to understand that YOU CAN'T IMPROVE HARDWARE THROUGH SOFTWARE UPDATES.
You can push it further, but it will always be limited to what the hardware can do.


The ESRAM hasn't changed compared to last year.
You seem to not understand this: the ESRAM does not change. Devs can learn how to work with it, but my post explained exactly what the issue with it is.
And that issue isn't going to change.

In the case of the Xbox One, at least the backbuffer and the Z-buffer must be stored in the ESRAM because the regular memory is just too slow to fill that role effectively. A 1080p HDR backbuffer requires 16 MB of memory, plus the 8 MB used by the Z-buffer, and suddenly you only have 8 MB of fast memory left for all other rendering purposes, which is quite short when you're using stuff like 4096x4096 textures and millions of polygons.
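To put numbers on the budget described above, a quick sketch (the buffer formats are my assumptions: FP16 RGBA for the HDR backbuffer, 32-bit depth/stencil for the Z-buffer; 32 MB is the Xbox One's ESRAM size):

```python
# Rough ESRAM budget at 1080p, matching the figures in the post above.
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024
ESRAM_MB = 32  # Xbox One's ESRAM size

# FP16 RGBA (HDR) backbuffer: 8 bytes per pixel
backbuffer = WIDTH * HEIGHT * 8 / MB   # ~15.8 MB, i.e. the "16 MB" above
# 32-bit depth/stencil (Z) buffer: 4 bytes per pixel
zbuffer = WIDTH * HEIGHT * 4 / MB      # ~7.9 MB, i.e. the "8 MB" above

remaining = ESRAM_MB - backbuffer - zbuffer
print(f"backbuffer ~{backbuffer:.1f} MB, Z-buffer ~{zbuffer:.1f} MB, "
      f"leftover for everything else ~{remaining:.1f} MB")
```

With only ~8 MB left, larger render targets have to spill into the slower main memory, which is the bottleneck the post is describing.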

The article you posted actually proves that the esram is a pain to work with.

I'll post this again, since you seem to ignore it:

“We haven’t played around with the eSRAM much yet. Currently, we use it for storing the zbuffer and shadowmaps.

Could you point out exactly where he says they haven't had any issues with it?

This is further proven by this article:

http://www.videogamer.com/ps4/dying_light/news/dying_light_targeting_1080p_60fps_on_ps4_and_xbox_one.html

Dying Light was shown running at 1080p on PS4 during this weekend's VGX, although it wasn't clear that the developer would be targeting the same resolution on Xbox One.

If they weren't having any issues with the xbox version, why not show it as well?

Because the ESRAM is a bottleneck, as I've already explained in this earlier post, to which you still haven't managed to form a coherent counter-argument or rebuttal:

http://www.gamefaqs.com/boards/691088-xbox-one/69064045/781796918

The esram is a bottleneck and it always will be, no matter what tricks devs use.


Lastly, stop referencing an old topic where the only people that agreed with you were well-known trolls who mostly just re-quoted and circle-jerked each other. I seriously doubt you want to be lumped in with that crowd.


That's just silly.
People aren't trolls because they disagree with you and prove you're full of crap.

If by "that crowd" you mean people who are correct and actually have some idea of what they're talking about (you obviously don't), by all means: lump away
#119Ryan-06Posted 4/29/2014 6:16:40 AM
^ good work.
---
chart: http://ow.ly/uuJ6g
chocolate: http://ow.ly/t0gvj, http://ow.ly/t0g1Y
#120DonomegaPosted 4/29/2014 6:37:48 AM
It's like saying DX12 will make Intel integrated graphics look like a GTX Titan ;)