3 years ago#261
darkwizard3533 posted...the fail is because people don't understand what the cloud is.
Yeah, if you have a great connection and your game doesn't require instant processing. Ironically, great for RPGs, not exactly Microsoft's forte.
Forest trees are greener.
3 years ago#262
I only see rain in the clouds!
0=Rei. Pronounced Rei-six. Born New England. Lived in Japan. Citizen of Earth.
KamenHentai - http://www.youtube.com/watch?v=TozprFrnn10
3 years ago#263
Sigh. First, the CPU has very little to do with the graphics; saved CPU cycles aren't going to improve your graphics at all.
There's this wonderful piece called a GPU... graphics processing unit. Background AI happens in the background and has no bearing on on-screen activity.
What you're talking about is foreground AI, which is impractical for other reasons (essentially a dedicated-server environment just to play a single-player game). Ever played Diablo 3 on PC? Seen the enemies warp around? Ever played hardcore and had a lag death? There's very, very little performance benefit and a whole bunch of negatives. You want that in your single-player games? More power to you. I sure as heck don't.
Also, it's only beneficial for more complex algorithms, since you're still spending cycles to send/receive and process the results, which again is why I'm suggesting it's far more reasonable to do this for background AI. In other words, there has to be a certain level of complexity for it to have any potential benefit.
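The break-even point described above can be sketched as a toy cost model. All the timing numbers and the function itself are illustrative assumptions, not measurements from any real console or service:

```python
# Offloading an AI computation to a remote server only pays off when the
# local compute time exceeds the fixed cost of shipping the work out and
# handling the reply. Every timing below is an assumed, illustrative value.

def worth_offloading(local_ms, serialize_ms=1.0, round_trip_ms=60.0, apply_ms=1.0):
    """Return True if offloading is cheaper than computing locally."""
    offload_cost_ms = serialize_ms + round_trip_ms + apply_ms
    return local_ms > offload_cost_ms

# A trivial per-frame AI step (2 ms locally) is not worth offloading,
# but a long-horizon background plan (500 ms locally) might be.
print(worth_offloading(2.0))    # False
print(worth_offloading(500.0))  # True
```

This is why the post argues the technique only makes sense for slow, complex background work: the round trip is a fixed tax that cheap computations can never amortize.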
3 years ago#264
I love the irony of how you say you know what you're talking about and that basically everyone else in the thread doesn't, and then go on to state absolutes that are totally untrue.
Have you never heard of games being CPU-bound? Before graphics can be rendered to the screen, the engine has to blend animations for every character, compute matrices for the vertices to be skinned, update particle simulations (which could be done on the GPU), cull the scene to the view frustum, sort things in the view by alpha or by material, build draw lists for the GPU, and so on. If you save clock cycles, of course they can be used for all of the above. Again, not significantly, but getting on your soapbox to declare a benefit of zero while simultaneously admitting it saves clock cycles is a strange stance to take. If you had just said "...minimal graphics benefit...", that would have been more accurate.
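The CPU-side frame work listed above can be sketched as a tiny loop. The data model is a toy stand-in (dicts with hypothetical fields), not any real engine's API; it only shows the cull-sort-build pipeline that runs on the CPU before the GPU draws anything:

```python
# Minimal sketch of per-frame CPU work in a renderer: cull, sort, build a
# draw list. Objects are toy dicts; real engines do each stage with far
# more data, but every stage here is CPU work, which is why freed CPU
# cycles can indirectly help graphics in a CPU-bound game.

def cpu_frame(objects, view_min, view_max):
    """objects: list of dicts with 'x', 'transparent', 'material' keys."""
    # 1. Frustum cull (reduced to 1-D): keep objects inside the view range.
    visible = [o for o in objects if view_min <= o["x"] <= view_max]
    # 2. Sort: opaque objects first, transparent last (for correct blending),
    #    grouped by material to reduce GPU state changes.
    visible.sort(key=lambda o: (o["transparent"], o["material"]))
    # 3. Build the draw list that gets handed to the GPU.
    return [o["material"] for o in visible]

objs = [
    {"x": 5,  "transparent": True,  "material": "glass"},
    {"x": 2,  "transparent": False, "material": "stone"},
    {"x": 99, "transparent": False, "material": "wood"},  # outside the view
]
print(cpu_frame(objs, 0, 10))  # ['stone', 'glass']
```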
My mad face and my happy face are the same.
3 years ago#265
You can do certain things with cloud technology that could theoretically improve graphics.
The problem is and always will be latency issues.
3 years ago#266
Mark Cerny ;
"The Xbox One board isn't the place for personal anecdotes, joke topics or fanboy affair." Gamefaqs Moderator
3 years ago#267
http://www.youtube.com/watch?v=TozprFrnn10, XBO dunk fail - http://i.imgur.com/bOGdIFW.gif
Cute serious internets http://i.imgur.com/JZ7u2DD.jpg
3 years ago#268
No you're not.
Pour grammer annoy's me
3 years ago#269
Latency trumps all; get back to me when your calculations plus network latency can keep up with a 33 ms frame budget.
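The arithmetic behind that 33 ms figure is just 1000 ms / 30 fps. A quick sketch, with an assumed (illustrative) round-trip latency, shows why a per-frame remote result misses the deadline:

```python
# At 30 fps a frame must be delivered every 1000/30 ~= 33.3 ms. Any remote
# result needed within the same frame has to fit its network round trip
# plus server compute into that budget. The latency and compute figures
# below are illustrative assumptions, not measurements.

frame_budget_ms = 1000 / 30   # ~33.3 ms per frame at 30 fps
round_trip_ms = 60            # assumed consumer-internet round trip
remote_compute_ms = 5         # assumed time the server spends on the work

total_ms = round_trip_ms + remote_compute_ms
print(total_ms <= frame_budget_ms)  # False: the reply arrives a frame (or two) late
```

Results that arrive late can still be useful, but only for work that spans many frames, which is the same background-AI argument made earlier in the thread.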
3 years ago#270
So the bottom line is we still don't know how this is all going to work. All we know is that it is going to be AWESOME!
November can't get here soon enough!