darkwizard3533 posted...the fail is because people don't understand what the cloud is.
Yeah, if you have a great connection and your game doesn't require instant processing. Ironically, that makes it great for RPGs, which aren't exactly Microsoft's forte.
Forest trees are greener.
I only see rain in the clouds!
0=Rei. Pronounced Rei-six. Born New England. Lived in Japan. Citizen of Earth.
KamenHentai - http://www.youtube.com/watch?v=TozprFrnn10
Sigh. First, the CPU has very little to do with the graphics; saved CPU cycles aren't going to improve your graphics at all.
There's this wonderful piece of hardware called a GPU: the graphics processing unit. Background AI happens in the background and has no bearing on onscreen activity.
What you're talking about is foreground AI, which is impractical for other reasons (essentially a dedicated-server environment to play a single-player game). Ever played Diablo 3 on PC? Seen the enemies warp around? Ever played hardcore and had a lag death? There's very, very little performance benefit and a whole bunch of negatives. You want that in your single-player games? More power to you. I sure as heck don't.
Also, it's only beneficial for more complex algorithms, since you're still spending cycles to send, receive, and process the results. Which, again, is why I'm suggesting it's far more reasonable to do this for background AI; in other words, there has to be a certain level of complexity for it to have any potential benefit.
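A minimal sketch of the background-AI idea described above, assuming a purely hypothetical cloud call (`requestCloudPlan`, `WorldSnapshot`, and `StrategicPlan` are made-up names, not any real console API): the request is fired off asynchronously and the game loop never blocks on it, so network latency can't stall a frame.

```cpp
// Hypothetical sketch: offloading long-horizon "background AI" to a remote
// service without ever making a frame wait on the network.
#include <chrono>
#include <future>
#include <optional>
#include <thread>

struct WorldSnapshot { /* coarse world state sent to the server */ };
struct StrategicPlan { /* long-horizon AI decisions sent back */ };

// Stand-in for a round trip to a cloud service (here: a worker thread).
StrategicPlan requestCloudPlan(WorldSnapshot snap) {
    std::this_thread::sleep_for(std::chrono::milliseconds(120)); // simulated latency
    return StrategicPlan{};
}

int main() {
    std::future<StrategicPlan> pending =
        std::async(std::launch::async, requestCloudPlan, WorldSnapshot{});
    std::optional<StrategicPlan> currentPlan;

    for (int frame = 0; frame < 600; ++frame) {
        // Non-blocking check: adopt the new plan only once it has arrived.
        if (pending.valid() &&
            pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            currentPlan = pending.get();
        }
        // Per-frame gameplay and rendering run locally, with or without a
        // fresh plan; a late or lost response degrades AI quality, not framerate.
    }
}
```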
I love the irony of how you say you know what you're talking about and that basically everyone else in the thread doesn't, then you go on to state absolutes which are totally untrue.
Have you never heard of games being CPU bound? Before graphics can be rendered to the screen, the engine has to check blend animations for every character, compute matrices for the vertices to be skinned, update particle simulations (which could be done on the GPU), cull the scene to the view frustum, sort things in the view by alpha or by material, build draw lists for the GPU, and so on. If you save clock cycles, of course they can be used for all of the above. Again, not significantly, but getting on your soapbox to declare a benefit of zero while simultaneously admitting it saves clock cycles is a strange stance to take. If you had just said, "...minimal graphics benefit..." that would have been more accurate.
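To make one of those CPU-side steps concrete, here is a toy illustration of culling a scene to the view frustum and building a draw list for the GPU. The types are simplified stand-ins, not any real engine's API; a real frustum would be extracted from the camera's view-projection matrix.

```cpp
// Toy example of CPU-bound render prep: frustum culling + draw-list building.
#include <cstdio>
#include <vector>

struct Sphere { float x, y, z, radius; };   // object bounding volume
struct Plane  { float nx, ny, nz, d; };     // plane: n.p + d >= 0 means "inside"

bool insideFrustum(const Sphere& s, const Plane (&frustum)[6]) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.radius) return false; // fully outside one plane: cull it
    }
    return true;
}

int main() {
    Plane frustum[6] = {
        {  1, 0, 0, 10 }, { -1, 0, 0, 10 }, // left / right
        {  0, 1, 0, 10 }, { 0, -1, 0, 10 }, // bottom / top
        {  0, 0, 1,  0 }, { 0, 0, -1, 100 } // near / far
    };
    std::vector<Sphere> scene = { { 0, 0, 50, 1 }, { 500, 0, 50, 1 } };

    std::vector<size_t> drawList;           // indices the GPU will actually draw
    for (size_t i = 0; i < scene.size(); ++i)
        if (insideFrustum(scene[i], frustum)) drawList.push_back(i);

    std::printf("%zu of %zu objects survive culling\n", drawList.size(), scene.size());
}
```

Every cycle spent in a loop like this is a cycle the CPU can't spend elsewhere, which is the whole point of the "CPU bound" argument.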
My mad face and my happy face are the same.
You can do certain things with cloud technology that could theoretically improve graphics.
The problem is, and always will be, latency.
Mark Cerny ;
"The Xbox One board isn't the place for personal anecdotes, joke topics or fanboy affair." Gamefaqs Moderator
http://www.youtube.com/watch?v=TozprFrnn10, XBO dunk fail - http://i.imgur.com/bOGdIFW.gif
Cute serious internets http://i.imgur.com/JZ7u2DD.jpg
No you're not.
Pour grammer annoy's me
Latency trumps all. Get back to me when your calculations plus network latency can keep up with a 33 ms frame budget.
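A back-of-the-envelope check of that point: at 30 fps a frame is about 33.3 ms, so anything a frame actually waits on must fit round-trip network latency plus server compute inside that budget. The latency figures below are illustrative assumptions, not measurements.

```cpp
// Frame-budget sanity check: does a synchronous cloud round trip fit in a frame?
#include <cstdio>

int main() {
    const double frameBudgetMs = 1000.0 / 30.0; // ~33.3 ms per frame at 30 fps
    const double roundTripMs   = 60.0;          // assumed home-broadband RTT
    const double serverWorkMs  = 5.0;           // assumed cloud compute time

    double total = roundTripMs + serverWorkMs;
    std::printf("frame budget: %.1f ms, cloud round trip: %.1f ms -> %s\n",
                frameBudgetMs, total,
                total <= frameBudgetMs ? "fits" : "misses the frame");
}
```

With those assumed numbers the round trip alone blows the budget, which is why per-frame work has to stay local and only latency-tolerant work is a plausible cloud candidate.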
So the bottom line is we still don't know how this is all going to work. All we know is that it is going to be AWESOME!
November can't get here soon enough!