Mark Cerny says you can't improve graphics by using the cloud

#261 Chao_Yun Posted 7/29/2013 8:50:03 AM
Sheepinator posted...
darkwizard3533 posted...
The fail here is that people don't understand what the cloud is.

The cloud is a cluster of servers that primarily provide storage. Can you do distributed computing? Yes, but only for things where latency is irrelevant. So a persistent MMO-style world? That's a possibility.

But improved graphics or performance? Heck no. Here's why: the hardware inside the consoles operates in the realm of nanoseconds. Take DDR3 (the type of RAM used in the Xbox One); quality DDR3 probably has a CAS latency somewhere in the realm of 7-8 nanoseconds.

That's the response time from the memory controller. Your NETWORK latency, which is how you would connect to the cloud, has a response time measured in MILLISECONDS.

1 nanosecond is 0.000001 milliseconds.

If you had a ping of 100ms, which is not uncommon since most broadband is in the 50ms-200ms range, your ping would be 100,000,000 ns.

That's 13,333,333.33 x 7.5 ns, i.e. roughly 13 million times longer than what your console's hardware is expecting.
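
A minimal sketch of that arithmetic (the 7.5 ns CAS latency and 100 ms ping are the figures assumed above, not measurements):

```cpp
// Compare an assumed ~7.5 ns DDR3 CAS latency against an assumed 100 ms ping.
#include <cstdio>

int main() {
    const double cas_latency_ns = 7.5;           // assumed DDR3 CAS latency
    const double ping_ms        = 100.0;         // assumed broadband round trip
    const double ping_ns        = ping_ms * 1e6; // 1 ms = 1,000,000 ns

    std::printf("ping  = %.0f ns\n", ping_ns);                       // 100,000,000 ns
    std::printf("ratio = %.2fx memory latency\n",
                ping_ns / cas_latency_ns);                           // ~13,333,333x
    return 0;
}
```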

So your console would be sitting there tapping its foot, saying "I'm still waiting..."


Do y'all kind of understand why it's not possible? It would only work for latency-insensitive calculations, but graphics and physics are heavily latency-dependent. It's possible to offload AI to the cloud, but it would be the equivalent of a multiplayer bot game and subject to all the lag spikes and delays that you would see in any normal multiplayer game. Also, it would require an internet connection to even play single player.

So essentially it may be useful for background AI calculations, but that's about all.

The problem with your description is you state with absolute certainty that it cannot improve graphics, then you go on to explain how it can save console clock cycles... which could in turn be used to improve graphics (albeit slightly and probably only in select genres).


Yeah, if you have a great connection and your game doesn't require instant processing. Ironically, great for RPGs, not exactly Microsoft's forte.
---
Forest trees are greener.
#262 Ryan-06 Posted 7/29/2013 8:52:47 AM
I only see rain in the clouds!
---
0=Rei. Pronounced Rei-six. Born New England. Lived in Japan. Citizen of Earth.
KamenHentai - http://www.youtube.com/watch?v=TozprFrnn10
#263 darkwizard3533 Posted 7/29/2013 9:23:57 AM
Sheepinator posted...

The problem with your description is you state with absolute certainty that it cannot improve graphics, then you go on to explain how it can save console clock cycles... which could in turn be used to improve graphics (albeit slightly and probably only in select genres).


Sigh. First, the CPU has very little to do with the graphics. Saved CPU cycles aren't going to improve your graphics at all.

There's this wonderful piece of hardware called a GPU... the graphics processing unit. Background AI is happening in the background and has no bearing on on-screen activity.


What you're talking about is foreground AI, which is impractical for other reasons (essentially a dedicated server environment to play a single-player game). Ever played Diablo 3 on PC? Seen the enemies warp around? Ever played hardcore and had a lag death? There's very, very little performance benefit and a whole bunch of negatives. You want that in your single-player games? More power to you. I sure as heck don't.

Also, it's only beneficial for more complex algorithms, as you're still using cycles to send/receive and process the results, which again is why I'm suggesting it is far more reasonable to do this for background AI. In other words, there has to be a certain level of complexity in order for it to have any potential benefit.
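
As a sketch of what that background-AI offload might look like: the request runs on another thread and the result is only applied once it eventually arrives, so the frame never waits on the network. Here queryCloudAI is a hypothetical stand-in for a blocking network call, and all the timings are made up.

```cpp
// Sketch of latency-tolerant "background AI" offload: kick the expensive
// computation off to a server and keep rendering frames in the meantime.
#include <chrono>
#include <cstdio>
#include <future>
#include <string>
#include <thread>

// Hypothetical stand-in for a blocking cloud request (~100 ms pretend round trip).
std::string queryCloudAI(const std::string& snapshot) {
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    return "strategy for " + snapshot;
}

int main() {
    std::future<std::string> pending;

    for (int frame = 0; frame < 30; ++frame) {             // pretend 30 frames
        if (!pending.valid())                               // launch only if nothing is in flight
            pending = std::async(std::launch::async, queryCloudAI, std::string("world state"));

        // Never block the frame: poll, and apply the result only once it is ready.
        if (pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready)
            std::printf("frame %d: applied \"%s\"\n", frame, pending.get().c_str());

        std::this_thread::sleep_for(std::chrono::milliseconds(33)); // ~30 fps frame time
    }
    return 0;
}
```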
#264 Sheepinator Posted 7/29/2013 9:34:08 AM
darkwizard3533 posted...
Sheepinator posted...
The problem with your description is you state with absolute certainty that it cannot improve graphics, then you go on to explain how it can save console clock cycles... which could in turn be used to improve graphics (albeit slightly and probably only in select genres).

Sigh. First, the CPU has very little to do with the graphics. Saved CPU cycles aren't going to improve your graphics at all.

There's this wonderful piece of hardware called a GPU... the graphics processing unit. Background AI is happening in the background and has no bearing on on-screen activity.

I love the irony of how you say you know what you're talking about and that basically everyone else in the thread doesn't, then you go on to state absolutes which are totally untrue.

Have you never heard of games being CPU-bound? Before graphics can be rendered to the screen, the engine has to check for blend animations for every character, compute matrices for the vertices to be skinned, update particle simulations (which could be done on the GPU), cull the scene to the view frustum, sort things in the view by alpha or by material, build draw lists for the GPU, and so on. If you save clock cycles, of course they can be used for all of the above. Again, not significantly, but getting on your soapbox to declare a benefit of zero while simultaneously admitting it saves clock cycles is a strange stance to take. If you had just said "...minimal graphics benefit..." that would have been more accurate.
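
A rough sketch of that CPU-side prep work, with made-up types and only a few of the steps listed above (illustrative only, not any engine's actual code):

```cpp
// CPU-side render preparation: the GPU only sees the draw list this builds.
#include <algorithm>
#include <vector>

struct Renderable { float distance; int materialId; bool visible; };
struct DrawCall   { int materialId; };

std::vector<DrawCall> buildFrame(std::vector<Renderable>& scene) {
    // 1. Cull to the view frustum (stand-in distance test).
    for (auto& r : scene)
        r.visible = (r.distance < 1000.0f);

    // 2. Sort by material to minimize GPU state changes.
    std::sort(scene.begin(), scene.end(),
              [](const Renderable& a, const Renderable& b) {
                  return a.materialId < b.materialId;
              });

    // 3. Build the draw list the GPU will consume.
    std::vector<DrawCall> drawList;
    for (const auto& r : scene)
        if (r.visible)
            drawList.push_back({r.materialId});
    return drawList;
}
```

The point is simply that every cycle freed on the CPU is a cycle available for work like this before the GPU ever draws anything.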
---
My mad face and my happy face are the same.
#265 Board_hunter567 Posted 7/29/2013 11:44:25 AM
You can do certain things with cloud technology that could theoretically improve graphics.
The problem is and always will be latency issues.
#266 NeoMonk Posted 7/29/2013 11:47:29 AM
JusticeSword posted...
Mark Cerny ;

It's possible to do computing in the Cloud, PlayStation 4 can do computing in the Cloud. We do something today: Matchmaking is done in the Cloud and it works very well. If we think about things that don't work well... Trying to boost the quality of the graphics, that won't work well in the Cloud.


/thread
---
"The Xbox One board isn't the place for personal anecdotes, joke topics or fanboy affair." Gamefaqs Moderator
#267 BushidoEffect3 Posted 7/31/2013 7:41:10 AM
I'm Batman.
---
http://www.youtube.com/watch?v=TozprFrnn10, XBO dunk fail - http://i.imgur.com/bOGdIFW.gif
Cute serious internets http://i.imgur.com/JZ7u2DD.jpg
#268 TrueBlue91 Posted 7/31/2013 8:46:25 AM
BushidoEffect3 posted...
I'm Batman.

No you're not.
---
Pour grammer annoy's me
#269 82xeno Posted 7/31/2013 8:50:33 AM
Latency trumps all. Get back to me when your calculations plus network latency can keep up with 33 ms frame generation.
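
For illustration, a back-of-the-envelope budget check (the ping and server compute times below are assumptions, not measurements):

```cpp
// Can "remote compute + round trip" land inside a 33 ms (30 fps) frame?
#include <cstdio>

int main() {
    const double frame_budget_ms = 33.3;  // ~30 fps frame time
    const double round_trip_ms   = 100.0; // assumed broadband ping
    const double remote_work_ms  = 5.0;   // assumed server-side compute time

    const double total = round_trip_ms + remote_work_ms;
    std::printf("%.1f ms needed vs %.1f ms budget -> %s\n",
                total, frame_budget_ms,
                total <= frame_budget_ms ? "fits" : "misses the frame");
    return 0;
}
```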
---
Shwing
#270 Juzten76 Posted 7/31/2013 8:50:51 AM
So the bottom line is we still don't know how this is all going to work. All we know is that it is going to be AWESOME!

November can't get here soon enough!