Another Mature Discussion: Cloud Gaming/Computing

#41 A_Someone_Else Posted 9/5/2013 3:17:37 PM (edited)
Ch3wy posted...

A_Someone_Else posted...
"The Cloud" can not add much graphical power or "offload" much computation.

Even on fiber optic net, you only get 6MB a second of data. The console's native bandwidth is over 68,000MB/s. Even without latency, a 6MB/s boost doesn't add much graphical "power" to 68,000MB/s. To increase that even 10%, you'd need something with HUNDREDS OF TIMES the bandwidth of fiber optic. (Even with very powerful compression.)

Even if you sent small equations with small answers but large processing overhead, latency would limit it to unimportant or slow background activities, and how much power do they require? Even with this method and without latency, could anything communicating at 6MB/s offload even 1% of 68,000MB/s?


Well I could see them getting creative and reducing stutter in a lot of games. For example in STALKER there is fairly frequent stutter for a quarter second or so when it's loading offscreen enemies as you're running around. Granted in their case they could have probably just programmed the game better and lowered the priority of this task in the first place, but it would still allow for more calculations like this to go on without any hiccups.

Also, shorter wait time on turn-based games like Civilization.


I'm not convinced that using the cloud to "load off-screen enemies" is realistic. Characters (especially next generation ones) seem like enormous graphical components. I believe they use a majority of the processing power in most FPS games. Perhaps that's why they can cause certain entire games to stutter. I imagine they'd take way too long "loading in" over 6MB/s, compared to a console's native 68,000+ MB/s which exists largely to render such characters.

For Civilization, I wouldn't be surprised if opponent turns are almost instant next gen. If the consoles are really "ten times as powerful", I'd almost be surprised otherwise. If opponents still take considerable time, the cloud could be convenient for that turn-based minority of games where small pauses aren't as disruptive. However, just because calculations take a while, I'm not sure that means they're taking up much of the power. Could opponent AI take up a mere 2% of the calculations, if the graphics and other mechanics are extremely complex?

I'm not convinced this would offload much toward graphical power. Even if it could, I'm not sure how many developers would bother with the complexity of a transforming game that looks one way online, and another offline.
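
To put very rough numbers on the "loading in" point (the 50MB character size below is just a guess for illustration, nothing official; the bandwidth figures are the ones quoted above):

# Back-of-the-envelope comparison of streaming one character's data over the
# net vs. reading it from local memory. The 50MB asset size is an invented
# illustration; the bandwidth figures are the ones quoted above.
asset_mb = 50.0
network_mb_s = 6.0       # rough fiber throughput from the argument above
local_mb_s = 68_000.0    # console memory bandwidth quoted above

print("over the network: %.1f seconds" % (asset_mb / network_mb_s))             # ~8.3 s
print("from local memory: %.2f milliseconds" % (asset_mb / local_mb_s * 1000))  # ~0.74 ms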
#42 Ch3wy Posted 9/5/2013 3:35:31 PM
A_Someone_Else posted...
Ch3wy posted...

A_Someone_Else posted...
"The Cloud" can not add much graphical power or "offload" much computation.

Even on fiber optic net, you only get 6MB a second of data. The console's native bandwidth is over 68,000MB/s. Even without latency, a 6MB/s boost doesn't add much graphical "power" to 68,000MB/s. To increase that even 10%, you'd need something with HUNDREDS OF TIMES the bandwidth of fiber optic. (Even with very powerful compression.)

Even if you sent small equations with small answers but large processing overhead, latency would limit it to unimportant or slow background activities, and how much power do they require? Even with this method and without latency, could anything communicating at 6MB/s offload even 1% of 68,000MB/s?


Well I could see them getting creative and reducing stutter in a lot of games. For example in STALKER there is fairly frequent stutter for a quarter second or so when it's loading offscreen enemies as you're running around. Granted in their case they could have probably just programmed the game better and lowered the priority of this task in the first place, but it would still allow for more calculations like this to go on without any hiccups.

Also, shorter wait time on turn-based games like Civilization.


I'm not convinced that using the cloud to "load off-screen enemies" is realistic. Characters (especially next generation ones) seem like enormous graphical components. I believe they use a majority of the processing power in most FPS games. Perhaps that's why they can cause certain entire games to stutter. I imagine they'd take way too long "loading in" over 6MB/s, compared to a console's native 68,000+ MB/s which exists largely to render such characters.

For Civilization, I wouldn't be surprised if opponent turns are almost instant next gen. If the consoles are really "ten times as powerful", I'd almost be surprised otherwise. If opponents still take considerable time, the cloud could be convenient for that turn-based minority of games where small pauses aren't as disruptive. However, just because calculations take a while, I'm not sure that means they're taking up much of the power. Could opponent AI take up a mere 2% of the calculations, if the graphics and other mechanics are extremely complex?

I'm not convinced this would offload much toward graphical power. Even if it could, I'm not sure how many developers would bother with the complexity of a transforming game that looks one way online, and another offline.


For Civilization I'm not proposing that it would help increase graphics, just make the turns shorter. Calculating turns does take a lot of power; that's why they can take so long. Even on a high-end PC, turns can take a while in Civilization 5.

And I'm not proposing the cloud could fully render offscreen enemies, but rather that it does the calculations and provides specific data about them: what they are doing, where they are, what items they are holding, etc. That data, and not just the graphics rendering, can cause hiccups.
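
Roughly what I mean, as a sketch (the fields and sizes here are completely made up, just to show how small that kind of data is):

import struct

# Hypothetical compact record the cloud could send back per off-screen NPC:
# id, x/y/z position, current action id, held item id. The exact fields are
# invented; the point is that each update packs into about 20 bytes even if
# deciding those values took a lot of server-side CPU time.
NPC_FORMAT = "<I3fHH"    # 4 + 12 + 2 + 2 = 20 bytes per NPC

def pack_npc(npc_id, x, y, z, action_id, item_id):
    return struct.pack(NPC_FORMAT, npc_id, x, y, z, action_id, item_id)

update = b"".join(pack_npc(i, 100.0 + i, 0.0, 250.0, 3, 7) for i in range(200))
print(len(update), "bytes for 200 NPCs")    # 4000 bytes, a tiny slice of 6MB/s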
---
Every time you point out that something is an opinion Jesus shoots a kitten in the face.
#43 A_Someone_Else Posted 9/5/2013 4:59:06 PM
Ch3wy posted...
A_Someone_Else posted...
Ch3wy posted...

A_Someone_Else posted...
"The Cloud" can not add much graphical power or "offload" much computation.

Even on fiber optic net, you only get 6MB a second of data. The console's native bandwidth is over 68,000MB/s. Even without latency, a 6MB/s boost doesn't add much graphical "power" to 68,000MB/s. To increase that even 10%, you'd need something with HUNDREDS OF TIMES the bandwidth of fiber optic. (Even with very powerful compression.)

Even if you sent small equations with small answers but large processing overhead, latency would limit it to unimportant or slow background activities, and how much power do they require? Even with this method and without latency, could anything communicating at 6MB/s offload even 1% of 68,000MB/s?


Well I could see them getting creative and reducing stutter in a lot of games. For example in STALKER there is fairly frequent stutter for a quarter second or so when it's loading offscreen enemies as you're running around. Granted in their case they could have probably just programmed the game better and lowered the priority of this task in the first place, but it would still allow for more calculations like this to go on without any hiccups.

Also, shorter wait time on turn-based games like Civilization.


I'm not convinced that using the cloud to "load off-screen enemies" is realistic. Characters (especially next generation ones) seem like enormous graphical components. I believe they use a majority of the processing power in most FPS games. Perhaps that's why they can cause certain entire games to stutter. I imagine they'd take way too long "loading in" over 6MB/s, compared to a console's native 68,000+ MB/s which exists largely to render such characters.

For Civilization, I wouldn't be surprised if opponent turns are almost instant next gen. If the consoles are really "ten times as powerful", I'd almost be surprised otherwise. If opponents still take considerable time, the cloud could be convenient for that turn-based minority of games where small pauses aren't as disruptive. However, just because calculations take a while, I'm not sure that means they're taking up much of the power. Could opponent AI take up a mere 2% of the calculations, if the graphics and other mechanics are extremely complex?

I'm not convinced this would offload much toward graphical power. Even if it could, I'm not sure how many developers would bother with the complexity of a transforming game that looks one way online, and another offline.


For Civilization I'm not proposing that it would help increase graphics, just make the turns shorter. Calculating turns does take a lot of power; that's why they can take so long. Even on a high-end PC, turns can take a while in Civilization 5.

And I'm not proposing the cloud could fully render offscreen enemies, but rather that it does the calculations and provides specific data about them: what they are doing, where they are, what items they are holding, etc. That data, and not just the graphics rendering, can cause hiccups.


To me, those things suggest a different conclusion. I see where you're coming from, but I have a different view of how it works in practice.

I believe that if certain calculations take a long time whether on a console or a high-end PC, it has less to do with raw power than with task priority. I believe the non-graphics parts of characters are tiny and quickly processed, and thus won't make much difference to stuttering. Even added up over time, I don't think they'd amount to much stutter-smoothing, since bandwidth would limit them to 6 megabytes per second of difference.
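
For some rough per-frame perspective on that (purely illustrative, reusing the numbers from earlier in the thread):

# What 6MB/s works out to per frame, next to what the console can move from
# its own memory in the same frame. Both figures reuse numbers quoted above.
fps = 30
network_mb_s = 6.0
local_mb_s = 68_000.0

print("network per frame: ~%.0f KB" % (network_mb_s / fps * 1024))   # ~205 KB
print("local memory per frame: ~%.0f MB" % (local_mb_s / fps))       # ~2267 MB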
#44 Ch3wy Posted 9/5/2013 5:35:50 PM
A_Someone_Else posted...


To me, those things suggest a different conclusion. I see where you're coming from, but I have a different view of how it works in practice.

I believe that if certain calculations take a long time whether on a console or a high-end PC, it has less to do with raw power than with task priority. I believe the non-graphics parts of characters are tiny and quickly processed, and thus won't make much difference to stuttering. Even added up over time, I don't think they'd amount to much stutter-smoothing, since bandwidth would limit them to 6 megabytes per second of difference.


You're too caught up on the bandwidth... you wouldn't need a lot to do what I'm suggesting. The amounts of data being returned aren't gigantic, but the amount of CPU usage can be very large.

In STALKER, as I mentioned earlier, it stutters regardless of graphics settings. It has absolutely nothing to do with graphics; it's all related to these calculations. In this specific scenario, yeah, they probably could have just prioritized the tasks better and gotten rid of the stuttering, but think about it on a larger scale.
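
A very rough sketch of what "getting it off the main loop" looks like (the spawn-planning function is just a stand-in for whatever the engine actually computes):

from concurrent.futures import ThreadPoolExecutor

# Toy illustration: hand the heavy "plan off-screen spawns" work to a worker
# so no frame ever waits on it. The same shape applies whether the worker is
# a background thread or a request to a remote server.
pool = ThreadPoolExecutor(max_workers=1)

def plan_offscreen_spawns(region):
    # stand-in for an expensive AI/placement calculation
    return [(region, i) for i in range(10)]

pending = pool.submit(plan_offscreen_spawns, "north_camp")

# The per-frame loop elsewhere only polls; it never blocks on the result.
if pending.done():
    spawns = pending.result()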

You'd definitely have to get really creative here, but there are plenty of things you could do that return way less than 6MB/s worth of data but require tons of calculation. For example, think of a massive MMO version of a game like The Sims: thousands of houses, each with their own Sims, and every one of them would continue simulating in real time even when the player wasn't there. You could zoom out and in on a whole city's worth of people and nothing would be randomly generated. This wouldn't be possible without cloud computing for a couple of reasons: all those calculations happening at once would easily use up all the processing power on a single machine, and the results would need to be reported back to every player anyway, so the cloud is the obvious option here.

I guess anything significant here would probably have to be restricted to an MMO-type game, since it's not really feasible for each individual player to have a bunch of cloud computing going toward massive-scale simulations in their single-player games. I don't think the infrastructure is quite there yet, to say the least. But there are still plenty of uses if you get creative with it.
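
The Sims-MMO idea above, as a very rough sketch (every number and field here is invented; it's only meant to show the shape of it):

import random

# Toy server-side loop: thousands of households keep simulating whether or
# not anyone is watching, and each player only receives the handful that are
# actually on their screen.
households = {hid: {"hunger": random.random(), "money": 100.0}
              for hid in range(10_000)}

def tick_all():
    for h in households.values():
        h["hunger"] = min(1.0, h["hunger"] + 0.01)
        h["money"] += 0.05

def view_for_player(visible_ids):
    # only the visible slice goes down the wire
    return {hid: households[hid] for hid in visible_ids}

tick_all()
print(len(view_for_player(range(20))), "households sent to one player")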
---
Every time you point out that something is an opinion Jesus shoots a kitten in the face.
#45 A_Someone_Else Posted 9/5/2013 8:29:22 PM
[This message was deleted at the request of the original poster]
#46 DerekLoffin Posted 9/5/2013 10:26:32 PM
No single-player game is going to use cloud compute. The only way to make cloud compute viable is to share the computation (i.e., you have 10 players sharing 100 NPC calcs). Trying to do cloud compute for a single-player game, even a relatively modestly successful one, would overwhelm Azure's whole network. Any decent usage of cloud compute will have to be multiplayer, and probably fairly large-scale multiplayer (like hundreds of players per map), to be viable.
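
Rough per-player math on why the sharing matters (just reusing the example numbers above; the player count is invented purely for scale):

# Server-side cost per player when the computation is shared vs. dedicated,
# using the figures above purely as an illustration.
npc_calcs_per_world = 100
concurrent_players = 1_000_000   # invented number, just for scale

shared = npc_calcs_per_world / 10     # 10 players share one simulated world
dedicated = npc_calcs_per_world / 1   # every player gets their own world

print("per player, shared:", shared, "vs dedicated:", dedicated)
print("total NPC calcs the datacenter carries:",
      int(shared * concurrent_players), "vs", int(dedicated * concurrent_players))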
---
I am power made flesh, feel how weak you truly are. --Akuma
#47 Tyrannical Posted 9/6/2013 12:45:26 AM
First is that Cloud Gaming/Computing is NOT limited to just the X1. Sony is fully capable of going into this field



El-Wrongo.
Sony can't, because you can bet MS has this approach backed up with patents. It's also not about buying servers; it's about integrating the cloud into the software development kits so that game developers can easily take advantage of it with minimal latency.

The hardware CPU/GPU differences are tough to judge, because the quality of the console software development kits matters much more. The consoles are not using identical off-the-shelf video drivers like in the test; each will be customized to the bare hardware. MS has a huge advantage over Sony here because MS has been developing DirectX and OS software for years. Think back to all the PC games that had serious performance issues until the video driver was patched.
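
To be clear about what I mean by SDK integration, here's a completely made-up sketch of what such a call could look like to a developer (none of these names come from MS; it's only the shape of the idea):

# Entirely hypothetical sketch of a cloud-offload call baked into a console
# SDK. None of these names are real; the point is that the SDK, not the game
# code, would handle servers, routing and latency, with a local fallback.
def cloud_offload(task_name, payload, timeout_ms=50, fallback=None):
    # A real SDK would dispatch this to a nearby datacenter and return the
    # result asynchronously; this sketch just runs the fallback so it stays
    # self-contained.
    return fallback(payload) if fallback else None

result = cloud_offload("ai.squad_planning",
                       {"squad_size": 4, "area": "docks"},
                       fallback=lambda p: {"orders": ["hold"] * p["squad_size"]})
print(result)   # {'orders': ['hold', 'hold', 'hold', 'hold']}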
---
Australian customs: Do you have a criminal record?
Me: I didn't think you needed one anymore.
#48 Tyrannical Posted 9/6/2013 12:55:20 AM
Also, you want MS to use virtual servers. It's really popular in industry now. VMware is the industry leader, but MS's Hyper-V is probably what they'll use. Virtual servers allow MS to quickly and seamlessly add more servers to whatever game is getting a lot of play. So no more launching a big game with too few servers; it can quickly carve out all the Halo V servers it needs from the cloud pool.
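
That's the kind of rule it buys you, roughly (purely illustrative; this is not how Azure or Hyper-V is actually configured, and the capacity number is invented):

import math

# Toy autoscaling rule: carve out however many game-server VMs the current
# player count needs from a shared pool.
PLAYERS_PER_VM = 64   # invented capacity figure

def vms_needed(current_players):
    return max(1, math.ceil(current_players / PLAYERS_PER_VM))

for players in (500, 20_000, 400_000):    # e.g. a launch-day spike
    print(players, "players ->", vms_needed(players), "VMs")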

Of course all the super awesome stuff will take a year or two.
---
Australian customs: Do you have a criminal record?
Me: I didn't think you needed one anymore.
#49 FaytalChaos (Topic Creator) Posted 9/7/2013 9:39:44 AM
Tyrannical posted...
First is that Cloud Gaming/Computing is NOT limited to just the X1. Sony is fully capable of going into this field



El-Wrongo.
Sony can't, because you can bet MS has this approach backed up with patents. It's also not about buying servers; it's about integrating the cloud into the software development kits so that game developers can easily take advantage of it with minimal latency.

The hardware CPU/GPU differences are tough to judge, because the quality of the console software development kits matters much more. The consoles are not using identical off-the-shelf video drivers like in the test; each will be customized to the bare hardware. MS has a huge advantage over Sony here because MS has been developing DirectX and OS software for years. Think back to all the PC games that had serious performance issues until the video driver was patched.


If your first statement is true, why is Nvidia already showing off the capability of doing so, if it's patented by MS?