
What's better: native 576p, or native 576p upscaled to 720p, on a 1080p monitor?

#41 DarkZV2Beta Posted 3/29/2014 1:10:31 AM
Conker posted...
http://research.microsoft.com/en-us/people/fengwu/gpu_icip_09.pdf

Interesting read, especially sections 3 and 4.


Doesn't relate to what we're talking about, though. That's a proprietary software scaling solution for remote desktop access.
Also, your wording of the question is terrible: asking which is easier implies that there is a difference when there isn't.

So where is your conclusive proof that there is any performance difference or extra demand for scaling to a higher resolution on a modern GPU?
---
god invented extension cords. -elchris79
Starcraft 2 has no depth or challenge -GoreGross
#42 Conker Posted 3/29/2014 2:49:36 AM
It actually relates quite a bit. Even though they focus specifically on hardware acceleration during remote desktop access and screen sharing, the basics apply to all types of scaling. You think various types of filters, passes, and processes aren't used outside of remote access and screen sharing?

Regardless:

http://en.wikipedia.org/wiki/Image_scaling

"With bitmap graphics, as the size of an image is reduced or enlarged, the pixels that form the image become increasingly visible, making the image appear "soft" if pixels are averaged, or jagged if not. With vector graphics the trade-off may be in processing power for re-rendering the image, which may be noticeable as slow re-rendering with still graphics, or slower frame rate and frame skipping in computer animation.

Apart from fitting a smaller display area, image size is most commonly decreased (or subsampled or downsampled) in order to produce thumbnails. Enlarging an image (upsampling or interpolating) is generally common for making smaller imagery fit a bigger screen in fullscreen mode, for example. In “zooming” a bitmap image, it is not possible to discover any more information in the image than already exists, and image quality inevitably suffers. However, there are several methods of increasing the number of pixels that an image contains, which evens out the appearance of the original pixels."


^What do you think that paper was talking about in reference to GPU acceleration? They are simply using various algorithms to more efficiently process the image through the graphics card for better/clearer image quality.
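
(To make the quoted passage concrete: here is a minimal sketch, in Python/NumPy, of the simplest possible enlargement, nearest-neighbour replication. The 720x576 and 1280x720 sizes are just illustrative assumptions; the point is that the output has more pixels but no new information.)

```python
import numpy as np

def nearest_neighbor_upscale(src: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Enlarge an H x W x 3 image by duplicating the closest source pixel.

    The output has more pixels than the input, but every value is a copy of
    an existing one -- no new detail is "discovered", as the article says.
    """
    in_h, in_w = src.shape[:2]
    # Map each output row/column back to its nearest source row/column.
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return src[rows[:, None], cols[None, :]]

# Illustrative PAL-style 576p frame (720x576) blown up to 720p (1280x720).
frame = np.random.randint(0, 256, (576, 720, 3), dtype=np.uint8)
print(nearest_neighbor_upscale(frame, 720, 1280).shape)  # (720, 1280, 3)
```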
---
Lets Go: Lions, Red Wings, Tigers, Pistons!
#43 DarkZV2Beta Posted 3/29/2014 3:01:06 AM
Conker posted...
It actually relates quite a bit. Even though they focus specifically on hardware acceleration during remote desktop access and screen sharing, the basics apply to all types of scaling. You think various types of filters, passes, and processes aren't used outside of remote access and screen sharing?

Regardless:

http://en.wikipedia.org/wiki/Image_scaling

"With bitmap graphics, as the size of an image is reduced or enlarged, the pixels that form the image become increasingly visible, making the image appear "soft" if pixels are averaged, or jagged if not. With vector graphics the trade-off may be in processing power for re-rendering the image, which may be noticeable as slow re-rendering with still graphics, or slower frame rate and frame skipping in computer animation.

Apart from fitting a smaller display area, image size is most commonly decreased (or subsampled or downsampled) in order to produce thumbnails. Enlarging an image (upsampling or interpolating) is generally common for making smaller imagery fit a bigger screen in fullscreen mode, for example. In “zooming” a bitmap image, it is not possible to discover any more information in the image than already exists, and image quality inevitably suffers. However, there are several methods of increasing the number of pixels that an image contains, which evens out the appearance of the original pixels."


^What do you think that paper was talking about in reference to GPU acceleration? They are simply using various algorithms to more efficiently process the image through the graphics card for better/clearer image quality.


GPU accelerating their reconstruction upscaling. Nothing to do with linear scaling done on frame output.
So are you just picking random things to argue about now?
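
(For anyone following along: "linear scaling done on frame output" means something like plain bilinear interpolation of the finished frame, where each output pixel is a weighted average of the four nearest source pixels. A rough Python/NumPy sketch, with the frame sizes assumed for illustration, as opposed to the reconstruction filtering the paper describes:)

```python
import numpy as np

def bilinear_upscale(src: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Plain bilinear scaling of a finished H x W x 3 frame.

    Each output pixel is a weighted average of the four nearest source
    pixels -- a small, fixed amount of work per pixel, unlike reconstruction
    upscaling, which tries to infer detail the source never contained.
    """
    in_h, in_w = src.shape[:2]
    # Continuous source coordinates for every output row/column.
    y = np.linspace(0, in_h - 1, out_h)
    x = np.linspace(0, in_w - 1, out_w)
    y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
    y1, x1 = np.minimum(y0 + 1, in_h - 1), np.minimum(x0 + 1, in_w - 1)
    wy = (y - y0)[:, None, None]   # vertical blend weights
    wx = (x - x0)[None, :, None]   # horizontal blend weights

    s = src.astype(np.float32)
    top = s[y0][:, x0] * (1 - wx) + s[y0][:, x1] * wx
    bot = s[y1][:, x0] * (1 - wx) + s[y1][:, x1] * wx
    return ((1 - wy) * top + wy * bot).astype(np.uint8)

# Assumed 720x576 frame scaled to 1280x720.
frame = np.random.randint(0, 256, (576, 720, 3), dtype=np.uint8)
print(bilinear_upscale(frame, 720, 1280).shape)  # (720, 1280, 3)
```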
---
god invented extension cords. -elchris79
Starcraft 2 has no depth or challenge -GoreGross
#44 Conker Posted 3/29/2014 3:49:17 AM (edited)
It is about general upscaling, using reconstruction methods/algorithms already in use to process the image more efficiently and get better quality out of the hardware. They chose to apply it to remote access and screen sharing, but it applies the same way to standard output from GPU > monitor. The type of scaling, and more importantly WHAT is being scaled (still images, multiple frames of images/videos, games, renders, etc.), can vary in how much processing and frame output it demands of the GPU.

Again, I'm NOT saying this is going to impact the end user in the sense of pushing their GPU, but it does vary how much is required of it (whether the GPU alone or the GPU plus the monitor handles the whole scaling process).

So, in response to your question:

Did you ever address that you don't have any evidence to support that basic scaling on a modern GPU is more demanding than running native? Because it seems like you've just kept ignoring it.


Yes, basic scaling on a modern GPU is more demanding than running native because of the processing required to upscale an image. End of story.
---
Lets Go: Lions, Red Wings, Tigers, Pistons!
#45 drink Posted 3/29/2014 4:36:30 AM
This is what GameFAQs is all about. A simple question about upscaling turns into this.
---
You didn't try google, you made that up...
Twist those dirty bags - Shake
#46 DarkZV2Beta Posted 3/29/2014 11:00:33 PM
Conker posted...
It is about general upscaling, using reconstruction methods/algorithms already in use to process the image more efficiently and get better quality out of the hardware. They chose to apply it to remote access and screen sharing, but it applies the same way to standard output from GPU > monitor. The type of scaling, and more importantly WHAT is being scaled (still images, multiple frames of images/videos, games, renders, etc.), can vary in how much processing and frame output it demands of the GPU.

Again, I'm NOT saying this is going to impact the end user in the sense of pushing their GPU, but it does vary how much is required of it (whether the GPU alone or the GPU plus the monitor handles the whole scaling process).

So, in response to your question:

Did you ever address that you don't have any evidence to support that basic scaling on a modern GPU is more demanding than running native? Because it seems like you've just kept ignoring it.


Yes, basic scaling on a modern GPU is more demanding than running native because of the processing required to upscale an image. End of story.


We aren't talking about reconstruction upscaling, though. At all. That's a completely different mess, and totally unrelated to this conversation.
Also, you keep saying that without ever providing any proof. I'm still waiting.
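
(For a rough sense of scale, a back-of-envelope estimate rather than a measurement, with the GPU throughput figure an assumption for a mid-range 2014 card: bilinearly scaling a 720x576 frame to 1280x720 at 60 fps works out to roughly a gigaflop per second, a fraction of a percent of what such a GPU can do, and in practice the output scaling is usually handled by fixed-function scaler hardware anyway.)

```python
# Back-of-envelope cost of bilinear-scaling 720x576 output to 1280x720 at 60 fps.
# Every figure below is an illustrative assumption, not a measurement.
out_pixels = 1280 * 720                 # 921,600 output pixels per frame
fps = 60
ops_per_pixel = 4 * 3 * 2               # 4 taps x 3 channels x ~2 ops (mul + add)
total_ops = out_pixels * fps * ops_per_pixel

assumed_gpu_flops = 1.0e12              # assume ~1 TFLOPS for a mid-range 2014 GPU
print(f"{total_ops / 1e9:.2f} GFLOP/s")                      # ~1.33 GFLOP/s
print(f"{100 * total_ops / assumed_gpu_flops:.3f}% of GPU")  # ~0.133%
```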
---
god invented extension cords. -elchris79
Starcraft 2 has no depth or challenge -GoreGross