
Advantages and Disadvantages of V-Sync?

#11arleasPosted 8/21/2013 9:35:24 PM
I think the input lag comes from having more of a frame buffer (so triple buffering would look better but feel worse)... but I'm not 100% sure on that. I just know that the nVidia control panel has an option to increase the number of prerendered frames, and the higher you set it, the more input lag you got.

Normally, with Vsync, the instant your framerate drops below 60 it has to fall to an even fraction of the refresh rate (going straight from 60 fps to 30 fps even if the card could actually have managed 58 fps). However, nVidia has an "adaptive VSync" which is supposed to handle framerates below 60 better, allowing the framerate to drop a little without the drastic fall to 30 fps.
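
As a rough illustration of why that snap happens, here is a back-of-the-envelope sketch in Python. It assumes a simplified double-buffered model in which a finished frame can only be shown on a refresh tick, so a frame that takes even slightly longer than 16.67 ms waits for the following tick; real drivers and render-ahead queues complicate this.

```python
import math

# Simplified model: effective framerate under double-buffered vsync.
# A frame is displayed on the first refresh tick after it finishes,
# so its on-screen time is rounded up to whole refresh intervals.
def effective_fps(render_fps, refresh_hz=60):
    frame_time = 1.0 / render_fps        # seconds to render one frame
    refresh_interval = 1.0 / refresh_hz  # 16.67 ms at 60 Hz
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / intervals

print(effective_fps(60))  # 60.0 -> hits every refresh
print(effective_fps(58))  # 30.0 -> misses one tick, waits for the next
print(effective_fps(29))  # 20.0 -> next even fraction of 60
```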

I would only use Vsync in cases where the tearing was horrible and it wasn't possible to cap the framerate otherwise. Sometimes you can cap the framerate to a certain number and it'll remove most of the tearing.
---
http://badges.mypersonality.info/badge/0/19/193056.png
http://www.speedtest.net/result/2309944238.png
#12poopninjamvc3mkPosted 8/21/2013 9:39:30 PM
You think SR4 tearing is bad? Have you not played Mirror's Edge or even SR2 on consoles?
---
http://www.speedtest.net/result/2063467937.png Deleted my old sig since some mod apparently got offended because I mentioned Penn State.
#13Jiryn(Topic Creator)Posted 8/21/2013 10:16:34 PM(edited)
poopninjamvc3mk posted...
You think SR4 tearing is bad? Have you not played Mirror's Edge or even SR2 on consoles?


I honestly never noticed SR2 screen tearing on console, though I only played it the month of release.
I do hate the sound quality of SR2 on PC, however.
Never played ME.

As for SR4, yeah, it's tearing a ton, especially in quick camera cuts and cutscenes.
---
Mewtwo_Soul: "You can't trust a smile from Sony, but you'd better watch your back after shaking Microsoft's hand."
#14GreenMage7Posted 8/21/2013 10:18:03 PM
Marblesbunbun posted...
poopninjamvc3mk posted...
You think SR4 tearing is bad? Have you not played Mirror's Edge or even SR2 on consoles?


I honestly never noticed SR2 screen tearing on console, though I only played it the month of release.
I do hate the sound quality of SR2 on PC, however.
Never played ME.

As for SR4, yeah, it's tearing a ton, especially in quick camera cuts and cutscenes.


Well, how about you do this.

Enable V-Sync, and see if you find the game more acceptable.

If you don't like it, turn it off.

It's a single-player game; I don't think the aliens are going to t-bag you if you die from input lag. Well, it is Saints Row, so they might, actually...
#15Robtcee13Posted 8/21/2013 10:39:13 PM
You don't even lose anything if you die in SR3/4, do you? I've only died like twice in SR3, and that was because of the tremendous fall damage it has. The Saints Row games are easy, input lag or not.
---
If you think my name is silly, you should call me either Rob or LaughingThesaurus.
No need for being strictly formal on a message board, right?
#16ZeraphLordSPosted 8/21/2013 11:08:59 PM(edited)
tl;dr on vsync and input lag

double-buffered vsync with 60fps minimums @ 60hz
16.67ms input lag

double-buffered vsync with 30-60fps @ 60hz
33.33ms input lag

triple buffered vsync with fluctuating fps @ 60hz
33.33 to 50.00ms input lag (variable, changes per frame)


input lag also comes from low framerates, and can be reduced further by running at a higher fps (see the sketch after this list)

30fps @ 60hz = 0 - 33.33ms

60fps @ 60hz = 0 - 16.67ms

120fps @ 60hz = 0 - 8.33ms

240fps @ 60hz = 0 - 4.17ms

300fps (Source default max) @ 60hz = 0 - 3.33ms
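
Those ranges are just the length of one frame interval: an input can land anywhere between right after one frame is sampled and right before the next, so the worst case works out to roughly 1000 / fps milliseconds. A minimal sketch of that arithmetic, ignoring engine, driver, and display latency:

```python
# Sketch: worst-case latency added by the frame interval alone.
# An input arriving just after a frame was sampled waits almost a
# full frame before it can influence the next one.
def frame_interval_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120, 240, 300):
    print(f"{fps:>3} fps @ 60hz = 0 - {frame_interval_ms(fps):.2f} ms")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, ..., 300 fps -> 3.33 ms
```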


Vsync only decreases performance insofar as it caps you to your refresh rate (limits maximums), and double-buffering will limit you to framerates that are a factor of the refresh rate (i.e. 60hz = 60fps, 30fps, 20fps, 15fps, 10fps, etc.) (120hz = 120fps, 60fps, 40fps, 30fps, 24fps, 20fps, etc.).
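
To spell that divisor behavior out (same simplified double-buffered model as above, not accounting for adaptive or triple-buffered modes), the reachable rates are just the refresh rate divided by whole numbers:

```python
# Sketch: framerates reachable under double-buffered vsync are the
# refresh rate divided by a whole number of refresh intervals.
def vsync_steps(refresh_hz, count=6):
    return [refresh_hz / n for n in range(1, count + 1)]

print(vsync_steps(60))   # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
print(vsync_steps(120))  # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0]
```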


Screen tears occur without exception at all framerates (even sub-refreshrate fps), but become more visible depending on the frame differential. Consequently, games in which you move faster (your surroundings change more from one frame to the next), or games where you look around more (large changes in perspective), will be more adversely affected.

Certain patterns of camera movement also exacerbate the problem (horizontal panning, for instance), and likewise, a still camera will show no visible tearing.

Because the tear is from the previous frame to the next, a tear from frame 59 to frame 60 is going to look worse than a tear from frame 179 to frame 180 (within the same time period, so that's comparing 60fps to 180fps). You could look at it as if the image changed by 9 pixels from frame to frame in the first instance, but only 3 pixels in the second (more frames per second; less difference from the last one).
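
The 9-pixel-versus-3-pixel comparison is just the per-frame displacement for the same motion speed. A quick sketch of that arithmetic; the panning speed here is a made-up number chosen purely so the outputs match the example above:

```python
# Sketch: how far the image shifts between consecutive frames, which is
# roughly how large a visible tear offset can be. Speed is hypothetical.
def shift_per_frame(speed_px_per_sec, fps):
    return speed_px_per_sec / fps

speed = 540  # pixels/second of horizontal panning, for illustration only
print(shift_per_frame(speed, 60))   # 9.0 px between frames at 60 fps
print(shift_per_frame(speed, 180))  # 3.0 px between frames at 180 fps
```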


Nvidia's Adaptive Vsync is a bit of a misnomer (at least on release; I wouldn't know of any changes since), in that it only applies vsync when you exceed the refreshrate (thus capping you), but applies triple-buffered input lag the entire time. It only achieves more accurate temporal frame placement at sub-refreshrate fps (thus reducing microstutter).

If you don't notice tearing, leave it off. Turn it on when necessary. Some people are more sensitive to input lag than others. With that said, don't go looking for it (L4D has a switchable ingame Triple-Buffering option if you really feel the need to check it out).


Edit: Some games have broken vsync settings (some outright not working, others coming with mandatory fps caps and additional input lag), some games don't label which type they're using (triple or double; use an fps meter to check), and the control panel settings for both Nvidia and AMD only apply to OpenGL. D3DOverrider can be used to reliably force double-buffering or triple-buffering in most games when all else fails.
---
The best course of action is to just get the information you need, then get out while you're still alive. - destroy everything on GameFAQs
#17Jiryn(Topic Creator)Posted 8/22/2013 2:56:23 PM
ZeraphLordS posted...
tl;dr on vsync and input lag

[snip]


I didn't notice any input lag, but I am curious as to what causes it.
Thank you for all this information.
---
Mewtwo_Soul: "You can't trust a smile from Sony, but you'd better watch your back after shaking Microsoft's hand."
#18GarquillPosted 8/22/2013 2:57:04 PM
DISPLAY LAG, not input lag
---
backloggery.com/Garquille
"Education is what remains after one has forgotten everything he learned in school." ~ A. Einstein
#19Jiryn(Topic Creator)Posted 8/22/2013 3:14:13 PM
Garquille14 posted...
DISPLAY LAG, not input lag


Either works; both mean the time between when you hit the button and when the reaction happens on screen.
---
Mewtwo_Soul: "You can't trust a smile from Sony, but you'd better watch your back after shaking Microsoft's hand."
#20ShebeskiiPosted 8/22/2013 4:00:42 PM
I have to add to the post by ZeraphLord:

Nvidia's control panel does indeed apply vertical sync to all DirectX titles - it's not just OpenGL.

What you're referring to is Triple Buffering, which only works for OpenGL through Nvidia's and AMD's driver feature sets.
---
That which can be asserted without evidence, can be dismissed without evidence. - Christopher Hitchens