
nVidia G-Sync

#41 JKatarn Posted 10/19/2013 10:50:44 AM
Asellus posted...
http://blogs.nvidia.com/blog/2013/10/18/g-sync/

+ Might finally put the "V-sync on or off" debate to rest by letting monitors synchronize with the video card instead of trying to do it vice-versa, eliminating both the tearing that having it off causes and the input lag that having it on causes.

- It'll require a monitor with their new chip built into it to function.


Yeah, that's not going to happen....I don't see legions of gamers with their chosen monitors/Korean 1440p panels going out and buying a new monitor for some "G-Sync" chip...nice try Nvidia.
---
Asus P8Z68-V LE | Core i7 2600K | 8GB G.Skill Ripjaws DDR3 | Gigabyte GeForce GTX 660 Windforce OC
PS3 | PS2 | PSP | Wii | 3DS | DS | X-Box 360 | X-Box | NES
#42 DV8ingSources Posted 10/19/2013 11:01:36 AM
JKatarn posted...
Asellus posted...
http://blogs.nvidia.com/blog/2013/10/18/g-sync/

+ Might finally put the "V-sync on or off" debate to rest by letting monitors synchronize with the video card instead of trying to do it vice-versa, eliminating both the tearing that having it off causes and the input lag that having it on causes.

- It'll require a monitor with their new chip built into it to function.


Yeah, that's not going to happen....I don't see legions of gamers with their chosen monitors/Korean 1440p panels going out and buying a new monitor for some "G-Sync" chip...nice try Nvidia.


Not only that, but you need a newer Nvidia card as well. That said, I think you underestimate the appeal to techies. This is more or less a stepping stone for what really needs to happen. Nvidia put a ton of money into the research and development and will likely try to recoup those costs a little. Still, I do think they will eventually release the tech to the masses.

I can see this sort of technology being implemented into televisions and other devices in the future. It's a logical step forward for the way visual media devices are handled. It's cumbersome and prohibitively expensive now, but I also remember when a 50-inch HDTV was over $10k.
---
2500k @ 4.4 | P8Z68-V Pro | H80 | 8GB | 670 | 256 ssd | 6Tb hdd | Win 8 64bit | ax1200w | BD burner | cm690II
Steam: DV8ing1
#43 JKatarn Posted 10/19/2013 6:05:35 PM
DV8ingSources posted...
JKatarn posted...
Asellus posted...
http://blogs.nvidia.com/blog/2013/10/18/g-sync/

+ Might finally put the "V-sync on or off" debate to rest by letting monitors synchronize with the video card instead of trying to do it vice-versa, eliminating both the tearing that having it off causes and the input lag that having it on causes.

- It'll require a monitor with their new chip built into it to function.


Yeah, that's not going to happen....I don't see legions of gamers with their chosen monitors/Korean 1440p panels going out and buying a new monitor for some "G-Sync" chip...nice try Nvidia.


Not only that, but you need a newer Nvidia card as well. That said, I think you underestimate the appeal to techies. This is more or less a stepping stone for what really needs to happen. Nvidia put a ton of money into the research and development and will likely try to recoup those costs a little. Still, I do think they will eventually release the tech to the masses.

I can see this sort of technology being implemented into televisions and other devices in the future. It's a logical step forward for the way visual media devices are handled. It's cumbersome and prohibitively expensive now, but I also remember when a 50-inch HDTV was over $10k.


The question is, will it have applications beyond the realm of gaming? It would obviously be beneficial to bleeding-edge PC gamers (all 20 of them), but with console games typically running at/limited to 30/60 FPS anyway, I don't see the benefit there, and I can't (personally) see the benefit beyond reducing the tearing and input lag that are a bane to gamers. Time will tell.
---
Asus P8Z68-V LE | Core i7 2600K | 8GB G.Skill Ripjaws DDR3 | Gigabyte GeForce GTX 660 Windforce OC
PS3 | PS2 | PSP | Wii | 3DS | DS | X-Box 360 | X-Box | NES
#44 Master_Bass Posted 10/19/2013 6:13:14 PM
PraetorXyn posted...
Snuckie7 posted...
I'm just hoping that the module will be compatible with panels other than TN :/


This...

Yeah, this. I can never go back to a TN panel again after using my U3014 for a few months.
---
Many Bothans died to bring you this post.
#45 DV8ingSources Posted 10/19/2013 6:19:22 PM
JKatarn posted...
The question is, will it have applications beyond the realm of gaming? It would obviously be beneficial to bleeding-edge PC gamers (all 20 of them), but with console games typically running at/limited to 30/60 FPS anyway, I don't see the benefit there, and I can't (personally) see the benefit beyond reducing the tearing and input lag that are a bane to gamers. Time will tell.


The only additional use I can see is properly handling different source frame rates: 24fps movies and television shows versus the possible 48fps of the future. Perhaps even the PAL vs. NTSC constraints would all but disappear in a better way. It's mainly a gaming advantage for sure. I remember Darksiders (the first game) had atrocious screen tearing on the consoles.
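Rough numbers on the film side of that (my own back-of-the-envelope in Python, not anything from the article): a fixed 60Hz panel has to show 24fps material with 3:2 pulldown, while a variable-refresh display could just hold every frame for 1/24th of a second.

# Back-of-the-envelope (made-up illustration): frame hold times for 24fps
# content on a fixed 60Hz display vs. a variable-refresh display.
FILM_FPS = 24.0
FIXED_HZ = 60.0

# Fixed 60Hz needs 3:2 pulldown - frames alternate between being held for
# 3 refreshes and 2 refreshes, so on-screen time jitters between ~50ms and ~33ms.
pulldown_ms = [3 / FIXED_HZ * 1000, 2 / FIXED_HZ * 1000]
print("fixed 60Hz hold times:", [round(t, 1) for t in pulldown_ms], "ms")

# Variable refresh can hold every frame for exactly 1/24s (~41.7ms) instead.
print("variable refresh hold time:", round(1000 / FILM_FPS, 1), "ms")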
---
2500k @ 4.4 | P8Z68-V Pro | H80 | 8GB | 670 | 256 ssd | 6Tb hdd | Win 8 64bit | ax1200w | BD burner | cm690II
Steam: DV8ing1
#46 ZeraphLordS Posted 10/20/2013 3:16:09 AM (edited)
SinisterSlay posted...
ZeraphLordS posted...
Yay Carmack. With him being a large proponent of reduced input lag and strobed displays, I'm impressed.

SinisterSlay posted...
Wouldn't it still start tearing if your framerate went above the maximum for the monitor?


argh that's not what causes tearing

tearing still happens below the refresh rate

if a frame was completed every 16.67ms and the refresh interval was also 16.67ms (but offset by 8.33ms), you would have exactly 60fps but a huge ugly tear running constantly through the middle of the screen

vsync literally makes sure that it never displays unless they're lined up

think of it more like a cinema projector, which flashes only when a complete frame is in place and not when it's halfway between frames on the reel, rather than an overflow issue


Yes it is, you get tearing whenever your frame rate and your refresh rate differ.
G-sync can apparently reduce the refresh rate to match framerate. But I'm saying you will still get tearing if your framerate goes above the maximum refresh rate.

So if I play Monster Truck Madness, I expect tearing, since so far there is no monitor with a refresh rate over 400Hz. G-Sync would have to be used with an FPS limiter in this case.


No, that's not the reason. You get tearing (traditionally) when it's time to display a frame (the monitor polls the buffer) and the buffer is part previous frame, part new frame - which can happen at fps above the refresh rate, below it, and even when the refresh interval and time-to-render are the same, as long as they're out of sync.

Normally the monitor says hi every 16.67ms @ 60Hz, regardless of whether the buffer is half done or not; traditional double-buffered vsync waits until the next refresh to hand over a complete frame, but also ceases rendering until that frame has been sent to the monitor.

If you can render frames faster than the minimum refresh interval, you could just render to two parallel buffers alternately and choose the most recent complete one to push to the monitor when it can refresh again (as with true parallel triple buffering, not the sequential crap we see everywhere).

Limiting fps is not a requirement for achieving tear-free rendering.
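A tiny sketch of that in Python (hypothetical numbers, no real display API involved) - fps exactly equal to the refresh rate, but out of phase, tears on every single refresh:

# Hypothetical sketch: frame time == refresh interval, but offset by half a
# refresh, so every scanout catches the buffer half-written - a constant
# tear through the middle of the screen even at a "perfect" 60fps.
REFRESH = 16.67   # ms between monitor refreshes
RENDER  = 16.67   # ms the GPU takes per frame
OFFSET  = 8.33    # ms phase difference between rendering and scanout

for n in range(5):
    scanout_time = n * REFRESH
    # fraction of the buffer already overwritten with the new frame at scanout
    progress = ((scanout_time - OFFSET) % RENDER) / RENDER
    print(f"refresh {n}: {progress:.0%} new frame on top, old frame below -> tear")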

DV8ingSources posted...
That extra clarity in that image is NOT from gsync.

Something else is at play there. What it does do is allow you to read better because it moves in a more natural smooth motion rather than jerking around due to stutters and tears.

That image isn't at all indicative of gsync at play.


You're correct that it isn't gsync, but it does show that they managed to implement a variable strobe without any variation in overall brightness (previously the backlight would fire at a set interval with a set brightness to give the illusion of being constantly lit).

It's the same effect you get from a CRT, or a LightBoost 2-hacked 120Hz+ monitor.
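If I had to guess at the brightness part (purely my assumption of how it could work, not anything Nvidia has detailed): keep the duty cycle constant, so the strobe pulse stretches with the refresh interval and the average light output stays the same.

# My assumption, not Nvidia's spec: hold the backlight duty cycle constant so
# perceived brightness doesn't change as the refresh interval varies.
TARGET_DUTY = 0.25                        # fraction of each refresh the backlight is lit

for refresh_ms in (16.7, 20.0, 25.0):     # 60Hz, 50Hz, 40Hz refresh intervals
    pulse_ms = TARGET_DUTY * refresh_ms
    print(f"{refresh_ms}ms refresh -> {pulse_ms:.1f}ms strobe pulse")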
---
The best course of action is to just get the information you need, then get out while you're still alive. - destroy everything on GameFAQs
#47 MaxCHEATER64 Posted 10/20/2013 3:40:53 AM
Isn't the V-Sync problem already solved by Triple-Buffering? Am I understanding this wrong?
---
i5-3570K | HD 7850 | Z77-D3H | 700W | Intel 550 180GB | Seagate Barracuda 1T
http://i.imgur.com/xUwa3va.png
#48 Asellus (Topic Creator) Posted 10/20/2013 8:48:44 AM
Isn't the V-Sync problem already solved by Triple-Buffering? Am I understanding this wrong?

Not exactly, it's better for input latency than standard double buffering since the GPU isn't stalled between frames - it keeps working on new frames behind the scenes, and when the monitor says it's ready for a new one it hands over the most current one it's finished. G-Sync, however, lets the monitor time itself to the GPU's updates right on the mark - the monitor refreshes in time with exactly when the video card finishes its work, rather than the video card passing frames to the monitor every 1/60th of a second or what have you.
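The difference, as a rough Python sketch (hypothetical timings, nothing from Nvidia's actual implementation):

# Hypothetical sketch: when refreshes happen on a fixed-rate monitor vs. a
# monitor that refreshes whenever the GPU finishes a frame (the G-Sync idea).
import itertools

frame_times = [18.0, 14.0, 20.0, 16.0]    # made-up per-frame render times in ms

# Fixed 60Hz monitor: refreshes land every 16.67ms no matter what the GPU does.
print("fixed refreshes at:", [round(n * 16.67, 1) for n in range(1, 5)], "ms")

# G-Sync-style monitor: each refresh happens the moment a frame completes.
completions = list(itertools.accumulate(frame_times))
print("frame-driven refreshes at:", [round(t, 1) for t in completions], "ms")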
#49 MaxCHEATER64 Posted 10/20/2013 9:22:39 AM
Asellus posted...
Isn't the V-Sync problem already solved by Triple-Buffering? Am I understanding this wrong?

Not exactly, it's better for input latency than standard double buffering since the GPU isn't stalled between frames - it keeps working on new frames behind the scenes, and when the monitor says it's ready for a new one it hands over the most current one it's finished. G-Sync, however, lets the monitor time itself to the GPU's updates right on the mark - the monitor refreshes in time with exactly when the video card finishes its work, rather than the video card passing frames to the monitor every 1/60th of a second or what have you.


Yes. So wouldn't that solve the problem?
From what I understand, V-Sync's frame rate drop problem is due to the double-buffer system, which forces the monitor to "skip a frame" every time the GPU can't produce a frame fast enough to fill the buffer, correct?
And triple buffering solves this by having an extra frame to fall back on if the GPU can't work fast enough, so you don't get the silly frame rate drops that double buffering or single buffering has.

---
i5-3570K | HD 7850 | Z77-D3H | 700W | Intel 550 180GB | Seagate Barracuda 1T
http://i.imgur.com/xUwa3va.png
#50 Asellus (Topic Creator) Posted 10/20/2013 10:04:15 AM
Yes. So wouldn't that solve the problem?

Not exactly - it helps, but it's not the same thing, and it's still working around update schedules that are fixed to arbitrary amounts of time (usually every 1/60th of a second) rather than being in tune with the work as it's actually being done. With G-Sync, if your framerate is, say, 55 fps, your refresh rate can also be 55Hz. The monitor starts drawing the moment your card has finished drawing a frame rather than at "the nearest 1/60th-of-a-second interval after your card has finished drawing a frame".

From what I understand, V-Sync's frame rate drop problem is due to the double-buffer system, which forces the monitor to "skip a frame" every time the GPU can't produce a frame fast enough to fill the buffer, correct?

With standard double buffering the video card will produce a frame and then stop until the monitor's done with the prior one. Because of this, if the video card can't produce a new frame in the space of a monitor update, your frame rate effectively drops to the next integer fraction of the refresh rate - i.e. 55 fps without v-sync becomes 30 with it, 28 becomes 20, and so on down the list (60/2, 60/3, 60/4...).
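Quick arithmetic on that (my own numbers, just showing the divisor effect):

# With double-buffered vsync on a 60Hz monitor each frame is held for a whole
# number of refreshes, so the effective framerate snaps to 60/1, 60/2, 60/3...
import math

REFRESH_MS = 1000.0 / 60.0                # 16.67ms per refresh

for raw_fps in (75, 55, 28, 19):
    frame_ms = 1000.0 / raw_fps           # time the GPU needs per frame
    refreshes_held = max(1, math.ceil(frame_ms / REFRESH_MS))
    print(f"{raw_fps} fps uncapped -> {60 / refreshes_held:.0f} fps with vsync")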

And triple buffering solves this by having an extra frame to fall back on if the GPU can't work fast enough, so you don't get the silly frame rate drops that double buffering or single buffering has.

... not exactly? Triple buffering doesn't mean having an extra frame to fall back on, it means you have an extra render buffer. You have one buffer that's being drawn by the monitor and two buffers in which you draw alternating frames back and forth until the monitor's finished with the first and ready to update, at which point you give it the most current completed frame from the two you've been bouncing between. It's better than double buffering in a number of ways - the GPU doesn't have idle phases while it waits for the monitor to finish drawing, so you don't get the frame rate hit, and it helps with input latency since you're at least guaranteed the most recent completed frame the GPU has rendered.
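In rough Python (a hypothetical sketch of the idea, not any real swap-chain API):

# Hypothetical sketch of true triple buffering: the GPU renders back-to-back
# into two back buffers, and at each refresh the monitor gets the most
# recently *completed* frame - older completed frames simply get skipped.
REFRESH_MS = 16.67
frame_times = [12.0, 22.0, 15.0, 9.0, 30.0, 14.0]   # made-up render times in ms

completed = []                     # (finish_time, frame_id) for finished frames
t = 0.0
for frame_id, cost in enumerate(frame_times):
    t += cost                      # GPU never idles, it just keeps rendering
    completed.append((t, frame_id))

for n in range(1, 6):
    refresh_at = n * REFRESH_MS
    ready = [fid for finish, fid in completed if finish <= refresh_at]
    shown = ready[-1] if ready else None     # newest complete frame wins
    print(f"refresh at {refresh_at:.1f}ms shows frame {shown}")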

There is also something sometimes labeled "triple buffering" which involves keeping a "buffer" of completed frames ahead of the one you're actually seeing; it helps avoid hits to your framerate but is absolutely ghastly in terms of input lag (unsurprisingly, since you're seeing stuff 3-5 frames behind what's actually happening in the game).
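The lag math on that render-ahead version (just my arithmetic): a queue of N finished frames adds roughly N frame-times on top of everything else.

# Rough arithmetic (mine): extra input lag from a render-ahead queue.
for fps in (60, 30):
    frame_ms = 1000.0 / fps
    for queued in (3, 5):
        print(f"{queued} queued frames at {fps}fps -> ~{queued * frame_ms:.0f}ms extra lag")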