
PC gamers are saying Nvidia is ruining vsync in games like AC4 and COD Ghosts...

#21 PraetorXyn Posted 11/30/2013 5:23:37 PM
Vsync is buggy enough on its own. I always keep it turned off in games and use Adaptive Vsync in the Nvidia control panel instead.

They're trying to create Gsync for a reason, though I'm guessing it will flop: I don't see high-end IPS panels adopting it, since those are meant for graphics editing and the like, and that's what most people would rather game on.
---
Console war in a nutshell:
http://imgur.com/xA6GJZ9
#22 Thescyy Posted 11/30/2013 5:39:01 PM (edited)
DarkZV2Beta posted...
Thescyy posted...
Knighted Dragon posted...
From: Hi C | #007
But their subjects blindly follow them and even shell out $150-300 more per performance tier for a card. They risk nothing.



This has always baffled me. There are times when Nvidia is better, but most of the time they're not, or are flat-out worse, yet you still see hordes of Nvidia fanboys defending them. I was a pretty big AMD CPU fanboy back when they were actually good bang for your buck, but when I built my comp back in April I went with Intel, because it would be stupid to pay nearly as much for something that isn't even close performance-wise.


Except for the fact that in every generation since the 8800, Nvidia has been in the lead at the top end... Yes, it was more expensive, but you're just straight up lying if you try to claim they haven't been leading the majority of the time over the last 7-8 years. Yes, during DX9 AMD was top dog several times, but since DX10 it's been pretty much the norm for Nvidia to lead with their big dies.


Wasn't the 4000 series on top?
5000 was on top of its competing market most of the time, too, since Fermi was so late.


The 4000 series killed it on price/performance, but the 285 and 295 were still the fastest single- and dual-GPU cards respectively. I also suppose you have a point about the 5870, as far as how long it was the only new card on the market, but the 480 after driver updates was the clear winner in the long run, especially once games really started using tessellation.

I had a 4870, as they were a far better deal, but the 285 was the better card. The 4890 got closer but still wasn't as fast.
---
i7 3930k @ 4.5Ghz | H220 | asrock x79 extreme 9 | 16GB 1600mhz ram | SLI GTX 780 | Samsung 830 256GB SSD | 8TB HDDs | EVGA Supernova 1300 G2 | Fractal design R4
#23 Snadados Posted 11/30/2013 5:31:49 PM
The cranky hermit posted...
I'm not really sure what the big deal with G-Sync is supposed to be. Doesn't Vsync with triple buffering already solve the problem adequately? I know triple buffering causes a small amount of input lag, but I figure if you have such a need for performance that this isn't acceptable, you probably should be aiming for a steady 60 FPS anyway, which would eliminate the need for triple buffering.

Maybe it makes more sense if you have a 120Hz monitor, but as an owner of a 60Hz IPS, I have never thought "Gee, I wish my monitor had a variable refresh rate so that I didn't have to use Vsync any more!"
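The triple-buffering tradeoff in that quote can be roughed out in a few lines. This is just an illustration with numbers I picked (a 60 Hz panel and a fixed 10 ms render time; the function name is mine), not anything from the thread:

```python
# Sketch of triple buffering's latency cost on an assumed 60 Hz panel.
# With a spare back buffer the GPU never stalls waiting for the flip,
# but the monitor still flips only on vsync boundaries, so the frame
# shown can be up to one full refresh interval stale.

REFRESH = 1 / 60  # ~16.7 ms per refresh at 60 Hz

def display_lag_range(render_time):
    """Time from starting to render a frame to it appearing on screen."""
    best = render_time             # frame finished just before a flip
    worst = render_time + REFRESH  # frame finished just after a flip
    return best, worst

lo, hi = display_lag_range(render_time=0.010)  # steady 10 ms frames
print(f"lag: {lo * 1000:.1f} to {hi * 1000:.1f} ms")  # lag: 10.0 to 26.7 ms
```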


Vsync has some problems.
It can create input lag.
It limits you to frame rates that divide evenly into your monitor's refresh rate (60 Hz = 60, 30, 20, 15, 10 FPS),
so if you dip a few frames below 60 FPS for a few moments your card will output at 30 FPS; dip a few frames below 30 and now the card will output at 20.
On top of that, I've had games where vsync just did not do anything to fix screen tearing.
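Those steps fall straight out of the arithmetic. Here's a minimal sketch (my own illustration, assuming a 60 Hz monitor, a fixed per-frame render time, and numbers I picked) of how the effective rate snaps:

```python
# With double-buffered vsync the monitor only flips on 60 Hz boundaries,
# so a frame occupying N refresh intervals displays at 60/N FPS.
import math

REFRESH_HZ = 60
INTERVAL = 1 / REFRESH_HZ  # ~16.7 ms

def vsynced_fps(render_time):
    """Effective frame rate when every flip waits for a vsync boundary."""
    intervals = math.ceil(render_time / INTERVAL)  # whole refreshes used
    return REFRESH_HZ / intervals

for ms in (16.0, 17.5, 33.0, 34.5):
    print(f"{ms:4.1f} ms/frame -> {vsynced_fps(ms / 1000):.0f} FPS")
# 16.0 -> 60, 17.5 -> 30, 33.0 -> 30, 34.5 -> 20
```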

What gsync does is make your monitor refresh at whatever rate your card is running at, so you won't see the drastic drop from 60 to 30 FPS that you get with vsync when your card can only output at, say, 55 FPS.
Instead your card will output at 55 FPS and your monitor will refresh at 55 Hz.
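That difference is easy to put numbers on. A sketch under the same assumptions as above (60 Hz panel, 18 ms render time, both figures mine):

```python
# Same 18 ms/frame workload under fixed vs variable refresh.
import math

REFRESH_HZ = 60
RENDER_TIME = 0.018  # 18 ms per frame, ~55.6 FPS uncapped

# Double-buffered vsync: snaps down to the next divisor of 60.
vsync_fps = REFRESH_HZ / math.ceil(RENDER_TIME * REFRESH_HZ)

# G-Sync-style variable refresh: the panel refreshes when the frame
# is ready, so display rate tracks render rate directly.
vrr_fps = 1 / RENDER_TIME

print(f"vsync: {vsync_fps:.0f} FPS   variable refresh: {vrr_fps:.1f} FPS")
# vsync: 30 FPS   variable refresh: 55.6 FPS
```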
---
Have you accepted Raspberyl as your loli and savior?
#24 ATARIJAWA Posted 11/30/2013 7:00:59 PM
KabtheMentat posted...
The cranky hermit posted...
I'm not really sure what the big deal with G-Sync is supposed to be. Doesn't Vsync with triple buffering already solve the problem adequately? I know triple buffering causes a small amount of input lag, but I figure if you have such a need for performance that this isn't acceptable, you probably should be aiming for a steady 60 FPS anyway, which would eliminate the need for triple buffering.

Maybe it makes more sense if you have a 120Hz monitor, but as an owner of a 60Hz IPS, I have never thought "Gee, I wish my monitor had a variable refresh rate so that I didn't have to use Vsync any more!"


Sounds a lot like Nvidia trying to create a market where there really isn't one. My guess is it's because lately Nvidia has been getting smoked by AMD: A) consoles use all AMD stuff, and B) on price/performance ratio AMD beats the hell out of Nvidia.


You do know Nvidia turned down both Sony and Microsoft first, right?
---
Gamefaqs game rating system: 10 = Best Game Ever. 8-9: Crushing disappointment. Below 8: Total Garbage. This is getting ridiculous. People agreeing so far: 105
#25 KabtheMentat Posted 11/30/2013 7:12:31 PM
Which I'm sure some people at Nvidia are regretting now. Also, the only reason to turn them down was that Sony and Microsoft didn't want to pay what Nvidia wanted.
---
Big Money. Big Women. Big Fun.
Skillz Ferguson
#26 El_Zaggy Posted 11/30/2013 7:26:43 PM
AC4 is not too bad, but COD suffers from stupid coding that makes the game reallllllly bad.
#27 ATARIJAWA Posted 11/30/2013 7:58:02 PM
KabtheMentat posted...
Which I'm sure some people at Nvidia are regretting now. Also, the only reason to turn them down was that Sony and Microsoft didn't want to pay what Nvidia wanted.


Well AMD didn't really have a choice. They might have gone out of business otherwise.
---
Gamefaqs game rating system: 10 = Best Game Ever. 8-9: Crushing disappointment. Below 8: Total Garbage. This is getting ridiculous. People agreeing so far: 105
#28 DarkZV2Beta Posted 11/30/2013 8:09:46 PM
KabtheMentat posted...
Which I'm sure some people at Nvidia are regretting now. Also, the only reason to turn them down was that Sony and Microsoft didn't want to pay what Nvidia wanted.


I wouldn't be so sure. It's not like they're losing GRID server sales with Sony buying them up for Gaikai.
---
god invented extension cords. -elchris79