
Is there any point to running beyond 1080p and 60fps?

#1 Phantom0708 Posted 1/9/2014 7:20:42 PM
It's my understanding the human eye can only see up to 60fps anyway (correct me if I'm wrong)

And games look great at 60 fps anyway.


And I have seen a lot of comparisons of 1080p and, say, 1440p, and I can't tell much of a difference; if anyone could explain that difference to me, I'd appreciate it.



I'm asking because I'm building a gaming computer right now and I'd like to know this, among other things.
---
GT: Devilwillcry42 PSN: Raidoukuzunoha42 LoL: EmiyaShirou42
My name is Kanji Tatsumi, you said I like dudes, prepare to die.
#2 xcmon3yx2 Posted 1/9/2014 7:21:40 PM
We can see much more than 60fps.
---
http://www.youtube.com/user/xcmon3yx777,(3DS FC: 5257-9927-9011), (Steam: xcmon3yx2), (XBL: HakudoshiV77360), (WoW: xcmon3yx2#1204)
#3 HydroCannabinol Posted 1/9/2014 7:23:45 PM
It's 100% fact that your eyes can tell the difference when running above both 1080p and 60fps. If you think otherwise, you're ignorant. Don't listen to your friends.
---
Steam ID: Mind_Explosion
I thought I chose very easy, not brand new to the game. - CheesyPhil
#4 oj_simpson007 Posted 1/9/2014 7:28:06 PM
First of all, the human eye doesn't see in frames per second, so saying we can't see more than 60fps is a misconception.

Second, I upgraded to a 1440p monitor a few months ago, and I will say that switching to 1440p isn't as big a difference as going back to 1080p once you're used to 1440p.

Also, when gaming at 1440p you don't really need AA.

Is there a point to running above 1080p/60fps? Yes.
Whether it's something you care to invest money in to play that high is totally up to you.

I love my setup and would definitely recommend it to anyone that has the money to do so.
---
i5 2500k @ 3.3GHZ || 8.00 GB DDR3 || ASUS P8H67-M PRO || Sapphire 7990 ||
ASUS PB278Q
#5 TimePharaoh Posted 1/9/2014 7:35:33 PM
Phantom0708 posted...
It's my understanding the human eye can only see up to 60fps anyway (correct me if I'm wrong)


lol
---
HE are genius, firstly. - ASlaveObeys
GestapoFAQs http://i.imgur.com/prqCDHz.png http://i.imgur.com/ooNGE4u.png
#6 Boge Posted 1/9/2014 7:37:59 PM
It's all up to you. If you can't see the difference beyond 1080p, then there is no reason to go beyond it. Same with framerate.
---
And so he prophesied that the sheeple will follow the stupid, and the stupid will rule the Earth.
#7 WerdnAndreW Posted 1/9/2014 7:46:50 PM
You won't see a difference unless the monitor runs at a refresh rate higher than 60hz.
---
Corsair 500r ~ P8Z68-V Pro ~ i5 2500k ~ Hyper 212+ ~ Corsair 2x4gb ~ TX750w v2 ~ 560 Ti ~ F3 1TB ~ Crucial M4 128GB ~ Xonar DG ~ Tt Meka G1 ~ Asus PA248Q
#8 SuigintouEV Posted 1/9/2014 8:09:23 PM (edited)
Phantom0708 posted...
It's my understanding the human eye can only see up to 60fps anyway (correct me if I'm wrong)


The human eye can perceive up to about 85Hz, but the brain probably can't tell much difference between roughly 60fps and 85fps. That said, if you really scrutinize, you can pick out mild flicker at 60Hz on static content that won't be there at 85Hz - but you'll never notice it when gaming.

What we can perceive, however, are:

1) Judder, from a display drawing one frame twice and the next only once - an uneven cadence (i.e. 30fps produces less judder than 40fps on a 60Hz display, and likewise 24fps is best suited to a monitor whose refresh rate is a multiple of 24 rather than 3:2 pulldown at 60Hz - see the sketch after this list)
2) Screen Tearing
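
To make the cadence point concrete, here's a rough Python sketch (my own back-of-the-envelope model, not anything official - it assumes an ideal vsynced display where refresh t shows source frame floor(t * fps / 60)):

from collections import Counter

REFRESH_HZ = 60

def cadence(fps, seconds=1):
    # Which source frame each refresh shows under ideal vsync; integer
    # math (floor division) avoids float rounding at exact boundaries.
    shown = [tick * fps // REFRESH_HZ for tick in range(REFRESH_HZ * seconds)]
    counts = Counter(shown)
    return [counts[i] for i in sorted(counts)]

print(cadence(30))  # [2, 2, 2, ...]    every frame held twice - even, smooth
print(cadence(40))  # [2, 1, 2, 1, ...] uneven hold times - visible judder
print(cadence(24))  # [3, 2, 3, 2, ...] the classic 3:2 pulldown cadence
print(cadence(58))  # all 1s except two 2s - the "even 58fps repeats two
                    # frames in that second" point made below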

Since screen tearing is very offensive, it makes sense to use vsync - but that gives a huge performance hit which can drag your fps below 60.

So the best solution tends to be:

1) A monitor designed for 24Hz, 30Hz, and 60Hz content - since 120Hz is a multiple of each of those, it can be a good option. Modern plasma TVs also switch between 96Hz and 60Hz depending on content.

2) A graphics card which, even with vsync on, never drops below 60fps - because as soon as it dips to even 58fps, that's two frames repeated in that second.

Of course, one final option is Nvidia's new G-Sync technology, which solves this kind of problem at the source. If it's what they claim it to be, I will forever buy G-Sync monitors.

Phantom0708 posted...
And I have seen a lot of comparisons of 1080p and, say, 1440p, and I can't tell much of a difference; if anyone could explain that difference to me, I'd appreciate it.


Depends on your display's DPI and viewing distance - the bigger the display and the closer you sit, the more it benefits from higher resolution.
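
To put rough numbers on that, here's a quick approximation (my own sketch, assuming a 16:9 panel) of angular resolution in pixels per degree:

import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    # Horizontal size of a 16:9 panel, derived from its diagonal
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    # Total horizontal viewing angle, then average pixels per degree
    half_angle = math.degrees(math.atan((width_in / 2) / distance_in))
    return horizontal_px / (2 * half_angle)

# A 27" panel viewed from about 24 inches:
print(pixels_per_degree(27, 1920, 24))  # ~37 px/deg at 1080p
print(pixels_per_degree(27, 2560, 24))  # ~49 px/deg at 1440p

Sit further back and both numbers climb past what you can resolve, which is one reason the difference can be hard to spot in casual comparisons.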

I've found, though, that even an average display benefits from stuff like SSAA, which requires a beast of a graphics card with lots of VRAM. Thus a 3D rendering resolution higher than the 2D output resolution can in fact produce a crisper image.
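
As a minimal illustration of that downsampling idea (a sketch of the concept only, not any game's actual implementation - Pillow stands in for the GPU's resolve step):

from PIL import Image

OUTPUT_RES = (1920, 1080)
SCALE = 2  # 2x2 SSAA = 4x the pixels, hence the VRAM appetite

# Stand-in for a frame drawn at the higher internal resolution
rendered = Image.new("RGB", (OUTPUT_RES[0] * SCALE, OUTPUT_RES[1] * SCALE))
# ... the scene would be drawn into `rendered` at 3840x2160 here ...

# Filtering back down averages several rendered samples per output
# pixel, which is what smooths edges and shimmer
final = rendered.resize(OUTPUT_RES, Image.LANCZOS)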
---
Hissatsu!!! Burst Spinning Giga Plasma Marble Screw Drill Maximum Tempest Break Punch - Pretty Arcshin Gurren Robo II
#9 LordSeifer Posted 1/9/2014 7:54:22 PM
The only useful post so far in this topic:

WerdnAndreW posted...
You won't see a difference unless the monitor runs at a refresh rate higher than 60hz.

---
^ this
#10 Bellum_Sacrum Posted 1/9/2014 7:58:24 PM (edited)
Fun facts:
~ Whether resolution differences are noticeable depends almost entirely on your monitor's size and your distance from it.
~ Your eyes can see unlimited fps, but motion blur is introduced above a certain threshold: your eyes expect pretty much anything above 35fps to suffer from motion blur. Video games don't emulate that effect, and as a result movement in games that run at high fps looks unnatural and sped up, especially to an observer. This is why companies that know what they're doing put a hard 30fps cap on cutscenes.
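
For what it's worth, a hard cap like that is just a frame limiter. A minimal sketch, where cutscene_playing() and render_cutscene_frame() are hypothetical engine calls, not a real API:

import time

CAP_FPS = 30
FRAME_TIME = 1.0 / CAP_FPS  # a ~33.3ms budget per frame

deadline = time.monotonic()
while cutscene_playing():            # hypothetical engine call
    render_cutscene_frame()          # hypothetical engine call
    deadline += FRAME_TIME
    remaining = deadline - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)        # wait out the rest of the frame budget
    else:
        deadline = time.monotonic()  # running behind - don't try to catch up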
---
"Now go ahead and leap ignorantly to the defense of wealthy game companies who don't know or care about you."