Want proof that resolution doesn't matter as much as some of you seem to think?

#21 evecharmeve | Posted 3/2/2014 8:33:03 PM
And the ponies let out a thunderous Neigh
---
One and only person to ever finish Tank! Tank! Tank!
50 hours wasted.
#22 Reece504 | Posted 3/3/2014 8:26:34 PM
Again, it's funny how this happens, but because it's Sony it's not bad. If this were Xbox, people would be rioting. And like I said, Ryse is 900p, and the average person would think it's 1080p by looking at it.
---
PSN:Knickyknucklez 360:Kn1ckyknucklez Wii-U:KnickyKnucklez
#23 Valor_Phoenix | Posted 3/3/2014 10:42:28 PM
SS_MetalSonic posted...
I think he means that, according to the people who claim they can tell the difference between resolutions, it would be impossible for them not to notice this from the beginning.
And that's why it should be an obvious fact to those people. I don't know... something like that, I guess. There are a number of topics by people saying they can't see a difference, and a bunch of replies questioning people's eyesight and whatever else.
So according to these people it should have been an obvious fact when they played Killzone.

In the single-player mode, the game runs at full 1080p with an unlocked frame-rate (though a 30fps cap has been introduced as an option in a recent patch), but it's a different story altogether with multiplayer. Here Guerrilla Games has opted for a 960x1080 framebuffer, in pursuit of a 60fps refresh. Across a range of clips, we see the game handing in a 50fps average on multiplayer. It makes a palpable difference, but it's probably not the sort of boost you might expect from halving fill-rate.

Now, there are some mitigating factors here. Shadow Fall uses a horizontal interlace, with every other column of pixels generated using a temporal upscale - in effect, information from previously rendered frames is used to plug the gaps. The fact that few have actually noticed that any upscale at all is in place speaks to its quality, and we can almost certainly assume that this effect is not cheap from a computational perspective. However, at the same time it also confirms that a massive reduction in fill-rate isn't a guaranteed dead cert for hitting 60fps. Indeed, Shadow Fall multiplayer has a noticeably variable frame-rate - even though the fill-rate gain and the temporal upscale are likely to give back and take away fixed amounts of GPU time. Whatever is stopping Killzone from reaching 60fps isn't down to pixel fill-rate, and based on what we learned from our trip to Amsterdam last year, we're pretty confident it's not the CPU in this case either.


It's actually near impossible to notice. That's why no one would know it was happening without a tech article on the issue months after the game came out.

The point of the article is that for consoles the CPU is more of a bottleneck to framerate whereas on PC the GPU is the bottleneck, so lowering resolution improves PC framerates more.
---
~ PSN ID: ValorPhoenix ~ Raven [ / . \ ] Hubris
"There are more defective users than defective systems."
#24 Busstavo2 | Posted 3/3/2014 10:55:26 PM
The fault falls to the developers because if they can make 1080p for the 360 they can do it for the One.
---
"I..just..what....no.. not that...god damit....who made that.... I quit... I be back online in ten minutes because i am so good- Kadician
#25 NeoMonk | Posted 3/3/2014 10:59:30 PM
"Here's proof that what you all say doesn't matter"
*Uses said proof to bash the competition*

Thanks for this thread it's basically everything these forums embody :)
---
"The Xbox One board isn't the place for personal anecdotes, joke topics or fanboy affair." Gamefaqs Moderator
#26 MajesticFerret | Posted 3/3/2014 11:12:32 PM
So what I'm getting from this article is that KZ uses a trick that makes the game look 1080p when it technically isn't?

Who the hell cares? If it looks like a duck and quacks like a duck, it's a duck.

The only thing this article proved is that GG are damn good at optimizing and figured out a way to make a 960x1080 image look like 1080p for less computational power.


Sounds like devs need to learn a thing or 2 from them.
---
Sanity is a one trick pony, all you get with it is rational thought, but with crazy the sky's the limit.
#27 Sith Jedi | Posted 3/3/2014 11:14:57 PM
Mr Bump posted...
ElPolloDiablo87 posted...
Mr Bump posted...
Congrats on not actually reading any of that. First of all, the main game is full 1080p; secondly, the MP is using a clever interlace technique that they clearly say gives a full 1080p image, just over two frame buffers. That is different to it actually being a lower realised resolution.

The fact that few have actually noticed that any upscale at all is in place speaks to its quality, and we can almost certainly assume that this effect is not cheap from a computational perspective

It renders at 960x1080 and uses the previous frame to fill in the missing pixels. Totally not full 1080p.


I agree, it is not even slightly full 1080p. However, it is very clever and, as they point out, almost certainly not 'free' in terms of CPU/GPU cycles, and it presents an excellent image. I can easily tell the difference between 720p and 1080p on my screen, but I would have been hard-pressed to tell that the Killzone MP wasn't true 1080p. It definitely has less lighting and fewer effects than the SP, but the resolution looks the same as the SP tbh - and that's on a 120" screen. Titanfall, if the beta is anything to go by, will just be using the X1's internal scaler, so it will very obviously not be 1080p.


It's not clever at all, TV and games have been using that technique forever... what do you think the i in 1080i stands for?

It is pretty funny seeing many of the people who made a huge deal about XB1 being a lower resolution now saying that the resolution doesn't matter because you can't notice it.
---
https://www.youtube.com/user/Bandin6385/
#28 Xeeh_Bitz | Posted 3/3/2014 11:54:46 PM (edited)
Sith Jedi posted...
1080i


Interlaced: the odd-numbered scan lines (540 of the 1080) are displayed first, and roughly 1/60 of a second later the even-numbered lines are filled in.

Progressive means that instead of filling in the fields at different times, all the lines are drawn at once.

It's not upscaling or filling in missing pixels; the pixels are all there, interlacing is just how they're displayed.
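The field weaving described above can be sketched in a few lines. This is a toy illustration only (the function name and the "scan lines as strings" representation are mine, not from any real video API) of how two half-height fields combine into one full progressive frame:

```python
# Toy sketch (hypothetical names): weave two interlaced fields into one frame.
# Each field holds every other scan line; together they contain all the lines.

def deinterlace_weave(even_field, odd_field):
    """Interleave two fields (lists of scan lines) into one full frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # lines 0, 2, 4, ...
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

# Toy example: a 4-line "frame" split into two 2-line fields.
even = ["line0", "line2"]
odd = ["line1", "line3"]
print(deinterlace_weave(even, odd))  # ['line0', 'line1', 'line2', 'line3']
```

Note the difference from what Shadow Fall does: here both fields show the same moment in time once woven, whereas in broadcast 1080i the two fields are captured about 1/60 of a second apart.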
---
3770K | 780 Ti x 2
Steam: Xeeh Origin: TurboPeasant
#29 jimmyCCFC | Posted 3/4/2014 12:13:24 AM
What this boils down to is that people say resolution matters, yet were unable to notice a resolution change in a game.
#30 Valor_Phoenix | Posted 3/4/2014 12:29:51 AM
Xeeh_Bitz posted...
It's not upscaling or filling in missing pixels, the pixels are there, the interlaced is just how they're displayed.

It's not interlaced though.

The game renders a smaller 960x1080 frame, stretches it sideways, does a temporal upscale using the last frame to restore detail, and then displays a 1080p image.

It's 1080p; only half the pixels in each frame are newly rendered, but they're blended with the old frame to create a complete new frame.
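That blend can be sketched as a toy example. All names here are hypothetical, and this is a deliberate simplification: a real engine like Guerrilla's uses motion-compensated reprojection, not a plain copy of the previous frame's columns.

```python
# Toy sketch (hypothetical names): fill a full-width row from newly
# rendered alternate columns plus the previous frame's other columns.

def reproject_row(new_columns, prev_row, phase):
    """Build a full-width row of pixels.
    new_columns: freshly rendered pixels for columns where index % 2 == phase.
    prev_row:    the previous frame's full-width row.
    phase:       0 or 1, alternating each frame (even vs odd columns)."""
    row = list(prev_row)              # start from last frame's pixels
    for i, px in enumerate(new_columns):
        row[2 * i + phase] = px       # overwrite every other column
    return row

# Toy 6-pixel row: frame N renders even columns, frame N+1 the odd ones.
prev = ["a0", "a1", "a2", "a3", "a4", "a5"]
new_even = ["b0", "b2", "b4"]
print(reproject_row(new_even, prev, phase=0))
# ['b0', 'a1', 'b2', 'a3', 'b4', 'a5']
```

This is why only half the fill-rate is spent per frame while the output buffer is still a full 1920x1080: the other half of each frame is borrowed from the frame before it.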
---
~ PSN ID: ValorPhoenix ~ Raven [ / . \ ] Hubris
"There are more defective users than defective systems."