Video Game Resolutions are not 480i, 480p, 720p, 1080i or 1080p.
- Topic Archived
The easiest way to explain the difference between the resolution a game runs at and the resolution a game is displayed at would be to look at it in the following way...
Resolution(Native Resolution) = 640x480/1280x720/1920x1080
Output(Output Resolution) = 480p/720p/1080p
It is not possible for you to change the resolution of a console game. Switching between 480p/720p/1080p will not change the image being rendered; the same image is displayed regardless of whether you are running the game at 480p or 1080p, the only difference being the effects caused by scaling. Changing a game's actual resolution is another matter. In games that tie the camera to the pixel grid, raising the resolution shows more of the game world: the window/camera is wider at 1920x1080 than at 1280x720 (both 16:9), and wider still than 640x480 (which is 4:3). Nothing like this happens when you switch from 720p to 1080p, because you are not changing the game's resolution, just the output. A console game can look worse at 1080p than at 720p, or even 480p, depending on what the game's resolution is.
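To make the aspect-ratio point concrete, here is a short Python sketch (standard library only, purely illustrative) that reduces each of the resolutions above to its simplest ratio:

```python
from math import gcd

def aspect_ratio(width, height):
    # Divide out the greatest common divisor to get the simplest ratio.
    d = gcd(width, height)
    return (width // d, height // d)

for w, h in [(640, 480), (1280, 720), (1920, 1080)]:
    a, b = aspect_ratio(w, h)
    print(f"{w}x{h} -> {a}:{b}")
```

Running it shows 640x480 is 4:3 while 1280x720 and 1920x1080 are both 16:9, which is why switching between the two HD resolutions never changes the shape of the picture.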
A game can have a resolution of 640x480 and still be displayed at 1080p. This is called upscaling, and it produces pixelation and jaggies, resulting in a worse image at 1080p than at 480p. Likewise, a game can run at a resolution of 1920x1080 and be displayed at 480p. This is called downscaling, and it removes jaggies at the cost of blurring the image. Downscaling from 1920x1080 to 720p can be worthwhile, but only for a game that lacks graphics filters like anti-aliasing and therefore produces a large number of jaggies. Downscaling all the way to 480p will virtually eliminate the jaggies, but the blur produced by scaling that far is usually not worth the reduction.
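Real scalers use smarter filters (bilinear, lanczos, etc.), but a minimal nearest-neighbour sketch in Python shows why upscaling can never add detail: every output pixel is just a copy of the closest source pixel. The function and the tiny "image" here are made up for illustration:

```python
def scale_nearest(image, out_w, out_h):
    # Map each output pixel back to the nearest source pixel and copy it.
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Upscaling a 2x2 "image" to 4x4: each source pixel becomes a 2x2 block,
# which is exactly the blockiness you see when 480p is stretched to 1080p.
src = [[1, 2],
       [3, 4]]
big = scale_nearest(src, 4, 4)
# big == [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```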
On consoles, there is no such thing as displaying a game at 900p, 540p or anything other than 480i, 480p, 720p, 1080i or 1080p. When developers say a game runs at 900p, they mean the native resolution is 1600x900, as with Watch_Dogs on the PS4. You cannot display it at 900p, because the PS4 will not let you. You can upscale it to 1080p or downscale it to 720p, with the better image coming in at 720p.
The native resolution for Mario Kart 8 is 1280x720 and it does not implement any form of Anti-Aliasing. To get the best image quality for this game your Wii U should be set to 720p, which is the exact output the game was meant to be displayed at. Setting your Wii U to 1080p will produce more jaggies in Mario Kart 8, something that is already prevalent at 720p due to the lack of AA. If you set the Wii U to display at 480p it will nearly eliminate those jaggies, however the image will be blurred.
Setting your console to 480p/720p/1080p should never be compared to changing the resolution of that same game on a PC, as it is in no way the same thing. A fair comparison would be this: the game runs at 1600x900 on your console, you set the game to run at 1600x900 on your PC, and then you set both the console and the PC to display at 720p. This obviously excludes any differences from graphics filters such as anti-aliasing and anisotropic filtering on PC, or from higher-quality source assets (textures, models).
On PC you can set a game to run at a resolution of 3840x2160 even if your monitor or HDTV only supports up to 1080p. This is called downsampling (or supersampling), which is different from downscaling. The game renders at double your display's resolution in each dimension, i.e. four times the pixel count, producing a noticeably cleaner image than native 1920x1080. Textures in particular see a large increase in clarity, and the need for AA is practically eliminated because each displayed pixel is averaged from several rendered ones. However, this requires a very powerful GPU capable of rendering at that resolution while still retaining a playable framerate.
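Putting numbers on that: 3840x2160 is double 1920x1080 in each dimension, so it is four times the pixel count, and each displayed 1080p pixel can be averaged from a 2x2 block of rendered pixels. A trivial Python check:

```python
render = (3840, 2160)   # resolution the game is rendered at
display = (1920, 1080)  # resolution the monitor actually shows

pixels_rendered = render[0] * render[1]      # 8,294,400
pixels_displayed = display[0] * display[1]   # 2,073,600
samples_per_pixel = pixels_rendered // pixels_displayed
print(samples_per_pixel)  # 4 rendered pixels behind every displayed pixel
```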
To get the best image for a console game, you want to set your console to an output that matches the game's resolution. If a game's resolution has no matching output, set the console to the closest lower output. In practice this means that if the game's resolution isn't 1920x1080, your console should be set to 720p to receive the best possible image. Why? Because no console game has a resolution higher than 1920x1080, so anything that is not 1920x1080 will look better downscaled to 720p than upscaled to 1080p.
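The rule in this paragraph can be written as a tiny decision function. This is only a sketch of the post's advice, not any console's actual API; the output labels are just illustrative strings:

```python
# Standard console outputs and the native resolutions they correspond to.
OUTPUTS = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}

def best_output(native):
    # Exact match: display at the game's own resolution.
    for name, res in OUTPUTS.items():
        if native == res:
            return name
    # No matching output (900p, 792p, ...): per the post, downscaling to
    # 720p beats upscaling to 1080p.
    return "720p"

print(best_output((1600, 900)))   # 720p
print(best_output((1920, 1080)))  # 1080p
```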
It is also important to know what resolution your HDTV is. Like a game, your HDTV has only one native resolution but supports multiple output resolutions. If your HDTV's panel is 1360x768 but it accepts 1080p, a game with a resolution of 1920x1080 will actually look better at 720p than at 1080p on that HDTV. Just like a console, your HDTV will upscale or downscale whatever it receives at 480p, 720p or 1080p to fit its own panel. If your HDTV's native resolution is lower than 1920x1080, then your console should always be set to 720p to get the best possible image.
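The TV's own panel can be folded into the same rule. Again, this is a hedged sketch of the post's advice, not a real device API:

```python
def best_console_output(game_native, tv_native):
    # Per the post: 1080p only pays off when both the game and the TV
    # panel are actually 1920x1080; otherwise 720p is the better output
    # (or 480p for a true 480p game).
    if game_native == (1920, 1080) and tv_native == (1920, 1080):
        return "1080p"
    if game_native == (640, 480):
        return "480p"
    return "720p"

print(best_console_output((1920, 1080), (1360, 768)))  # 720p
```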
To find out the native resolution of your HDTV you will need to check its specifications, either in the manual or by searching the model number online. To find the native resolution of a console game you can simply search for it online, making sure the resolution listed is for your console, as it can vary from console to console.
Way too much, bud. I like the time period, theme, and storylines. Hell, I don't even worry about gameplay that much.
I want to be sucked into the game like a Stephen King book.
That's why I want Halo 5 and The Order. I've always enjoyed the stories of the Halo series, and The Order interests me for its time period and the question of where history skewed off for them to have those weapons and such.
Sorry, it can be nothing but pixels and look like the original Onion Knights Final Fantasy.
I'm sure it is a great read on resolution though man and it sounds really technical. Some folks are gonna enjoy it.
Some people spend an entire lifetime wondering if they made a difference in the world. But, the Marines don't have that problem.
Ronald Reagan, PotUS
Very informational topic my man. It seems like this is about all "certain" people talk about nowadays btw.
Death isn't the exit of existence. It's the entrance into eternity. R.I.P Zora Nelson 3/6/13 Forever loved
Oh my God who gives a s***
I'm not a kid, I just act like one
GT: ITS DAT DAM KID --- PSN: Lamburghini89///PS4, X1, Wii U, 360, 3DS
Either way, it's being scaled on a 1080p screen. I'd venture a guess that it would come down to what has the better scaler; your display, or your console.
Case | Mother Board | CPU (OC'd!) | Video Card x 2 | RAM | PSU | SSD | HDD | Some Fans | Monitor | Mouse | Keyboard
You CAN change the resolution of the game. If the game supports, say, 1080p natively, you can set the console to 720p and it will be rendered and output at 720p. It works like that all the way down the line when the game supports the higher resolutions, hence why games typically say 480/720/1080 and not just 1080. But even if the box only said 1080, the Xbox can still render the game at a lower resolution if that's what your Xbox is set to and your TV supports it.
If a game is being played on a standard-definition TV, it will output 480i. Using component plugs (YPbPr, the red/green/blue-tipped connectors) on an SDTV that supports them will run it in 480p, which looks much better than 480i.
If on a HDTV, a game runs natively at 720p, and you switch your console to 720p, then the game will run natively at 720p, and be outputted the same.
If a game is 720p and you have your Xbox set to 1080p, the Scaling Chip in the Xbox will scale it to be outputted at 1080p, but still natively will be running in 720p.
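For reference, the 720p-to-1080p step that scaling chip performs is a 1.5x stretch in each dimension. A non-integer ratio means source pixels can't simply be duplicated, so the scaler has to interpolate, which is what softens the image:

```python
scale_x = 1920 / 1280  # 1.5
scale_y = 1080 / 720   # 1.5
# 1.5 is not a whole number, so a 720p pixel can't map cleanly onto
# 1080p pixels; the scaler blends neighbours, slightly blurring edges.
print(scale_x, scale_y)
```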
Back a few years ago, when 1080p was first introduced, it was said that setting your Xbox to the native resolution of the game would yield best results visually.
Also, on the 360, using the VGA adapter, it was possible to run games at resolutions other than 480/720/1080.
Hope I educated your educating.
ASUS p8h61-M (Rev 3.0) | Intel CORE i3 2100 | 8GB Dual-Channel DDr3 | 500GB HDD | 600w PSU | nVidia GTX 770 4GB GDDr5
It was a good effort, tc. But, sadly, the people who need to read it the most will ignore it, and no one else really cares about resolution. To the point where some people don't even know what native means. You can't properly educate stupid people. Mostly, and most importantly, because they're stupid.
MSI Z87-G45 | I7-4771 Turbo | GTX 770 4GB GDDR5 | Corsair HX 850W |
Kingston Hyper X 8GB DDR3 | 1TB 128 MB Cache HD| Windows 8.1 | 360 | XB1