I play games on a laptop whose native display resolution is 1366x768. When I connect it to my TV (32" Samsung LED) over HDMI, the TV displays the same resolution and looks fine. However, if I play a game fullscreen, the TV switches to 1920x1080, with overscan and noticeably degraded image quality. Nvidia Control Panel tells me the TV is displaying at 1366x768, but my TV says 1920x1080. Not all fullscreen apps do this, but most do. What's going wrong here?
I think I've figured out part of the problem. In the Intel HD 4000 integrated graphics panel, the scaling option for my TV is usually set to "Maintain Display Scaling." When I run certain fullscreen apps, the setting changes to a 59Hz refresh rate and "Maintain Aspect Ratio," and my TV reports 1920x1080. When I change the Intel setting back to "Maintain Display Scaling," the TV resolution returns to 1366x768 like I want. Any way to stop this from happening?
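If the Intel panel keeps flipping the setting, one workaround (a sketch, not a guaranteed fix) is to snap the desktop mode back after the game has changed it. The snippet below uses the Win32 ChangeDisplaySettingsW call through Python's ctypes; it assumes Windows and that the driver actually exposes a 1366x768 mode on the HDMI output.

```python
# Sketch: force the display back to 1366x768 via ChangeDisplaySettingsW.
# Windows-only; run it after a fullscreen app has switched the mode.
import ctypes

# DEVMODE field flags from wingdi.h
DM_PELSWIDTH  = 0x00080000
DM_PELSHEIGHT = 0x00100000
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODE layout, ending at dmDisplayFrequency.
    # dmSize tells the API how large our structure is, so a
    # shortened definition like this is accepted.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
dm.dmPelsWidth = 1366   # assumes the driver exposes this mode
dm.dmPelsHeight = 768
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT

# Flags = 0 applies the change dynamically to the current display
# without writing it to the registry (it won't survive a reboot).
result = user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)
print("OK" if result == DISP_CHANGE_SUCCESSFUL else f"failed: {result}")
```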
Why not set it up so 1920x1080 is your native resolution when connected via HDMI? Unless you need mirroring?
1366x768 is as high as my laptop goes.
Even with an external display connected, you can't go into the resolution settings and set it to 1920x1080 when outputting to one display only?
Well, if that's the case, then I'd be surprised if anything about using an external display works properly.
No, he can. So long as he's only displaying on the TV, he should be able to select 1920x1080. He's most likely going to have to go into the TV's settings and look for something like "Just Scan" or a 16:9 aspect ratio mode.
I've noticed something else. In games, if I'm playing on my TV with the resolution set to 1366x768, the TV displays 1920x1080. But if I set the game's resolution to 1360x768, the TV displays 1366x768. Very weird. A bug in the Intel driver, maybe?
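One plausible explanation (an educated guess, not confirmed here): 1366x768 isn't a standard signal timing, since the width isn't divisible by 8, and many drivers only expose 1360x768 over HDMI. If the game asks for 1366x768 and the driver can't output it, it may fall back to 1920x1080 and scale; asking for 1360x768 matches a mode the driver actually has. You can check which modes the adapter exposes with the Win32 EnumDisplaySettingsW API; here's a sketch via Python's ctypes (Windows-only, same truncated DEVMODE layout as the earlier snippet, and it assumes the TV is the current display):

```python
# Sketch: list every display mode the adapter exposes, then check
# whether 1366x768 and 1360x768 are actually in the list.
import ctypes

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODE layout, ending at dmDisplayFrequency;
    # dmSize tells the API how large our structure is.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32

modes = set()
i = 0
while True:
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    # NULL device name = current display; i is the mode index (0, 1, ...)
    if not user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        break
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight))
    i += 1

for w, h in sorted(modes):
    print(f"{w}x{h}")
print("1366x768 exposed?", (1366, 768) in modes)
print("1360x768 exposed?", (1360, 768) in modes)
```

If 1366x768 is missing from that list while 1360x768 is present, the driver, not the game, is the reason fullscreen apps end up at 1920x1080.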