
Is screen tearing "impossible" on a CRT?

#31 Xjph | Posted 7/5/2012 4:18:28 AM
DarkZV2Beta posted...
Also, an interesting thought for a new class of rendering would be rendering the central pixel in a set of 9 first, creating a low-resolution image, and then using the next 3 in a corner, followed by the final 5. That could eliminate tearing at a huge cost to IQ (could also be done in a 1->3 set).


You don't even need to go to those lengths. An interlaced screen is already quite resistant to the effects of tearing, though that's because the screen already has several hundred permanent tears, in a sense.

The problem with any non-progressive screen update is that you get other artifacting inherent in the draw method: combing/twitter on interlaced screens, or some kind of odd speckle effect with the method you describe.
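
To make the speckle point concrete, here's a rough sketch of that scattered update order (the pass groupings are my own guess at "center, then 3, then the final 5"; the post only gives the counts):

# Sketch: update the screen in three scattered passes over 3x3 tiles.
PASSES = [
    [(1, 1)],                                   # pass 0: center of each tile
    [(0, 0), (0, 2), (2, 0)],                   # pass 1: three corners (assumed)
    [(0, 1), (1, 0), (1, 2), (2, 1), (2, 2)],   # pass 2: the final 5
]

def update_screen(screen, frame):
    # Copy `frame` onto `screen` one scattered pass at a time.
    h, w = len(screen), len(screen[0])
    for group in PASSES:
        for ty in range(0, h, 3):
            for tx in range(0, w, 3):
                for dy, dx in group:
                    y, x = ty + dy, tx + dx
                    if y < h and x < w:
                        screen[y][x] = frame[y][x]
        # A new frame arriving between passes leaves stale pixels
        # scattered through every tile: speckle, not a tear line.

Swap buffers between passes and the stale pixels land everywhere at once, which is exactly the speckle.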
---
"I think the gene pool needs some chlorine..."
#32 DarkZV2Beta | Posted 7/5/2012 4:37:08 AM
But that kind of rendering method would never work with a modern pixel pipeline. On the other hand, an LCD that waits for an update from the system before refreshing each pixel could completely eliminate screen tearing while still using a linear update.
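
Roughly this, as a toy model (the per-pixel update queue is my own invention for illustration; real panels expose nothing like it):

def linear_refresh(panel, pending):
    # panel: 2D list of latched pixel values.
    # pending: dict of (y, x) -> new value pushed by the system.
    # The scan is still linear, but a pixel only changes when the
    # system has actually sent it a new value; otherwise it holds.
    for y in range(len(panel)):
        for x in range(len(panel[0])):
            if (y, x) in pending:
                panel[y][x] = pending.pop((y, x))
            # else: the pixel keeps its last latched value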
---
Gamers: We want better performance in Dark Souls! PC Port plx!
FromSoft: So you want framerate issues on PC?! OK!
#33 Xjph | Posted 7/5/2012 4:49:25 AM
I don't see how not updating unchanged pixels would eliminate tearing. The pixels below the tear would be changed and therefore drawn as normal. Assuming you're not futzing with the actual image data, nothing would be different in the end result. If you are, well, say hello to major MPEG-style artifacting.
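
To spell it out with a toy version of that hold-until-updated panel (hypothetical names, just for illustration):

def scan_with_midframe_swap(panel, frame_a, frame_b, swap_row):
    # Linear scan that only touches changed pixels, with the source
    # buffer swapping from frame_a to frame_b partway down the screen.
    h, w = len(panel), len(panel[0])
    for y in range(h):
        src = frame_a if y < swap_row else frame_b  # swap mid-scan
        for x in range(w):
            if panel[y][x] != src[y][x]:  # "only changed pixels" rule
                panel[y][x] = src[y][x]   # ...still latches the new frame

Everything above swap_row shows frame A, everything below shows frame B. Same tear, same place.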
---
"I think the gene pool needs some chlorine..."
#34 Xjph | Posted 7/5/2012 4:51:18 AM
...also, how exactly would an interlaced image not work? Just plug your computer into a 480i television. It'll look terrible, but taadaa, interlacing.
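
For reference, interlacing at its simplest is just alternating fields of every other scanline; a sketch, ignoring all the analog timing details:

def interlaced_fields(frame):
    even = frame[0::2]  # field 1: scanlines 0, 2, 4, ...
    odd = frame[1::2]   # field 2: scanlines 1, 3, 5, ...
    return even, odd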
---
"I think the gene pool needs some chlorine..."