Friday, 21 April 2017

Is there much difference in input lag between downscaling and upscaling?

My TV died and I'm stuck with using a small Dynex LCD TV (native resolution 1360x768) and/or a Samsung LCD monitor (native resolution 2048x1152) until September for my Xbox One. The Samsung is hooked up to a Windows 10 PC and I've been streaming to it for games where input lag isn't a big concern (Halo Wars, Peggle, Walking Dead), and it actually works very well and looks very good.

I've been using the Dynex for games where you don't want any input lag (mostly Halo games). But believe it or not, the Dynex looks a little better when I have the Xbox set to 1080p rather than 720p. When I set the Xbox video output to automatic it picks 720p for the Dynex, but the display still has to upscale that a bit to reach its native 1360x768, interpolating extra pixels to get there. When I force 1080p it has to downscale instead, discarding pixels. I can't tell any difference while playing except that it looks a little better with the Xbox set to 1080p, but I wonder if the input lag is technically greater with downscaling than with upscaling?
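For what it's worth, here's a quick back-of-the-envelope sketch (Python, purely illustrative; the TV's actual scaler is a fixed-function chip, not software) of the resampling ratios the Dynex's 1360x768 panel has to apply in each case. Either way the scaler is doing work on every frame; 720p is just a slight stretch while 1080p is a roughly 0.71x shrink:

```python
# Rough scale factors a 1360x768 panel applies to each input resolution.
# Illustrative arithmetic only -- not how the TV actually implements scaling.

def scale_factors(src_w, src_h, dst_w=1360, dst_h=768):
    """Return the horizontal and vertical resampling ratios."""
    return dst_w / src_w, dst_h / src_h

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    sx, sy = scale_factors(w, h)
    kind = "upscale" if sx >= 1 else "downscale"
    print(f"{name}: {sx:.2f}x by {sy:.2f}x ({kind})")

# Output:
# 720p: 1.06x by 1.07x (upscale)
# 1080p: 0.71x by 0.71x (downscale)
```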

tldr: Does a display introduce more input lag through downscaling or upscaling?



Submitted by Fat_SMP_peruser
