
I have an Aorus 4K display [Gigabyte], which is technically "not just a TV screen," and to defeat the overscan mismatch [having found no working fix in either the OS or the display's settings] I plug the HDMI cable into a DVI->HDMI adapter [instead of running HDMI directly into the display's HDMI input].

For some reason, this dongle hack removes the need for the overscan settings [which never seem to work or stick anyway]. But of course there is then no 4K output [which is fine].
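The loss of 4K through the dongle is likely a bandwidth limit: a single-link DVI adapter (the common passive kind) tops out at a 165 MHz pixel clock, which is enough for 1080p60 but nowhere near 4K60. A rough sketch of the arithmetic, assuming standard CEA-861 frame timings (active resolution plus blanking):

```python
# Why a passive single-link DVI->HDMI adapter caps you below 4K:
# single-link DVI is limited to a 165 MHz pixel clock.
DVI_SINGLE_LINK_MHZ = 165

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (including blanking) * refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# CEA-861 totals: 1080p60 scans 2200x1125 pixels/frame, 4K60 scans 4400x2250
clk_1080p60 = pixel_clock_mhz(2200, 1125, 60)  # 148.5 MHz -> fits under 165
clk_4k60 = pixel_clock_mhz(4400, 2250, 60)     # 594.0 MHz -> far over the limit

print(clk_1080p60 <= DVI_SINGLE_LINK_MHZ)  # True
print(clk_4k60 <= DVI_SINGLE_LINK_MHZ)     # False
```

So the adapter silently forces a 1080p-class mode, and as a side effect the display stops applying its TV-style overscan processing to the DVI-originated signal.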



From my previous reading on the subject, HDMI comes from the TV world, and DVI comes from the computer monitor world.

On a computer display, you want each pixel to be square and every pixel to be visible on the screen; a border around all the pixels doesn't matter.

On a TV, historically, pixels didn't exist, and the nearest concept to a pixel (I forget the term) wasn't square, it was rectangular. Showing a border around the content is not acceptable; it should go right up to the very edge. Also, in the TV world the image as a whole is the most important thing, not the individual pixels.
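The non-square-pixel point can be made concrete. In standard-definition digital video, a stored frame's pixel grid doesn't match the intended display shape, so each pixel has an aspect ratio of its own. A minimal sketch, assuming the common NTSC-style case of a 720x480 frame shown on a 4:3 screen:

```python
from fractions import Fraction

def pixel_aspect_ratio(width, height, display_aspect):
    """PAR = display aspect ratio / storage aspect ratio.
    A result of 1 means square pixels; <1 means tall, narrow pixels."""
    storage_aspect = Fraction(width, height)
    return display_aspect / storage_aspect

# NTSC-style digital video: 720x480 pixels stretched to fill a 4:3 screen
par = pixel_aspect_ratio(720, 480, Fraction(4, 3))
print(par)  # 8/9 -- each stored pixel is drawn narrower than it is tall
```

A computer monitor, by contrast, assumes a PAR of exactly 1, which is part of why feeding TV-world signals to monitor-world hardware (or vice versa) goes wrong at the edges.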

So it's this philosophical difference, and these separate histories, that mean you should never try to use a TV as a monitor, or a monitor as a TV. Of course I only learnt this after buying a device which said it could be both.



