On 2014-05-03 13:53, Billie Walsh wrote:
> Normal television, "i", is interlaced. That means the picture on the screen has every other line rewritten on every pass. This happens sixty times a second, and your eyes can't really see that half the picture doesn't match the other half. This is how the motion is created. What this interlacing does lead to is a certain amount of fuzziness in the picture.
No, interlacing is done for a different reason. When TV was invented, they decided to do it at 30 pictures per second (in the USA; in Europe it was 25). This was somewhat slow and there was some noticeable flicker, so what they did was first display the odd lines in 1/60 of a second, and then display the even lines in the next 1/60 of a second (1/50 in Europe). Thus the apparent flicker frequency was doubled, and the eye noticed it less. This technique was called "interlacing". (That the frequency is the same as the local mains frequency is not coincidental at all.)

Film projected in theatres uses a similar trick: they put one picture in the projector, switch on the light (so to speak), switch it off, then on again, then off, then move on to the next picture. 24 pictures per second, but 48 apparent flickers.

Current digital displays can show pictures at a real frame rate of 60 Hz or more, so interlacing is not needed anymore, except perhaps to reduce the actual frame rate and size.

-- 
Cheers / Saludos,

Carlos E. R.
(from 13.1 x86_64 "Bottle" at Telcontar)
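As a quick way to see the arithmetic described above (a minimal Python sketch, not from the original posts; the function and names are just illustrative): each interlaced frame is shown as two fields, and each film frame is flashed twice by the shutter, so the apparent flicker rate is double the picture rate in all three cases.

    # Hypothetical illustration of the frame-rate vs. flicker-rate arithmetic
    # discussed in the thread; not part of any real TV or projector software.

    def apparent_flicker_rate(pictures_per_second, flashes_per_picture):
        """How many light 'flashes' per second the eye actually sees."""
        return pictures_per_second * flashes_per_picture

    systems = {
        "NTSC TV (interlaced)":    (30, 2),  # 30 frames/s, odd + even fields per frame
        "PAL TV (interlaced)":     (25, 2),  # 25 frames/s, two fields per frame
        "Cinema (double shutter)": (24, 2),  # 24 frames/s, each frame flashed twice
    }

    for name, (fps, flashes) in systems.items():
        print(f"{name}: {fps} pictures/s -> "
              f"{apparent_flicker_rate(fps, flashes):.0f} apparent flickers/s")

Running it prints 60, 50 and 48 flickers per second respectively, which is exactly why the TV field rates match the local mains frequency.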