The current obsession is with ever-higher pixel counts, an approach that disregards how we actually see moving images. If broadcasters have their way, we could be on course for some ridiculous format decisions.
Intuitively, you would think that the frame rate – the number of pictures per second – would have quite a large bearing on the quality of the illusion, and you would be right. Equally, you might think that the film and TV companies had done a lot of research into human vision in order to choose those rates that have been in use unchanged for decades. Unfortunately, you would be wrong.
Let’s see how we arrived at the current frame rate of movie film: 24fps. Silent movies ran even slower, at 18fps, but when the optical soundtrack was invented, it was found that the sound was too muffled at the old film speed because the optics couldn’t resolve the higher pitches in the soundtrack. Running the film faster spreads each second of audio over more film, so the recorded wavelengths become long enough for the optics to resolve; the speed was raised to 24fps simply to get acceptable sound quality. The use of 24fps in movies has nothing whatsoever to do with any study of human vision.
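The film-speed argument is easy to check with a little arithmetic. The sketch below is illustrative and not from the article: it assumes standard 35mm four-perf film (roughly 19mm of film travel per frame) and picks an 8kHz tone as an example of a "higher pitch", then computes the wavelength that tone occupies on the film at the silent-era and sound-era speeds.

```python
# Illustrative arithmetic (assumed figures, not from the article):
# how faster film transport lengthens the recorded wavelength of an
# optical soundtrack, making high frequencies easier to resolve.

FRAME_ADVANCE_MM = 19.0   # 35mm four-perf: ~4 perforations x 4.75 mm pitch
TONE_HZ = 8000.0          # example high-frequency audio component

def recorded_wavelength_mm(fps: float) -> float:
    """Wavelength of the tone as recorded on the film, in millimetres."""
    film_speed_mm_per_s = fps * FRAME_ADVANCE_MM
    return film_speed_mm_per_s / TONE_HZ

w18 = recorded_wavelength_mm(18)  # silent-era speed
w24 = recorded_wavelength_mm(24)  # sound-era speed

print(f"18 fps: {w18:.4f} mm per cycle")
print(f"24 fps: {w24:.4f} mm per cycle")
```

At 24fps each cycle of the tone is a third longer on the film than at 18fps, so the same soundtrack optics can resolve correspondingly higher frequencies.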
American television runs at 60 fields per second, whereas in the old world we get along with 50. Do Americans have better vision than Europeans, so that they need a higher rate? As many Americans are descendants of émigrés from the Old World, that’s not very likely. The fact is that the rates used in television were chosen to match the frequency of the local electricity supply, because of fears that a different rate might interfere or beat with electric lighting. Once more, the decision had nothing to do with human vision. The entire edifice of film and TV picture rates has no foundation, and it is going to have to change.
Source: The Register
John Watkinson makes a case against ultra-high resolution, arguing that the extra detail is lost in anything except static or slow-moving shots.