It stems from the motion picture industry. Back when movies were shot on actual celluloid, film stock was expensive and a reel could only hold so much of it to run through a projector. That's why long movies, like Ben Hur (the Charlton Heston one, not the new one), had an intermission - it gave the projectionist time to switch reels. The reason 24 frames per second was settled on was that it's roughly the lowest rate at which the human eye still perceives continuous motion, which kept film costs down. That was adequate for the filming techniques of the day, which consisted mostly of stationary cameras and moving actors.
The disconnect came when someone extrapolated (erroneously) from "24 FPS is sufficient for film" to "the human eye doesn't need more than 24 FPS to interpret motion of any kind." When film was the medium, 24 FPS made sense given the constraints of material cost, transporting reels, the risk of damaging footage, and the limitations of 20th-century projectors. Now that "filming" is done digitally, the 24 FPS limitation is an anachronism. It's especially noticeable in movies where the camera moves a lot. When the image jerks around, the effect is called judder. It's most noticeable on a fast pan, "shaky cam" shots, shots with a high degree of rotation, or fast cross-screen motion of a character or object; combine any of those and the effect is amplified to the point of being intolerable. Filming techniques and technology have outgrown the 24 FPS standard for motion pictures, which is why some directors have been pushing for higher frame rates to better reflect reality - most notably Peter Jackson and now James Cameron.
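To put rough numbers on why a pan is the worst case, here's a quick back-of-the-envelope sketch (the screen width and pan speed are made-up values just for illustration, not from any particular film):

```python
# Illustrative only: an object swept across a 1920-pixel-wide frame in 2 seconds.
# Both numbers are assumptions for the example, not measurements.
FRAME_WIDTH_PX = 1920
PAN_DURATION_S = 2.0

for fps in (24, 30, 60, 100):
    frame_time_ms = 1000.0 / fps
    # How far the object jumps between consecutive frames during the pan.
    jump_px = FRAME_WIDTH_PX / (PAN_DURATION_S * fps)
    print(f"{fps:>3} FPS: {frame_time_ms:5.1f} ms/frame, ~{jump_px:4.1f} px jump per frame")
```

At 24 FPS that works out to a jump of about 40 pixels every frame, versus under 10 pixels at 100 FPS - which is basically what judder looks like.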
Of course, in the gaming world this comes back to the premise behind the video, which is taking a PC gamer and forcing them into a sub-30 FPS experience on a high-end title. The console world often uses the 24 FPS fallacy as justification for a 30 FPS cap. Subjecting a PC gamer to that IS torture. Granted, avian eyes and brains are different, so I'm not really the best metric, but I do know that even 60 FPS is at the bottom end of "tolerable" for me in dark games. In brighter games I struggle with anything under 75. I prefer 100 or higher - that's when things really start to smooth out - so I'll often sacrifice some resolution to keep the frames up where I'm comfortable. 30 FPS? Yeah, I'll go read a book instead.
Point is, the PC crowd expects unlocked frame rates, a smooth experience, and control over that experience - not to have someone else dictate what that experience should be - and we reject the "30 FPS is enough" dogma used as a crutch to justify the poor performance of underpowered console hardware. As a high-end manufacturer, Corsair "gets it". It really is torture to do that to a PC player.