Wirehead Studios

General Discussion => Entertainment => Topic started by: Phoenix on 2016-10-31, 21:46

Title: A PC Gamer's Worst Nightmare
Post by: Phoenix on 2016-10-31, 21:46


Title: Re: A PC Gamer's Worst Nightmare
Post by: scalliano on 2016-11-04, 00:23
LOL :doom_thumb:

Seriously, who coined that whole "eyes can only register 24fps" tripe? The human eye can register an image displayed for as little as 6 milliseconds, which, if my arithmetic is any good, works out to roughly 165-170fps.
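A quick sanity check on that arithmetic (my own throwaway snippet, not from the thread): converting a per-frame display interval in milliseconds to frames per second.

```python
# Convert a frame display interval (milliseconds) to frames per second.
def interval_ms_to_fps(interval_ms: float) -> float:
    return 1000.0 / interval_ms

print(round(interval_ms_to_fps(6), 1))  # 6 ms per frame -> 166.7 fps
```

So the 6 ms figure does indeed land in the 165-170fps range.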

dat kb tho :doom_love:

Title: Re: A PC Gamer's Worst Nightmare
Post by: Phoenix on 2016-11-05, 01:04
It stems from the motion picture industry.  Back when movies were shot on actual celluloid film, the stock was expensive and you could only fit so much of it on a reel to run through a projector.  That's why long movies, like Ben Hur (the Charlton Heston one, not the new one), had an intermission - it gave the crew in the projector booth time to switch the film reels.  24 frames per second was settled on because that's roughly the rate at which the human eye starts perceiving a series of still images as motion.  That was adequate for the filming techniques of the day, which consisted mostly of stationary cameras and moving actors.

The disconnect occurred when someone extrapolated (erroneously) that because 24 FPS was sufficient for film, the human eye did not require more than 24 FPS to interpret motion of any kind.  When film was the medium, the limit made sense due to the constraints of material cost, transport of film, risk of damage to footage, and the limitations of 20th century projectors.  Now that "filming" is all done digitally, the 24 FPS limitation is an anachronism.  It's especially noticeable in movies where the camera moves a lot.  When the image jerks around it is referred to as judder.  This effect is most noticeable on a fast pan, "shaky cam" shots, shots with a high degree of rotation, or fast cross-screen motion of a character or object.  Combining any of those amplifies the effect, and it can become intolerable.  Filming techniques and technology have outgrown the 24 FPS standard for motion pictures, which is why some directors - most notably Peter Jackson and now James Cameron - have been pushing for higher frame rates on filming to better reflect reality.
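To put a number on the judder point (my own back-of-the-envelope figures, not from the post): consider how far the image jumps between consecutive frames during a pan. The 90-degree pan over 2 seconds below is an arbitrary illustrative example.

```python
# Angular jump between successive frames during a camera pan.
def pan_step_degrees(pan_degrees: float, pan_seconds: float, fps: float) -> float:
    return (pan_degrees / pan_seconds) / fps

# Example: a 90-degree pan over 2 seconds (arbitrary illustrative numbers).
for fps in (24, 60, 120):
    print(f"{fps:>3} fps: {pan_step_degrees(90, 2, fps):.3f} degrees per frame")
```

At 24 fps the image leaps 1.875 degrees per frame, five times the jump at 120 fps, which is why fast pans are where judder shows up first.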

Of course, in the gaming world this comes back to the premise behind the video, which is taking a PC gamer and forcing them into a sub-30 FPS experience on a high-end title.  The console world often uses the 24 FPS fallacy as justification for a 30 FPS cap.  Subjecting a PC gamer to that IS torture.  Granted, avian eyes and brains are different so I'm not really the best metric, but I do know that even 60 FPS is at the bottom end of "tolerable" for me in dark games.  With brighter games I have trouble below 75.  I prefer 100 or higher - that's when things really start to smooth out - so I'll often sacrifice some resolution to keep the frames up where I'm comfortable.  30 FPS?  Yeah, I'll go read a book instead.
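For reference, those comfort thresholds map onto frame times like so (a trivial conversion I'm adding for illustration, not Phoenix's numbers):

```python
# How long each frame stays on screen at a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 75, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

Going from 30 to 100 fps cuts the per-frame wait from 33.3 ms to 10 ms, less than a third the time between fresh images.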

Point is, the PC crowd expects unlocked frame rates, a smooth experience, and control over that experience - not to have someone else dictate what that experience should be.  We reject the "30 FPS is enough" dogma that serves as a crutch to justify the crappy performance of underpowered console hardware.  As a high-end manufacturer, Corsair "gets it".  It really is torture to do that to a PC player.
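For what it's worth, the kind of frame cap being complained about usually boils down to a sleep in the main loop. A minimal sketch (a hypothetical loop of my own, not any real engine's code):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_capped(frames: int) -> float:
    """Run a dummy game loop and return the effective frame rate."""
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        # ... simulate + render would happen here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            # Sleep away the rest of the frame budget: this IS the cap.
            time.sleep(FRAME_BUDGET - elapsed)
    return frames / (time.perf_counter() - start)

print(f"effective rate: {run_capped(30):.0f} fps")
```

No matter how fast the hardware is, that loop never exceeds roughly 30 fps; "unlocking" the frame rate is simply not doing the sleep.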

Title: Re: A PC Gamer's Worst Nightmare
Post by: Phoenix on 2016-11-05, 01:10
Oh yes, and :doom_love: keyboard.  I switched to a Strafe RGB a while back when I learned that its keycaps can be replaced, as talons cause excessive wear and tear.  I have literally put holes in keycaps on my old keyboards, so that sold me on it.  That, and the mechanical switches are just SOOOO much nicer than rubber dome boards, and custom glowy colors = eye candy = Phoenix likes.  I used to use the old buckling spring mechanical IBM keyboards years back, before USB replaced everything.  They were a bit stiff for gaming, but they really held up.

Title: Re: A PC Gamer's Worst Nightmare
Post by: Thomas Mink on 2016-11-05, 16:46
I guess I'm a bit different in the PC world.
30fps is my low.. right on the border of intolerable. I've played a few games at around 30fps and did well, like TF2 when it was new. And even then, some people in servers used to say that was impossible.

Now, do I prefer higher? Of course. Foolish not to. Even more foolish is claiming the eye can't see higher than 24fps.. or 30. However, I'm perfectly content with 60. So meh.. whatever.
But for those that want higher, unlock it!

Guess I should note that for the longest time, I was always on mid-to-low end hardware.. so low fps was just something I probably got used to.

Title: Re: A PC Gamer's Worst Nightmare
Post by: scalliano on 2016-11-16, 02:21
Cheers for the background, Pho :doom_thumb: Nice to have a bit of context, even if the statement is still palpably untrue, heh.

I grew up on consoles, back in the days when most (but by no means all) games actually ran at 50fps (I'm in PAL territory), and it was really only at the dawn of mainstream 3D rendering that framerates began to drop off. Back then it wasn't as much of an issue as it is now. Hell, even our beloved Doom was locked to 35fps on PC. But as tech has marched on, the old excuses for FPS caps carry less and less weight. I'd happily take a hit in visual fidelity if it means a game runs smoothly. What really winds me up, though, is when they try to spin a 30fps cap as being "more cinematic". That's as may be, but it should be MY decision, not theirs. It's that old saying: don't piss in my face and tell me it's raining.

For the record, I'm running 2x GTX 970s in SLI and DOOM '16 runs great :doom_love: