Framerate is the measure of how many unique images are shown to a viewer during a given period of time. Frames per second (FPS) is the common measurement among video games, movies, and animations. Many programming languages measure time intervals in milliseconds, which is why a 60FPS framerate is often denoted in code as 1000/60.
Code that measures time often denotes intervals as fractions of 1000 milliseconds, which equates to a single second. Milliseconds are a convenient unit for computer time because processors are so fast that performance is measured in how many thousands of operations can be completed per second.
1000/* Translates to Human Time
Dividing 1000 milliseconds into various intervals is how developers translate CPU time frames into human-friendly time frames. For example, 0.06 frames per millisecond (FPMS) doesn’t sound like a very performant gaming experience, and 0.14kHz doesn’t sound like a high-performance monitor.
However, 60FPS and 140Hz convey the same rates in a more human-relatable way: per second rather than per millisecond. Below are some common millisecond-to-second conversions one might find in miscellaneous code:
- 1000/60 ≈ 16.67ms, the frame interval for 60 frames per second
- 1000/30 ≈ 33.33ms, the frame interval for 30 frames per second
- 1000 * 60 = 60,000ms = 1 minute
- 1000 * 60 * 60 = 3,600,000ms = 1 hour
- 1000 * 60 * 60 * 24 = 86,400,000ms = 1 day
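These conversions often appear in code as named constants. A minimal sketch (the constant names are illustrative, not from any particular library):

```javascript
// Common millisecond conversions as named constants.
const MS_PER_SECOND = 1000;
const FRAME_INTERVAL_60FPS = MS_PER_SECOND / 60; // ~16.67ms between frames
const FRAME_INTERVAL_30FPS = MS_PER_SECOND / 30; // ~33.33ms between frames
const MS_PER_MINUTE = MS_PER_SECOND * 60;        // 60,000ms
const MS_PER_HOUR = MS_PER_MINUTE * 60;          // 3,600,000ms
const MS_PER_DAY = MS_PER_HOUR * 24;             // 86,400,000ms

console.log(FRAME_INTERVAL_60FPS.toFixed(2)); // "16.67"
console.log(MS_PER_DAY);                      // 86400000
```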
Generally speaking, video and animation streaming can provide a fixed-frequency FPS experience. With low bandwidth or signal disruptions the FPS might decrease, but streaming video requires minimal processing power beyond decoding an incoming stream of images that arrives at a fairly constant rate.
However, gaming applications and visual computation tools are different. These tools render the visual output of an application that is often performing CPU- or GPU-intensive operations. If an operation takes longer to complete, the visual output of that operation is not available as quickly as before. In gaming, this is experienced as a drop in framerate during scenes or sequences where many calculations are being made.
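This effect can be illustrated by computing an instantaneous framerate from how long a frame took to produce. A minimal sketch, using hypothetical frame timestamps rather than a real game loop:

```javascript
// Instantaneous FPS from the delta between two frame timestamps (in ms).
// A frame that takes longer than 1000/60 ms to produce drops the framerate.
function instantaneousFPS(previousMs, currentMs) {
  const deltaMs = currentMs - previousMs; // how long the last frame took
  return 1000 / deltaMs;                  // frames per second at that pace
}

console.log(Math.round(instantaneousFPS(0, 1000 / 60))); // 60: frame on schedule
console.log(Math.round(instantaneousFPS(0, 50)));        // 20: a heavy frame caused a drop
```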
The human eye perceives “artifacts” in visual content like movies and animation when content is streamed at an insufficient framerate. Historically, 24 and 30 frames per second have been common in movies and animation. More recent advances in technology have promoted frame rates of 60Hz and above as “optimal.”
Research suggests that the maximum observable rate for human perception is between 50Hz and 90Hz. That is, if a framerate were to exceed 90 frames per second, the perceived quality improvements would likely be negligible. However, in certain scenarios it is possible to perceive flickering in streams of content at frame rates as high as 500FPS [1].
Animation Frame Requests
requestAnimationFrame is a method by which a web browser is instructed to call a specific function before the next “repaint” of an animated sequence. This method belongs to the browser’s window object and is commonly used in animation applications.
setInterval, like requestAnimationFrame, is a method of the window object. It is a lower-level approach to establishing a framerate: a function is called repeatedly at a fixed interval.
Both of these functions are used to control the looping behavior common to streaming video, animations, and game loops. In both cases, one is likely to find the expression 1000/60 passed to or used within these functions. With the concept of frame rates in mind, this is how developers establish a 60FPS/60Hz frame rate.
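Both patterns can be sketched as follows. This assumes a browser environment (window, requestAnimationFrame) and a hypothetical drawFrame function; the scheduling calls are shown but commented out so the sketch stands alone:

```javascript
const FRAME_INTERVAL = 1000 / 60; // ~16.67ms per frame, i.e. 60fps

// Hypothetical render function: draws one frame of the animation.
function drawFrame(timestamp) {
  // ...update and paint the scene here...
}

// 1) setInterval: call drawFrame on a fixed ~16.67ms timer.
// setInterval(drawFrame, FRAME_INTERVAL);

// 2) requestAnimationFrame: let the browser schedule repaints, but skip
//    callbacks that arrive sooner than the 60fps interval allows.
let lastTimestamp = 0;
function loop(timestamp) {
  if (timestamp - lastTimestamp >= FRAME_INTERVAL) {
    lastTimestamp = timestamp;
    drawFrame(timestamp);
  }
  requestAnimationFrame(loop); // schedule the next check before the next repaint
}
// requestAnimationFrame(loop);
```

The throttle inside loop is needed because requestAnimationFrame fires at the display's refresh rate, which on a 120Hz or 144Hz monitor would run the animation faster than intended.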
Dividing 1000 milliseconds provides a convenient per-second denotation for controlling visual media in many programming languages. It can be used to ensure an animation does not exceed the maximum rate humans can perceive, that playback is not too fast (e.g. a fast-forward effect), and that flickering is avoided. The expression 1000/60, for example, encodes the 60FPS/60Hz frame rate that modern games and monitors seek to provide.
The applications of this conversion are vast: animation, video, gaming, and visualization engines all make use of it. The conversion of milliseconds to seconds via 1000/* is near-universal because CPU and GPU time is generally denoted in milliseconds while human-perceived time is denoted in seconds.
- [1] Davis, James, et al. “Humans Perceive Flicker Artifacts at 500 Hz.” Scientific Reports, vol. 5, 7861, 3 Feb. 2015, doi:10.1038/srep07861.