frame time vs fps

In the context of gaming and graphics, FPS (frames per second) is commonly used to measure the performance of a system. However, FPS alone doesn't always tell the full story. Another important metric is frame time, which is the time it takes to render each individual frame. Frame time is more directly related to system performance and can give you a clearer understanding of smoothness and consistency.

Frame Time vs FPS

Frame rate (FPS) is the reciprocal of frame time. In other words:

FPS = 1 / frame time (in seconds) = 1000 / frame time (in milliseconds)

As frame time approaches 0, the frame rate tends toward infinity, which would theoretically mean an unlimited number of frames rendered per second. In practice this is not achievable, and diminishing returns set in: at higher FPS levels, each further reduction in frame time contributes less and less to perceived smoothness.
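To make the reciprocal relationship concrete, here is a minimal C++ sketch of the two conversions (the function names are my own, not from the original text):

```cpp
#include <cstdio>

// Convert a frame time in milliseconds to frames per second.
double fps_from_frame_time_ms(double frame_time_ms) {
    return 1000.0 / frame_time_ms;
}

// Convert frames per second back to a frame time in milliseconds.
double frame_time_ms_from_fps(double fps) {
    return 1000.0 / fps;
}

int main() {
    std::printf("60 FPS  -> %.2f ms per frame\n", frame_time_ms_from_fps(60.0));
    std::printf("6.94 ms -> %.2f FPS\n", fps_from_frame_time_ms(6.94));
}
```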

Common FPS Values and Their Corresponding Frame Times

Below is a table that shows common FPS values and their corresponding frame times. As you can see, frame time decreases as FPS increases.

| FPS  | Frame Time (ms) |
|------|-----------------|
| 30   | 33.33           |
| 60   | 16.67           |
| 120  | 8.33            |
| 144  | 6.94            |
| 240  | 4.17            |
| 300  | 3.33            |
| 1000 | 1.00            |

As shown in the table, increasing the FPS decreases the frame time, meaning each frame must be produced more quickly. However, the improvements become less noticeable as FPS increases, especially beyond 60-120 FPS. For instance, going from 30 FPS to 60 FPS cuts the frame time by roughly 16.7 ms and offers a clear improvement in smoothness, but going from 240 FPS to 300 FPS shaves off less than 1 ms, a difference that is very hard to perceive.

thinking in terms of frame time

Frame time is the better metric to think about because it maps directly onto the logic running in the main loop. Frame times are additive: if the main loop consists of four procedures that take times t1, t2, t3, and t4, the total frame time is simply t1 + t2 + t3 + t4, so you can measure each stage, see which one dominates the budget, and focus on making it more efficient. A sketch of this kind of per-stage timing is shown below.
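The following is a minimal C++ sketch of that idea, assuming a hypothetical game loop with four stages (the stage functions and their names are placeholders, not from the original text). It times each stage with std::chrono and reports the per-stage and total frame time:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

// Placeholder stages; a real loop would poll input, update game logic,
// step the physics, and render, each with real work instead of sleeps.
void poll_input()   { std::this_thread::sleep_for(std::chrono::microseconds(200)); }
void update_logic() { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }
void run_physics()  { std::this_thread::sleep_for(std::chrono::milliseconds(3)); }
void render()       { std::this_thread::sleep_for(std::chrono::milliseconds(8)); }

// Measure how long a single call to `stage` takes, in milliseconds.
template <typename F>
double time_stage_ms(F stage) {
    auto start = Clock::now();
    stage();
    auto end = Clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main() {
    double t1 = time_stage_ms(poll_input);
    double t2 = time_stage_ms(update_logic);
    double t3 = time_stage_ms(run_physics);
    double t4 = time_stage_ms(render);

    double frame_time = t1 + t2 + t3 + t4;  // frame times add up
    std::printf("input %.2f ms, logic %.2f ms, physics %.2f ms, render %.2f ms\n",
                t1, t2, t3, t4);
    std::printf("frame time %.2f ms  (~%.1f FPS)\n", frame_time, 1000.0 / frame_time);
}
```

Because the stage times add up to the frame time, the slowest stage immediately tells you where to spend optimization effort; the equivalent reasoning in terms of FPS is much less direct.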

Another reason frame time is important is that it makes very high frame rates easier to reason about, as the following table shows.

| Frame Time      | FPS       |
|-----------------|-----------|
| 16 ms           | 62.5      |
| 6 ms            | 166.67    |
| 4 ms            | 250       |
| 3 ms            | 333.33    |
| 2 ms            | 500       |
| 1 ms (1000 μs)  | 1,000     |
| 500 μs          | 2,000     |
| 250 μs          | 4,000     |
| 100 μs          | 10,000    |
| 50 μs           | 20,000    |
| 10 μs           | 100,000   |
| 1 μs            | 1,000,000 |

A basic estimate for a single CPU operation is about 1 nanosecond, which is a thousandth of a microsecond. So if a program did only 1,000 basic operations per frame, all of them cache hits for the data involved, each frame would take roughly 1 microsecond, giving a frame rate of about 1,000,000 FPS.
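Here is a tiny C++ sketch of that back-of-the-envelope calculation; the 1 ns/operation figure is just the rough assumption above, not a measurement:

```cpp
#include <cstdio>

int main() {
    // Rough assumptions from the text: ~1 ns per basic operation, all cache hits.
    const double ns_per_op     = 1.0;
    const double ops_per_frame = 1000.0;

    double frame_time_ns = ns_per_op * ops_per_frame;   // 1,000 ns = 1 microsecond
    double frame_time_s  = frame_time_ns * 1e-9;
    double fps           = 1.0 / frame_time_s;          // ~1,000,000 FPS

    std::printf("frame time: %.0f ns (%.3f us), estimated FPS: %.0f\n",
                frame_time_ns, frame_time_ns / 1000.0, fps);
}
```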

