M76
[H]F Junkie
- Joined
- Jun 12, 2012
- Messages
- 14,058
I ran a benchmark on Cyberpunk 2077, and this is what I got:
benchmark completed, 4216 frames rendered in 76.406 s
Average framerate : 55.1 FPS
Minimum framerate : 46.5 FPS
Maximum framerate : 73.7 FPS
1% low framerate : 41.3 FPS
0.1% low framerate : 31.7 FPS
How is it possible for the minimum fps to be higher than the 1% low and 0.1% low? Isn't the minimum supposed to be representative of the absolute worst frame during the entire benchmark?
Is the minimum framerate the worst 1 s of the benchmark, i.e. the second with the lowest number of frames? But then, in turn, that means the maximum isn't the fastest rendered frame either, but the second during which the average was highest?
Is this the common understanding? I always thought the min/max was calculated from a singular frame.
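For what it's worth, the windowed interpretation does explain numbers like these. Here's a toy sketch of that reading (hypothetical frame-time data; the benchmark's actual method isn't documented, so this is just one plausible calculation): percentile lows look at individual frame times, while a windowed "minimum FPS" counts frames per second, so a brief stutter gets diluted by the fast frames sharing its second.

```python
# Toy reconstruction (NOT the actual Cyberpunk 2077 benchmark code):
# 1000 frames, mostly 18 ms each, with a short stutter at the end.
frame_times_ms = [18.0] * 990 + [30.0] * 9 + [32.0]

# Percentile lows work per frame: average the worst 1% (or 0.1%) of
# frame times, then convert back to FPS.
slowest_first = sorted(frame_times_ms, reverse=True)

def percentile_low(pct):
    n = max(1, int(len(slowest_first) * pct / 100))
    return 1000.0 / (sum(slowest_first[:n]) / n)

one_pct_low = percentile_low(1.0)     # dominated by the stutter frames
point1_pct_low = percentile_low(0.1)  # dominated by the single worst frame

# A windowed "minimum FPS": count how many frames land in each 1-second
# bucket. The handful of slow frames share their second with many fast
# ones, so the lowest bucket count stays well above the percentile lows.
buckets = {}
t = 0.0
for ft in frame_times_ms:
    buckets[int(t)] = buckets.get(int(t), 0) + 1
    t += ft / 1000.0
full_buckets = [c for sec, c in buckets.items() if sec < int(t)]  # drop the partial last second
min_fps = min(full_buckets)

print(f"min (1s window): {min_fps}  1% low: {one_pct_low:.1f}  0.1% low: {point1_pct_low:.1f}")
```

With this data the windowed minimum comes out around 50 FPS while the 1% and 0.1% lows land in the low 30s, i.e. the same ordering as the results above.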