Every PC gamer appreciates how important framerates are. It’s this figure, perhaps more than any other, that we look for when considering a new component and how it will affect the experience we have while playing – the higher the FPS, the better. While this is broadly true, there are a few technical stumbling blocks worth making yourself aware of.
First, what is FPS? This is one of the rare computer-related questions with a simple answer. FPS stands for frames per second: it measures the number of images rendered in a single second. The higher the number, the smoother the motion appears. You can check out this video to see a side-by-side comparison of a few different frame rates.
PC gaming, ideally, takes place at sixty frames per second or higher, which makes side-by-side comparisons tricky: streaming sites like YouTube only allow for sixty fps. What’s more, there’s another limiting factor – the rate at which your display (in this case, usually a computer monitor) refreshes, which is measured in Hz. If the frame rate is higher than the refresh rate, those extra frames just won’t be displayed.
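As a quick back-of-the-envelope (with made-up figures, purely for illustration), here’s how frame rate, frame time and refresh rate relate:

```python
# Rough illustration (made-up figures): the monitor's refresh rate caps
# how many of your rendered frames you actually see.

def frame_time_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames above the refresh rate never reach the screen."""
    return min(rendered_fps, refresh_hz)

print(frame_time_ms(60))        # ~16.7ms per frame at 60fps
print(displayed_fps(90, 60))    # a 60Hz panel shows at most 60 of those 90 frames
```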
If films are at 24fps, isn’t that enough?
We spend hours watching films shot at twenty-four frames per second, and we don’t really complain about it. Many animated films use even fewer unique images, with animators drawing a new picture only every second frame. Plus, if you watch a film at a higher frame rate, like The Hobbit or Gemini Man, then you might find that it looks a little bit weird (especially if your television is automatically ‘interpolating’ between the frames using a feature like TrueMotion). So what makes games different?
This has a lot to do with expectations. When we sit down to watch Homes Under the Hammer, we expect a higher framerate than we do when watching a major Hollywood film. Stick one frame rate onto the other, and things are bound to look a bit wonky.
Being limited to twenty-four frames per second is a problem for filmmakers – there are certain types of shot where it’s plain to see. Slow pans across environments with lots of lines (say, trees or walls or buildings) tend to look as though they’re stuttering. Wheels in motion can look as though they’re travelling backwards. Helicopter rotors can appear frozen, as though the aircraft is simply levitating.
Is playing at 240fps worth it?
Unlike in films, in a game all of this is out of the director’s control – the player has the freedom to look wherever they like. And then there’s the fact that we need to react quickly to what’s happening on-screen, which requires a steady supply of frames.
Fast-paced games like Doom and Hades benefit enormously from higher frame rates. Once you go up to 144Hz, you’ll never go back. Some monitors will get you even higher. Is playing at 240fps worth it? It’s a matter of personal preference. If you’re playing esports in a competitive environment, then those extra frames might make all the difference!
How do graphics cards affect frames-per-second (FPS)?
To bump up the frame rate, we can do a few different things. The most obvious is to turn down a few settings. If it takes the graphics card 1% longer to render a given frame when it has to apply higher-detail shadows, then turning that detail down claws back a little time per frame. Spread this principle over multiple settings, and you’ll meaningfully reduce the load on your card.
Post-processing effects tend to be the most burdensome – but every game is different. Naturally, if you’d rather not sacrifice visual fidelity, then you can instead invest in a more powerful graphics card, and get more frames without compromising.
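To see why those small per-setting savings add up, here’s a hypothetical sketch (the millisecond figures are invented, not measurements from any real game):

```python
# Hypothetical numbers only: shaving a little render time off several
# settings adds up to a noticeable frame-rate gain.

base_frame_time_ms = 20.0            # 50fps before any changes
savings_ms = {
    "shadow detail": 0.8,
    "post-processing": 1.5,
    "ambient occlusion": 0.7,
}

new_frame_time_ms = base_frame_time_ms - sum(savings_ms.values())
print(round(1000 / base_frame_time_ms, 1))   # 50.0 fps
print(round(1000 / new_frame_time_ms, 1))    # 58.8 fps after the combined savings
```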
Of course, it’s not just the graphics card that does the work. Walk into a heavily populated town, or a scene where the CPU is simulating lots of physics, and it won’t matter how great your graphics card is: the CPU will cap the frame rate.
Does lowering resolution increase FPS?
Lowering the resolution will, in many cases, buy you more frames. For this reason, many modern games dynamically scale the resolution on the fly. A few frames rendered at half resolution aren’t going to be noticed, which makes this approach a clever one.
When you’re messing with the resolution, it’s best to select the one that matches the native resolution of your display (i.e., the one that reflects the actual grid of physical pixels you’re looking at). Look for a resolution- or render-scaling option instead: this lowers the internal rendering resolution while keeping the native output, and avoids the weird behaviour that comes with messing about with the main resolution setting. If the frame rate barely moves no matter how much you fiddle with the resolution, it’s usually a sign that something other than the graphics card – typically the CPU – is the bottleneck.
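Dynamic resolution scaling of the kind mentioned above is, at heart, a simple feedback loop. Here’s a minimal sketch of the idea, with a made-up target and step sizes rather than anything taken from a real engine:

```python
# Minimal sketch of a dynamic-resolution controller (invented numbers).
# If a frame takes longer than the budget, drop the render scale a notch;
# if there's headroom, creep back up towards native resolution.

TARGET_FRAME_TIME_MS = 16.7   # 60fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_render_scale(scale: float, last_frame_time_ms: float) -> float:
    if last_frame_time_ms > TARGET_FRAME_TIME_MS:
        scale -= 0.05          # frame was late: render fewer pixels next time
    else:
        scale += 0.01          # frame had headroom: claw detail back slowly
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_time in [15.0, 18.2, 19.0, 16.0, 14.5]:   # hypothetical frame times
    scale = adjust_render_scale(scale, frame_time)
    print(round(scale, 2))
```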
DisplayPort and HDMI
If there’s not enough bandwidth available to send all of that data from your graphics card to the monitor, then those extra frames you’re rendering won’t arrive in time to be displayed. The result? A garbled signal, or a blank screen.
So, you need a DisplayPort or HDMI cable with sufficient bandwidth. At higher resolutions, a single frame takes up more data: you can theoretically describe four 1080p frames in the same amount of data as a single 4K one. So, lower the resolution and you’ll be able to fire over more frames in a given timeframe. That’s why the 48Gbps of HDMI 2.1 can carry a 4K signal at 120fps, but an 8K signal only at up to 60fps.
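A rough back-of-the-envelope calculation shows why resolution eats into that bandwidth so quickly. This ignores blanking intervals, chroma subsampling and compression, so treat the figures as ballpark estimates rather than spec-sheet numbers:

```python
# Back-of-the-envelope uncompressed video bandwidth (ignores blanking,
# subsampling and compression, so the real figures differ).

def bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * bits_per_pixel * fps / 1e9

print(bandwidth_gbps(1920, 1080, 60))    # ~3.0 Gbps  (1080p at 60fps)
print(bandwidth_gbps(3840, 2160, 120))   # ~23.9 Gbps (4K at 120fps)
print(bandwidth_gbps(7680, 4320, 60))    # ~47.8 Gbps (8K at 60fps), pushing HDMI 2.1's 48Gbps
```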
Refresh rates
As well as the FPS your machine can achieve, we also need to worry about the refresh rate of the monitor, which tends to be linked to its native resolution. 4K monitors tend to have lower refresh rates than 1440p ones. To achieve truly ridiculous refresh rates, you need to go down to 1080p – or crack open your wallet.
So, what happens if your refresh rate is higher than your FPS? In many cases, you’ll run into a problem called tearing. The display will start drawing one frame and then receive the next one midway down the screen, resulting in a choppy effect that’s especially noticeable when you’re moving the camera quickly near a vertical line, like the corner of a wall or doorway.
Back in the day, we’d correct this problem with something called vertical sync (v-sync). This matches the frame rate to the monitor’s refresh rate (or a subdivision of it). No more tearing; but we’ve introduced a nasty input lag, which is far from ideal. And if the frame rate dips beneath the refresh rate, v-sync snaps it down to the next subdivision – from 60fps straight to 30fps, for instance.
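Here’s a simplified model of that snapping behaviour under traditional double-buffered v-sync (real drivers are more sophisticated, so take this as an illustration of the principle rather than how any particular implementation works):

```python
# Simplified model of double-buffered v-sync: if a frame isn't ready for the
# next refresh, the display repeats the old one, so output snaps to a divisor
# of the refresh rate (60 -> 30 -> 20 ...). Real drivers are more nuanced.

def vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_fps(70))   # 60.0 - plenty of headroom, capped at the refresh rate
print(vsync_fps(55))   # 30.0 - just missing 60fps drops you straight to 30
print(vsync_fps(25))   # 20.0
```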
The solution here is something called adaptive sync, or VRR (Variable Refresh Rate). This comes in the form of G-Sync from NVIDIA or FreeSync from AMD. Compatible monitors adjust their refresh rates automatically to match the incoming video – but you’ll need to be sure that the monitor’s sync technology works with your graphics card.
Frame times
FPS might give you an idea of what to expect, but it can also be misleading. Often, it’s better for the frame rate to sit consistently at 60 than to jump regularly between 60 and 100. For this reason, many games allow you to cap the frame rate at a given level.
Moreover, we also need to worry about the spacing between one frame and the next. A game that consistently delivers one frame every 16ms is going to feel far smoother than a game that doesn’t, even if both technically add up to 60fps gameplay. Thus, frame times are arguably more important than frame rates – even if they’re harder to boil down to a single number.
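As an illustration with invented numbers, two captures can both average 60fps while feeling very different, which is why frame-time graphs and worst-frame figures tell you more than the average alone:

```python
# Invented frame-time traces: both average ~16.7ms (60fps), but the second
# one stutters. The average hides the difference; the worst frames reveal it.

steady  = [16.7] * 12
stutter = [12.0, 12.0, 26.0] * 4       # same average, much lumpier delivery

for name, trace in [("steady", steady), ("stutter", stutter)]:
    avg_fps = 1000 / (sum(trace) / len(trace))
    worst_ms = max(trace)
    print(f"{name}: average {avg_fps:.0f}fps, worst frame {worst_ms:.1f}ms")
```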
How to buy a PC based on FPS
Unfortunately, it’s not easy to predict what sort of FPS you’ll achieve in a given title with a given machine. Your results might vary considerably from game to game – and patches and new graphics-card drivers can often lead to dramatic changes further down the line. If you want to be sure of a PC that’s suited to a particular game, then why not check out our range of prebuilds? We’ve got machines tailored toward a range of popular titles – and there’s room for tweaking if you have more specific needs!