Frame rate and refresh rate are two related terms that are easy to confuse. Some people use them interchangeably, but they aren't the same thing. If you're trying to improve gaming performance or thinking about upgrading your hardware, you need to understand how these terms differ and how they're connected.

Below, we define both frame rate and refresh rate, then compare and contrast them so you have a clear understanding of what they mean to you.

What Is Frame Rate?

Frame rate is a measure of how fast individual images, known as frames, appear on a screen. As you may know, all video is actually a series of pictures being shown quickly. When the human eye watches these pictures change rapidly, it interprets this as motion.

A frame rate is typically expressed in FPS, or frames per second. The higher the frame rate, the more images appear on the screen every second, and the smoother motion looks as a result.
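To make that concrete, here's a quick back-of-the-envelope calculation in Python (purely illustrative) showing how long each frame stays on screen at common frame rates:

```python
# How long each frame stays on screen at common frame rates.
for fps in (24, 30, 60, 144):
    frame_time_ms = 1000 / fps  # milliseconds each frame is displayed
    print(f"{fps} FPS -> {frame_time_ms:.1f} ms per frame")
```

At 30FPS, each frame lingers for about 33ms; at 60FPS, it's only about 16.7ms. That shorter gap between images is why higher frame rates read as smoother motion.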

Frame rates are most commonly discussed in the context of video games. The exact frame rate depends on the system and the game you're playing, but in general, 30FPS is the accepted minimum for gaming (particularly on consoles), while 60FPS is preferred when possible.

However, frame rate is relevant to other forms of video, not just games. For example, most movies and TV shows are shot at 24FPS, which is largely due to historical limitations. In the early days of cinema, film stock was expensive, so shooting at 24FPS allowed filmmakers to conserve film while still using a frame rate high enough that the movie wouldn't look choppy.

Nowadays, 24FPS remains the standard for most recorded media. Since most people are accustomed to this frame rate, seeing a movie at a higher frame rate looks odd—almost like you're watching the actors move right in front of you.

Meanwhile, live broadcasts like sports are typically shot at 30FPS. The higher frame rate makes the fast motion of these events easier to watch.

What Is Refresh Rate?

Refresh rate refers to the number of times per second that a screen updates its displayed image.

In the old days of CRT (cathode ray tube) displays, this was the number of times that the electron gun inside the display would draw a new image on the screen. A low refresh rate resulted in annoying flickering, which is when your eye notices the change in brightness between frames.

On modern displays, like LCD TVs, this flickering isn't a concern. Instead, the refresh rate of a digital display simply refers to how many times per second the screen can possibly update the image.

Refresh rate is typically expressed in hertz (Hz). Almost every display you can buy today will have a refresh rate of at least 60Hz. Higher refresh rate displays are available, though, and are usually intended for gaming.

If you're interested in more on the history of display technology, check out what NTSC and PAL mean.

How Do Frame Rate and Refresh Rate Differ?

Now that you understand both of these terms, it's easier to see how they differ and how they work together.

The frame rate is the number of images that a computer, video game console, video player, or other device sends to the display every second. Meanwhile, the refresh rate is how fast the display can actually show those frames.

For best results, these values should sync up, or at least be close. Consider a situation where your gaming PC sends 200FPS to the display, but the monitor only runs at 60Hz. This leads to screen tearing, a frequent PC gaming problem where parts of two or more different frames appear on screen at once.

You see pieces of frames before they should appear because your monitor can't keep up with everything the graphics card sends it. Besides looking ugly, this can even cause motion sickness.
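To build an intuition for why tearing happens, consider this simplified Python sketch. It's a toy model, not how a real GPU and display actually communicate, but it captures the timing mismatch: a 60Hz display takes about 16.7ms to draw one refresh, and in that window a 200FPS game has already produced three or four newer frames.

```python
# Toy model of screen tearing: a 60Hz display scanning out
# while the game renders at 200FPS without any syncing.
REFRESH_HZ = 60
GAME_FPS = 200

for refresh in range(4):
    start = refresh / REFRESH_HZ       # when this scanout begins (seconds)
    end = (refresh + 1) / REFRESH_HZ   # when this scanout ends
    first = int(start * GAME_FPS)      # newest frame available at the start
    last = int(end * GAME_FPS)         # newest frame available at the end
    print(f"Refresh {refresh}: pieces of frames {first} through {last}")
```

Each refresh ends up containing pieces of several different frames, and the boundary between those pieces is the visible tear.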

Solutions to this problem include VSync, a common PC game setting that syncs your game's FPS with your monitor's refresh rate. Other solutions, like AMD's FreeSync, can eliminate the new problems that VSync introduces.
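Real VSync lives in the graphics driver and swaps frame buffers in step with the display, but you can approximate the core idea with a simple frame cap. Here's a minimal Python sketch of that concept (render_frame is a placeholder, and this is an illustration, not actual VSync):

```python
import time

REFRESH_HZ = 60
FRAME_BUDGET = 1 / REFRESH_HZ  # about 16.7ms per frame at 60Hz

def render_frame():
    pass  # placeholder for the game's actual rendering work

for _ in range(REFRESH_HZ):  # run for roughly one second
    frame_start = time.perf_counter()
    render_frame()
    # Wait out the rest of this refresh interval so we never
    # present more than one frame per screen refresh.
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

Notice the trade-off: the loop waits when a frame finishes early, but it can't speed up a slow one. That waiting is one reason VSync can add input lag, one of the problems adaptive-sync technologies like FreeSync aim to solve.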

Having a mismatch in the opposite direction isn't ideal, either. If you have a monitor with a refresh rate of 144Hz, but your gaming PC can only output 60FPS, you won't be able to enjoy the full power of your monitor. We've covered monitor refresh rates in more detail if you're interested.

How to Maximize Frame Rate and Refresh Rate

If you're not gaming, you don't need to worry much about frame rates and refresh rates.

Since almost every display has a refresh rate of at least 60Hz, and even basic integrated graphics can run your computer at 60FPS, you'll have a fine level of performance no matter what. As we've discussed, movies are shot at lower frame rates, and most services like YouTube max out at 60FPS. There's little need to worry about either value for general computer use.

If you are gaming, then you have more to consider. Your refresh rate depends on your display, and while it's possible to overclock your monitor in some situations, this won't make a massive difference. If you have a 60Hz monitor and want to play games at 144FPS, you'll need to invest in a new monitor that supports that refresh rate.

For improving the frame rate that your games run at, though, there's a lot you can do. See our guide to fixing low game FPS for plenty of tips on boosting it.

Major improvements will require you to upgrade your video card or other hardware, but there are also ways to squeeze more performance out of your current setup. Because running games at higher FPS takes a lot of resources, lowering the resolution and turning off some of the visual effects helps maximize the frame rate.
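To see why lowering the resolution helps so much, it's worth counting pixels. This quick Python calculation (illustrative only; real performance doesn't scale perfectly linearly with pixel count) compares the per-frame workload at common resolutions:

```python
# Pixels the GPU has to shade for every single frame.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels per frame")
```

Dropping from 4K to 1080p cuts the pixel workload to a quarter, which leaves the GPU a lot more headroom for extra frames per second.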

Frame Rate and Refresh Rate: Important Companions

Now you understand frame rate, refresh rate, and how they affect PC gaming. Like most aspects of gaming, it all comes down to the hardware. Your monitor dictates the maximum refresh rate, while more powerful PC components enable your system to push more frames per second to it.

If you haven't ever played games at a high refresh rate, especially fast-paced titles like shooters, it's worth upgrading for. But don't forget that frame rate is just one measure of PC game performance.