The term "refresh rate" is hardly new, but over the last few years it's become a prominent specification boasted by televisions and monitors. Each manufacturer wants to offer a higher refresh rate because it's another big number they can put on the box.

But how does this specification actually impact your experience?

What Does Refresh Rate Mean?

Let's start at the beginning, because you won't be able to understand the rest of this article if you don't know what refresh rate means. Fortunately, the term is not particularly complex. Refresh rate is simply the number of times a display refreshes the image it shows per second.

An easy way to understand this is by comparing it to frame rate in films or games. If a film is said to be shot at twenty-four frames per second, you'd understand the source content shows twenty-four different images each second. Similarly, a display with a refresh rate of 60 Hz shows sixty "frames" per second. "Frames" is in quotes, though, because a refresh isn't actually a frame: the display will refresh sixty times each second even if not a single pixel changes, and it only ever shows the source fed to it. Still, the analogy is an easy way to grasp the core concept.
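
If you want to put numbers on it, the math is simple: a display holds each refresh for one second divided by the refresh rate. A quick Python sketch makes the point:

```python
# Each refresh lasts 1000 ms divided by the refresh rate.
for rate_hz in (24, 30, 60, 120, 144):
    interval_ms = 1000 / rate_hz
    print(f"{rate_hz:>3} Hz -> a new refresh every {interval_ms:.2f} ms")
```

At 60 Hz a refresh comes every 16.67 ms; at 120 Hz, every 8.33 ms.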

A higher refresh rate therefore translates to the ability to handle a higher frame rate. Remember, though, that a display only shows what is given to it. As such, a higher refresh rate may not improve your experience at all if it is already much higher than the frame rate of your source.

Games and Refresh Rate

All video games, no matter their platform or graphics, are rendered by computer hardware. In most cases (particularly on the PC platform) frames are spit out as quickly as they can be generated. This is because higher frame rates usually translate to less delay between each individual frame. That in turn means more realistic gameplay and less input lag.

A problem occurs, however, when the rate at which frames are dished out doesn't sync with the rate at which the display refreshes. Let's say, for example, you have a 60 Hz display that's used to play a game rendering at 75 frames per second. Because the display accepts data from the GPU at regular intervals, without communicating with it, it's very likely to catch the hardware between frames. The result is screen tearing and jerky, uneven motion. Most games let you artificially cap the frame rate to prevent this problem, but that means you can't enjoy your PC's full potential.
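
To see how often the timings collide, you can line up the instants a 60 Hz display refreshes against the instants a 75 FPS game finishes a frame. This toy Python sketch (a deliberately simplified model, not how GPUs and displays actually signal each other) counts the mismatches:

```python
from math import isclose

REFRESH_HZ, FRAME_RATE = 60, 75
refresh_interval = 1 / REFRESH_HZ  # display grabs the buffer on this schedule
frame_interval = 1 / FRAME_RATE    # GPU finishes a frame on this schedule

torn = 0
for n in range(REFRESH_HZ):        # one second's worth of refreshes
    t = n * refresh_interval
    frames_done = t / frame_interval
    # If a whole number of frames isn't finished, the refresh lands mid-frame.
    if not isclose(frames_done, round(frames_done)):
        torn += 1
print(f"{torn} of {REFRESH_HZ} refreshes catch the GPU mid-frame")
```

At 75 FPS on a 60 Hz panel, 45 of the 60 refreshes each second land mid-frame.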

http://youtu.be/jVAFuUAKPMc

The solution is a higher refresh rate. Today, this typically means buying a 120 Hz monitor. Such a display can manage up to 120 frames per second. It also nicely handles lower V-Sync caps like 30 and 60 FPS, as both divide evenly into the 120 Hz refresh rate. Upgrading from 60 Hz to 120 Hz makes a very noticeable difference, but it's also something you have to see for yourself. You can't see the advantage of a 120 Hz display by viewing a video of it on a 60 Hz display.
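
The "divides evenly" point is easy to verify yourself (another small Python sketch):

```python
REFRESH_HZ = 120
for fps in (24, 30, 60, 75, 120):
    if REFRESH_HZ % fps == 0:
        print(f"{fps:>3} FPS: each frame held for exactly {REFRESH_HZ // fps} refreshes")
    else:
        print(f"{fps:>3} FPS: doesn't divide {REFRESH_HZ} Hz evenly -> uneven pacing")
```

Notice that 24 FPS also divides evenly into 120 Hz, which is one reason film fans favor 120 Hz panels over 60 Hz ones.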

A new, cutting-edge refresh technology important to gamers is the adaptive refresh rate. NVIDIA calls this feature G-Sync, while AMD calls it FreeSync. In either case, the idea is the same: unlike a conventional monitor, a display with an adaptive refresh rate asks the video card how quickly it is delivering frames, then adjusts its refresh rate to match. This eliminates screen tearing at any frame rate up to the monitor's maximum refresh rate.
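
Conceptually, the display stops running on a fixed clock and instead refreshes whenever a frame is ready, as fast as the panel allows. The following Python sketch illustrates that idea only; it is not the actual G-Sync or FreeSync protocol:

```python
import random

MAX_HZ = 120
MIN_INTERVAL = 1 / MAX_HZ  # the panel can't refresh faster than its maximum

def display_loop(frame_times):
    """Refresh when each frame is ready, clamped to the panel's top speed."""
    last_refresh = 0.0
    for t in frame_times:
        refresh_at = max(t, last_refresh + MIN_INTERVAL)
        print(f"frame ready at {t:.4f} s -> refresh at {refresh_at:.4f} s")
        last_refresh = refresh_at

# A GPU delivering frames at an uneven ~90 FPS: every refresh still shows
# exactly one complete frame, so there is nothing to tear.
random.seed(1)
frame_times, t = [], 0.0
for _ in range(5):
    t += 1 / 90 + random.uniform(-0.002, 0.002)
    frame_times.append(t)
display_loop(frame_times)
```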

Watching Video

Viewing a video is much different than playing a game. A game is rendered in real time; a video is played back from a source. While film is generally shot at 24 frames per second, it is often converted to 30 or 60 frames per second by repeating certain frames. Only certain source files, played on certain hardware (typically a Blu-ray disc on a capable Blu-ray player), can output 24 FPS. Even in those cases, the original 24 FPS is not exactly reproduced; rather, a higher frame rate is displayed in a cadence that replicates the original 24 FPS motion.
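
That frame-repeating trick is commonly known as 3:2 pulldown: alternating film frames are shown three times and then two times, so every four film frames become ten displayed frames. A minimal Python sketch of the cadence:

```python
from itertools import cycle

def three_two_pulldown(film_frames):
    """Repeat 24 FPS film frames in a 3-2-3-2 cadence to fill a 60 Hz display."""
    displayed = []
    for frame, repeats in zip(film_frames, cycle([3, 2])):
        displayed.extend([frame] * repeats)
    return displayed

film = ["A", "B", "C", "D"]            # 4 film frames = 1/6 second at 24 FPS
print(three_two_pulldown(film))        # 10 displayed frames = 1/6 second at 60 Hz
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven 3-2 rhythm is what produces the subtle judder some viewers notice in pulled-down film.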

http://youtu.be/VUjugzbY1II

The frame rate conversion is purposely introduced to better match modern home displays, which typically operate at 60 Hz or some higher multiple thereof. It also reduces flicker, which would be apparent if content were viewed in its native 24 FPS form. Other video sources, like YouTube, typically operate at 30 frames per second (though YouTube recently announced support for 60 FPS video).

You may wonder, then, why you'd need a 120 Hz, 240 Hz or 480 Hz display for video. The cynical answer is "you don't." If the display in question is a monitor, there is usually no benefit derived from the improved refresh rate. If the display is a television, it will likely "enhance" the original content by using an algorithm to generate entirely new frames between those in the source. This technique, known as motion interpolation, can smooth video, but critics often dislike the artificial look it adds.
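
In its crudest form, motion interpolation can be pictured as blending neighboring source frames to invent the in-between ones. Real televisions use far more sophisticated motion estimation, but this toy Python sketch conveys the idea:

```python
def blend(frame_a, frame_b, weight):
    """Invent an in-between frame by mixing two source frames pixel by pixel."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

# Two consecutive 24 FPS source frames, as tiny grayscale "images"...
frame_a = [0, 50, 100, 150]
frame_b = [20, 70, 120, 170]

# ...become five displayed frames at 120 Hz, four of them invented.
for i in range(5):
    print(blend(frame_a, frame_b, i / 5))
```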

A less cynical answer, however, will correctly argue that displays with higher refresh rates tend to have better motion performance. This is not an attribute of the refresh rate itself, but a benefit derived from the fact that more expensive displays with higher refresh rates often have high-quality panels that show less ghosting and lag than their cheaper siblings. This is not quantified by any specification, however, and there's no guarantee a 120 Hz display will show better motion detail than a 60 Hz model. You need to dig into professional reviews to know the details. Even if a display claims proper reproduction of 24 FPS film cadence, for example, it may not actually deliver when tested.

A Note About Plasma

Modern plasma displays are often quoted as having a 600 Hz refresh rate. This has nothing to do with the refresh rate figures quoted by other technologies. A plasma, in order to create a picture, has to pulse individual pixels on and off extremely rapidly. Usually this happens ten times per frame, and manufacturers multiply that by the typical 60 frames per second to deliver - presto! - 600 Hz.

In fact, the comparison is apples-to-oranges, because plasmas are fundamentally different from LCD monitors and televisions. What you really need to know is this: plasmas do not suffer from the motion issues, such as ghosting, that often plague LCDs. That's because unlike an LCD, which waits until the next scheduled refresh to do anything at all, a plasma's pixels can change within a fraction of a millisecond.

Not that it matters today. Plasma, for better or worse, is a dying technology.

Conclusion

The answer to "do I need a higher refresh rate?" will always be "it depends." Let's cover a few common scenarios.

If you play games but generally don't see a frame rate higher than sixty frames per second, a higher refresh rate is not going to benefit you. You'll also see little improvement if you like to keep things capped at a V-Sync maximum of 60 FPS.

Readers interested in video quality will not see any direct correlation between a higher refresh rate and improved quality. Panels that handle higher rates may have better overall motion performance, but the refresh rate specification does not speak to that. You need to read a professional review of a television to properly understand how a set handles motion.

As such, refresh rate is mainly applicable to PC gamers who want the best motion performance possible from games. Anyone who fits into this category will find that better refresh rates are extremely important, and that goes double if you can snag a monitor and video card combo that supports an adaptive refresh rate.

Do you think about refresh rate when you buy a new display, and if you do, what do you hope a quicker rate will provide for you? Let us know in the comments.