High-definition televisions are great for viewing Blu-ray movies, but that’s hardly their only purpose. They also serve as gaming displays for millions of people. You’d expect, then, that gaming performance would be a major focus – but it often isn’t. Some manufacturers pay only the slightest attention to how their television handles a game, leaving it up to console makers and game developers to accommodate the display’s flaws.
The HDTV you purchase for gaming matters, and it affects more than just image quality. Response time, refresh rate and preset modes can also change how you experience a game. Unfortunately, manufacturers do a poor job of explaining what features and settings do and how they might impact gameplay. Let me help you cut through the bull and get down to what really matters.
Resolution – 720p, 1080p or 4K?
Gamers looking to purchase a television may think resolution is extremely important because a higher resolution is generally linked to better image quality. In fact, resolution is not quite as important as you might think.
Console games, unlike their PC brethren, usually render at a reduced resolution and are then up-scaled by both the console and the television. The console version of Diablo III, for example, runs at a native resolution of 1120×584. Reducing the resolution can help a game achieve its performance target without sacrificing features the developer wants, like physics calculations or particle effects.
This does not mean that a 1120×584 game will look the same on a 720p television as it does on a 1080p television – the problem is more complex than that. Different sets have different image processing chips, which in turn output different results, and a higher resolution will translate to a sharper image in most situations.
The point is not that all resolutions look the same, but rather that resolution isn’t worth much thought. You should simply purchase the best available in your price range, which for most people will mean 1080p.
This is assuming that you own a current-gen console or plan to buy a next-gen console. Gamers with an old game console (like a PlayStation 2) may want to stick with 720p, as the extreme scaling required to display an old console’s output on a 1080p or 4K television can kill image quality.
Input Lag – Hard To Judge, But Very Important
Games are interactive media, which means the way hardware responds to the user is absolutely critical. If the time between player input and on-screen reaction is too long, the player may become frustrated by controls that seem sluggish or ineffective – yet the game itself is not at fault.
Unfortunately, the fancy image processing that modern televisions perform to improve image quality can also tack on extra response time. The fastest sets available today have an input lag of about 16 milliseconds, while the slowest can exceed 100 milliseconds. That’s a huge difference, and it can absolutely have a negative impact on gameplay.
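To put those milliseconds in perspective, a 60 FPS game delivers a new frame roughly every 16.7 milliseconds, so input lag can be expressed as whole frames of delay. Here’s a rough sketch of that arithmetic (the 16 ms and 100 ms figures are just the examples above, not measurements of any particular set):

```python
# Express a television's input lag as frames of delay at 60 FPS.
FRAME_TIME_MS = 1000 / 60  # ~16.7 ms per frame at 60 frames per second

def lag_in_frames(input_lag_ms, frame_time_ms=FRAME_TIME_MS):
    """Return input lag expressed as (fractional) frames of delay."""
    return input_lag_ms / frame_time_ms

# The figures mentioned above: a fast set vs. a slow one.
for lag_ms in (16, 100):
    print(f"{lag_ms} ms lag is about {lag_in_frames(lag_ms):.1f} frames at 60 FPS")
```

In other words, a fast set shows your input about one frame late, while a slow set can be a full six frames behind – a delay many players can feel directly.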
Many televisions offer a “game mode” which turns off image quality features to reduce input lag. While game mode usually does what it says, turning off features can compromise image quality, and some televisions disable image adjustments when game mode is on. Gamers should play with a set in-store to see what features become unavailable with game mode turned on, or read reviews that discuss the issue.
There’s no easy way to gauge input lag in-store, and manufacturers generally don’t report a number. The only way to know input lag is to read reviews or visit DisplayLag.com, a website that catalogs input lag results.
Refresh Rate – 60, 120, 240 or 600 Hz?
Refresh rate is the number of times per second that the television can update its image. Technically, a higher refresh rate is better, but this advantage is generally irrelevant for gamers because console games almost never play at a frame rate greater than 60 frames per second, and many target 30 FPS instead. This means the game is only sending a new frame to your television 60 times per second (or less) so there’s no advantage to be gained with a display that can refresh more frequently.
Televisions with a high refresh rate have the ability to make up for a low source frame rate by inserting new, dynamically generated frames in between those sent by the source device. This feature introduces significant input lag, however, so it’s typically turned off when a television is used in game mode. There’s also some viewer dissent about whether the resulting ultra-smooth image is preferable; some have taken to calling it the “soap opera effect.”
Image Retention Worries Are Valid, But Overblown
Plasmas are in many ways the perfect gaming display. They have great image quality, almost no motion blur, and usually score well in input lag tests. There’s just one problem: image retention.
Image retention is the plasma’s tendency to retain a “ghost” of an image after the frame has changed. This typically occurs when a static image has been displayed consistently for ten minutes or more and is more likely to happen with a high-contrast image. This is a concern for gamers because most games have a head-up display which remains fixed on-screen.
Though generally a minor concern, image retention is still noticeable and can become a problem after very long periods of time. After playing Diablo III for about twenty hours on my own plasma, for example, I noticed that the status and attack icons had left behind a slight shadow. The effect was difficult to see on anything but a pure-white screen, and disappeared as I played other games, but it could have turned into a more serious issue if Diablo III had continued to hog the majority of my time.
Most gamers won’t have to worry and can rest well knowing that retention, if it does appear, will go away with time. However, gamers purchasing a television only for gaming, who also have a tendency to play the same title for tens or hundreds of hours, should avoid plasma.
Does Size Matter?
There’s just one more issue to cover, and for many, it’s the most important: size.
Televisions come in sizes as small as 22 inches or as large as 92 inches (and larger, counting some special-order models). Average size has been on a bit of an upward trend as HDTVs have dropped in price, and a $1,000 budget can now purchase up to 60 inches of glorious display.
Size should not be dictated by desire, however, but instead by distance. The closer you intend to sit to a display, the smaller you’ll want it to be. Why? Because sitting closer makes individual pixels easier to identify. A 1080p 50-incher looks great from six feet away. But from a foot away? Not so much!
There’s no hard-and-fast rule for determining optimal distance, but there are some calculators, such as those from HDTV Test, My Home Theater and Samsung’s “Find the Perfect HDTV” wizard. You also should keep in mind mounting size and weight. A large television often has a big base and can weigh 60 to 100 pounds.
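For the curious, calculators like these generally rest on simple trigonometry: a viewer with 20/20 vision resolves detail down to about one arcminute, so the distance that matters is the point at which a single pixel shrinks below that angle. Here’s a rough sketch of that calculation (the 16:9 geometry and the one-arcminute acuity figure are standard assumptions, not numbers from any particular calculator):

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_pixels=1920, aspect=(16, 9)):
    """Distance (in feet) beyond which 20/20 vision can no longer resolve
    individual pixels, i.e. one pixel subtends less than one arcminute."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)    # screen width in inches
    pixel_pitch_in = width_in / horizontal_pixels    # size of one pixel
    one_arcmin_rad = math.radians(1 / 60)            # acuity limit as an angle
    return pixel_pitch_in / math.tan(one_arcmin_rad) / 12  # inches -> feet

print(f"50-inch 1080p: pixels blend at about {max_useful_distance_ft(50):.1f} ft")
```

Sit closer than that figure and pixels start to become visible; sit much farther away and the extra resolution is wasted – which is why a 50-inch 1080p set looks its best from roughly six feet.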
So, to summarize: a gaming television will ideally be a 1080p set with no noticeable motion blur, low input lag and a game mode that doesn’t turn off all the set’s image quality features. A TV that checks all these boxes can be hard to find, but Sony’s KDL series is a good place to start, as are most Panasonic plasmas. Reviews will be crucial to making your decision because judging input lag in-store usually isn’t possible.
Do you have any tips for gamers who need a new television? Be sure to leave them in the comments!
Image Credit: FlatPanelsHD, Gametrailer Forums