A monitor isn’t the most exciting piece of equipment you can attach to a computer, and plummeting prices on low-end models have encouraged many buyers to opt for whatever offers a big screen at a budget price. For the average user, that’s fine; but for a gamer, that’d be a mistake.
Gaming is a primarily visual experience, after all. The display you use to play a game can not only determine its beauty, but can also determine how quickly you’re able to react to opponents. Here’s what you need to know before pulling the trigger on your next gaming monitor.
The resolution dilemma
Most monitors that a gamer will consider are going to be between 24 and 30 inches, and displays sold in that range can ship with several different resolutions. 1920×1080, also known as 1080p, has become the standard, but 2560×1440 is common on more expensive models.
A higher resolution absolutely leads to better image quality because smaller, less visible pixels create a noticeably sharper image. However, jumping from 1080p to 1440p increases total pixel count from 2,073,600 to 3,686,400. More pixels to drive means a greater strain on your video card, and even today’s best hardware struggles to run a cutting-edge game like Battlefield 4 at maximum detail and 1440p.
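The pixel-count comparison above is easy to verify yourself. This quick sketch computes the total pixels for each common gaming resolution and the relative rendering load versus 1080p (the resolution names and the "load" ratio are just illustrative; real GPU performance also depends on the game and settings):

```python
# Compare pixel counts of common gaming resolutions and the
# approximate relative GPU load implied by driving more pixels.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "16:10 (1920x1200)": (1920, 1200),
    "16:10 (2560x1600)": (2560, 1600),
}

base = 1920 * 1080  # 2,073,600 pixels at 1080p

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels "
          f"({pixels / base:.2f}x the pixels of 1080p)")
```

Running this shows that 1440p pushes roughly 1.78 times as many pixels as 1080p, which is why the jump strains a video card so noticeably.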
This means you need to consider your system’s capabilities when selecting resolution. If you have a new mid-range video card, or an older high-end video card, going 1440p would be unwise, particularly if you like to play the latest games. If you can afford the newest, quickest graphics hardware, however, 1440p is the obvious choice.
You will also run into 16:10 resolutions like 1920×1200 and 2560×1600. These resolutions don’t differ enough from their 16:9 cousins to make a big performance difference, and they’re perfectly adequate for gaming. If you play games that require a large interface, like MMOs, you may even prefer the extra real estate a 16:10 monitor provides.
Making the most of graphics
Some games offer striking visuals, and a good monitor can make them look their best without imposing any extra load on your hardware. The gap between the best and worst displays is immense and can completely change the "wow" factor of a game's graphics.
Unfortunately, determining which display is the best can be difficult. The numbers quoted by manufacturers, like dynamic contrast, are useless because there's no enforced standardization of test procedures or equipment. They may as well just make them up.
So, how do you know what’s good? Reviews! The best review sites are those that use calibration hardware to provide an objective gauge of image quality. CNET, Anandtech and TFT Central are great resources, and the latter is particularly in-depth.
You should pay particular attention to a display's contrast, black levels and color accuracy. Good results in these areas contribute to a punchy, vivid picture that can create a perception of depth. Color gamut, though it can contribute to good color accuracy, is less important because game developers often assume (correctly) that players will be using a display with a relatively narrow range of color. Brightness can also be largely ignored, unless you play your PC games in a sunlit room – in which case you should probably buy some curtains instead of a new monitor.
Games often move at a rapid pace. The player can change perspective at a moment’s notice and, in some titles, objects can zoom across the screen in the blink of an eye. This really puts a monitor to the test.
The first motion-related specification you should think about is refresh rate. Most monitors have a rate of 60 Hz (or sixty refreshes per second).
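A refresh rate translates directly into a time budget per frame, which is a quick sketch worth working through (the function name here is just illustrative):

```python
# Convert a monitor's refresh rate in Hz into the time each
# refresh is on screen, in milliseconds.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per refresh")
```

At 60 Hz the screen updates roughly every 16.7 milliseconds; doubling the rate to 120 Hz halves that window, which is why higher refresh rates make fast motion look smoother.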