Affiliate Disclosure: By buying the products we recommend, you help keep the site alive.
NVIDIA sits at the forefront of the graphics we see in video games. It designs the technology used in some of the most popular graphics processors, pushing gaming graphics to a higher level. So, it’s hard not to pay attention to NVIDIA’s G-SYNC technology.
Of course, like any new technology, this one is a little confusing. We’re here to explain exactly what G-SYNC is, what you need to use it, and whether it’s something you should be running out to get right now.
Before we get into G-SYNC and its technicalities, it's best to understand how a monitor and a GPU communicate. The image on a monitor is redrawn at a fixed rate, known as the refresh rate. Commonly, monitors have a refresh rate of 60Hz, which means the screen redraws sixty times a second.
GPUs, on the other hand, render frames at variable rates. Traditionally, the GPU's output is fed to the monitor regardless of whether the two rates are synchronized. So if the GPU renders frames faster than a 60Hz monitor can refresh, the monitor will at some point draw parts of two or more frames in a single refresh cycle, producing a visible tear line. This is known as screen tearing.
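The timing mismatch is easy to see with a little arithmetic. This Python sketch (purely illustrative, not anything NVIDIA ships) compares a 60Hz monitor's refresh interval with the frame times of a GPU running faster than 60 frames per second:

```python
# A 60Hz monitor redraws the screen at a fixed interval.
refresh_hz = 60
refresh_interval_ms = 1000 / refresh_hz  # roughly 16.67 ms per redraw

# A GPU rendering 90 frames per second finishes frames faster than that.
gpu_fps = 90
frame_time_ms = 1000 / gpu_fps  # roughly 11.11 ms per frame

# More than one frame completed per refresh means the monitor can end up
# showing parts of two different frames in a single redraw, which appears
# on screen as a tear line.
frames_per_refresh = refresh_interval_ms / frame_time_ms
print(f"Refresh interval:   {refresh_interval_ms:.2f} ms")
print(f"GPU frame time:     {frame_time_ms:.2f} ms")
print(f"Frames per refresh: {frames_per_refresh:.2f}")
```

Because a new frame finishes partway through almost every refresh cycle here, the tear line drifts around the screen instead of staying in one place, which is why tearing is so distracting in motion.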
If you’re familiar with the terms used to describe PC gaming settings, you’ve probably heard of VSync. This setting forces the GPU to sync with the monitor’s refresh rate, preventing screen tearing. However, it introduces input lag and stuttering, problems that can be truly devastating in any game where speed and timing are critical to success. Although VSync can fix screen tearing, you end up trading one visual problem for another, which isn’t much of an improvement.
What Is G-SYNC?
Basically, G-SYNC is a technology designed to solve all of that and create far smoother motion for PC games. What it does sounds simple: it forces the monitor to refresh at the same rate as the GPU. So unlike traditional monitors that refresh at fixed rates, say 60Hz, 100Hz, or 120Hz, a G-SYNC-capable monitor syncs its refresh rate to the GPU’s variable output, which could be anywhere between 30 and 144 frames per second.
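That 30-to-144 range translates into a wide spread of per-frame times. A quick back-of-the-envelope calculation (a rough illustration, not NVIDIA's specification) shows how long a variable-refresh monitor waits between redraws at different frame rates:

```python
# With a variable refresh rate, the monitor redraws when a frame arrives
# rather than on a fixed schedule. The gap between redraws therefore
# stretches and shrinks with the GPU's output.
for fps in (30, 60, 100, 120, 144):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:.2f} ms between refreshes")
```

At 30 frames per second the monitor waits roughly 33 ms for each frame; at 144 it waits under 7 ms. Because every redraw lines up with a completed frame, there is never a half-finished frame on screen to tear.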
In order to accomplish that, a hardware logic board is added to the monitor that communicates with the graphics card to keep everything locked in sync. Because the graphics card communicates directly with the monitor, the technology promises to do away with screen tearing entirely.
It all sounds like a bit of magic, but really all the G-SYNC module is doing is forcing everything to work together. Instead of the monitor doing its own thing while the graphics card does its best to keep up, G-SYNC keeps everything working together, which results in a smooth experience with no input lag.
What Do You Need To Use It?
To start with, you need a graphics card that actually supports G-SYNC. We aren’t talking about top-of-the-line cards here, but you do need a relatively recent model; the oldest supported card is the GeForce GTX 650 Ti BOOST. However, you won’t find many of those for sale, so the cheapest model you can reasonably buy is the GTX 750, which Amazon is listing for $115 at the time of writing. That’s a reasonable price for a technology that can greatly improve your gaming experience. Check out NVIDIA’s website for a full list of supported cards.
While you may get the impression that running G-SYNC isn’t expensive, at least from a graphics card perspective, the monitors are unfortunately not as affordable, and they are actually quite hard to come by. Because this is a very new technology, not many companies have jumped on and started selling devices with G-SYNC support just yet. In fact, most of what you will find is a modified ASUS VG248QE. You can also get a DIY kit and install it in your own VG248QE, but that requires a little more technical skill than simply buying a modified one from one of the sites listed by NVIDIA.
NVIDIA did announce a number of models that were supposed to have been released, but most are not available from stores. The original press release cited a worldwide release in Q3, so you may soon see some models from companies like Acer, AOC, ASUS, BenQ, and Philips.
As of this writing, the Philips 272G5DYEB is available on Amazon for $600. Acer’s XB270H 27-inch Full HD display is available for $600 as well. BenQ offers a hybrid G-SYNC monitor that also supports its own proprietary syncing technology — the BenQ XL2420G can be had for $540.
Is It Ready For Prime Time?
Right now, the technology is still young, and not many people have had the chance to experience it. In some previews around the web, reviewers claim they can’t go back to a traditional monitor, but it’s one of those situations where you won’t miss what you’ve never had. At this point, unless you want to be on the forefront of a still-developing technology, you might be better off waiting until more monitors are released. Currently, your choices are very limited, which means you’ll be paying a premium. Given some time, you’ll have a wider range of monitors to choose from.
One small problem reported by early G-SYNC adopters is a slight reduction in frame rate. Users may see a 3-5 percent drop in frames per second depending on how hard a game is pushing the GPU. For frame rate junkies, this could be a deal breaker. NVIDIA is working on the issue, and as time goes on, we expect any performance trade-offs to disappear.
In the end, G-SYNC promises to be one of the biggest changes to the way we play games, but it’s still new. Few monitors are out there to purchase, and while the graphics cards are reasonably priced, the monitors are quite expensive. Still, if you’re a PC gamer, you need to keep an eye on G-SYNC, because it just might change the world of PC gaming as we know it.
Are you interested in G-SYNC technology? Are you going to run out and get a monitor as soon as you can? Hit the comments section and let us know!
Image Credit: Vanessaezekowitz via Wikimedia Commons