When it comes to gaming, there is nothing that beats the PC Master Race. Allow me to explain. The level of customizability, as well as the raw power that a perfectly designed custom gaming rig can achieve, is something consoles can only dream of. That being said, even PC gaming is prone to certain pitfalls, and these pitfalls can ruin one’s gaming experience. If you’re an ardent gamer or someone who keeps a keen eye on gaming forums, you must surely have heard about one of the major troubles for any gamer – screen tearing. While there is a traditional solution to it in the form of V-Sync, newer technologies have brought other solutions in the form of NVIDIA’s G-Sync and AMD’s FreeSync. Today, we’ll pit these two, G-Sync vs FreeSync, against each other to see which one comes out on top. But first, let us throw some light on what exactly the problem here is.
What is Screen Tearing?
If you’ve been gaming on a rig whose monitor can’t keep up with its GPU, you must surely have come across this annoying phenomenon called screen tearing. Screen tearing is a visual artifact in which two or more frames of video are shown together in a single screen draw, causing a torn effect. You see, as GPUs become more and more powerful, they want to push as many frames as they can in the shortest span of time. While this sounds great, if your monitor’s refresh rate is fixed at, say, 75Hz, it simply cannot display every one of those frames the moment they are pushed.
For instance, consider you’re playing a game on that 75Hz monitor with a GPU that is able to push 100 frames per second. The monitor is updating itself 75 times per second, but the video card is updating the display 100 times per second, which is 33% faster than the monitor. What happens is that in the time between screen updates, the video card draws one full frame and a third of the next one. That third of the next frame overwrites the top third of the previous frame and is then drawn on the screen. The video card then finishes the last two-thirds of that frame, renders the next two-thirds of the frame after it, and then the screen updates again.
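To put that arithmetic into code, here’s a quick back-of-the-envelope calculation (a toy illustration of the mismatch, not how any driver actually behaves):

```python
# Toy arithmetic: how far ahead the GPU gets per monitor refresh.
refresh_hz = 75   # the monitor redraws 75 times per second
gpu_fps = 100     # the GPU renders 100 frames per second

frames_per_refresh = gpu_fps / refresh_hz
print(f"Frames drawn per refresh interval: {frames_per_refresh:.2f}")
# -> 1.33: one full frame plus a third of the next, so each refresh can
#    show the top third of a newer frame stitched onto an older one.
```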
You will only ever see a portion of what’s happening: a part of the current frame and a part of the next frame(s). As a result, it looks as if the picture on your screen is split into multiple parts, disrupting the entire look of the game. Tearing can also take place when the GPU is under pressure from heavy graphical processing or poor programming. When the GPU is under that much load, it can fail to keep the output video in sync with the display, causing the screen to tear.
V-Sync and The Need For An Alternative
For any gamer, screen tearing is an annoying occurrence. A perfectly rendered title can be totally ruined by ugly horizontal lines and frame stuttering. Developers soon recognized this problem and brought out V-Sync. Vertical Sync, or V-Sync, aims to solve the screen tearing issue with the help of double-buffering.
Double-buffering is a technique that mitigates the tearing problem by giving the system a frame buffer and a back buffer. Whenever the monitor grabs a frame to refresh with, it pulls it from the frame buffer. The video card draws each new frame in the back buffer, then copies it to the frame buffer when it’s done. Under V-Sync’s rules, the back buffer can’t be copied to the frame buffer until right after the monitor refreshes. So the back buffer is filled with a frame, the system waits, and after the refresh the back buffer is copied to the frame buffer and a new frame is drawn in the back buffer, effectively capping your frame rate at the refresh rate.
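Here’s a minimal sketch of that loop in Python (hypothetical function and variable names standing in for a real graphics API):

```python
import time

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def render_frame(n):
    """Stand-in for the GPU drawing frame n into the back buffer."""
    return f"frame-{n}"

frame_buffer = None  # what the monitor scans out
back_buffer = None   # what the GPU draws into

next_refresh = time.monotonic() + REFRESH_INTERVAL
for n in range(5):
    back_buffer = render_frame(n)                  # GPU finishes a frame
    # V-Sync rule: hold the finished frame until the next refresh,
    # which is exactly what caps the frame rate at the refresh rate.
    time.sleep(max(0.0, next_refresh - time.monotonic()))
    frame_buffer = back_buffer                     # swap on the vertical blank
    next_refresh += REFRESH_INTERVAL
    print(f"monitor shows {frame_buffer}")
```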
While all this sounds good and does remove screen tearing, V-Sync comes with its own set of drawbacks. Under V-Sync, your frame rate can only take a discrete set of values equal to (Refresh / N), where N is some positive integer. For example, if your monitor’s refresh rate is 60Hz, the frame rates your system can deliver are 60, 30, 20, 15, 12 and so on. As you can see, the drop from 60 fps to 30 fps is a big one. Worse, any frame rate between 60 and 30 that your system could otherwise push gets dropped down to 30.
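That quantization is easy to see in a couple of lines (a simplified model of double-buffered V-Sync that ignores frame-to-frame variance):

```python
import math

def vsync_fps(gpu_fps, refresh_hz=60):
    """Effective frame rate under double-buffered V-Sync: refresh / N,
    where N is the number of refresh intervals each frame must wait."""
    n = math.ceil(refresh_hz / gpu_fps)  # whole refreshes per frame
    return refresh_hz / n

for fps in (100, 60, 59, 45, 31, 25):
    print(f"GPU renders {fps} fps -> display shows {vsync_fps(fps):.0f} fps")
# A GPU averaging 59 fps and one averaging 31 fps both get pinned at 30 fps.
```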
Furthermore, the biggest problem with V-Sync is input lag. As mentioned above, under V-Sync the frames that the GPU wants to push are first held in the back buffer and are sent to the frame buffer only when the monitor allows it. This means that whatever input you give to the system is baked into frames sitting in that back buffer. Only when those frames get written to the frame buffer will your input show up on screen. As a result, systems can suffer input lag of up to 30 ms, which can really disrupt your gaming experience.
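As a rough sanity check on that figure (a simplified worst case that counts only the buffering delay, nothing else in the input pipeline):

```python
# Rough worst case on a 60Hz panel: a rendered frame can wait in the
# back buffer for up to two refresh intervals before it reaches the screen.
refresh_hz = 60
refresh_interval_ms = 1000 / refresh_hz       # ~16.7 ms per refresh
worst_case_lag_ms = 2 * refresh_interval_ms   # ~33 ms of added delay
print(f"Worst-case added input lag: {worst_case_lag_ms:.0f} ms")
```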
The Alternatives: G-Sync and FreeSync
You see, be it traditionally or with the help of V-Sync, it has always been the monitor that caused the issues. The main power has always rested with the monitor, and it has used that power to limit the frames being pushed to it. No matter how many software-level changes you make, the hardware will always have its limits. But what if there was a different solution, one that handed the supreme power to the GPU instead? Cue variable refresh rate monitors.
As the name suggests, variable refresh rate monitors are displays with a maximum refresh rate cap but no fixed refresh rate. Instead, they rely on the GPU to dictate when they refresh. This feat is achieved with the help of one of two technologies – NVIDIA G-Sync or AMD FreeSync.
Launched back in 2013, NVIDIA’s G-Sync aims to solve the problem by giving the GPU the final say in how many frames are pushed onto the screen. The monitor, rather than having a fixed refresh rate, adapts to the GPU’s processing speed and matches the output frame rate. So if, for example, you’re playing a game at 120 fps, your monitor will also refresh at 120Hz (120 times per second). And in a graphically demanding scene where your GPU drops to 30 fps, the monitor will accordingly change its refresh rate to 30Hz. No frames are lost, and the data is pushed directly to the display, eradicating any scope for tearing or input lag.
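Conceptually, this flips the V-Sync loop shown earlier: the panel now waits on the GPU. Here’s a toy sketch of that relationship (made-up numbers and names, not NVIDIA’s actual protocol):

```python
import random
import time

MIN_HZ, MAX_HZ = 30, 144  # refresh window the panel physically supports

def gpu_render():
    """Stand-in for rendering one frame; time varies with scene load."""
    frame_time = random.uniform(1 / 120, 1 / 40)
    time.sleep(frame_time)
    return frame_time

for _ in range(5):
    frame_time = gpu_render()
    # With a variable refresh rate, the panel redraws the moment the
    # frame arrives, clamped to the range it supports.
    effective_hz = max(MIN_HZ, min(MAX_HZ, 1 / frame_time))
    print(f"frame took {frame_time * 1000:.1f} ms -> panel refreshes at {effective_hz:.0f}Hz")
```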
Now, while NVIDIA is the king when it comes to gaming, its biggest competitor AMD is not far behind. So when NVIDIA brought out G-Sync, how could AMD sit idle? To stay in the competition, AMD brought out its own alternative to V-Sync – FreeSync. Released in 2015, AMD’s FreeSync works on the same principle as NVIDIA’s G-Sync, letting the GPU be the master and control the refresh rate of the monitor. While the aim of both G-Sync and FreeSync is the same, the difference between the two lies in how they go about achieving it.
G-Sync vs FreeSync: How Do They Work?
NVIDIA designed G-Sync to fix the problem at both ends. G-Sync is a proprietary adaptive sync technology, which means it relies on an extra hardware module. This additional chip is built into every supported monitor, and it allows NVIDIA to fine-tune the experience based on each monitor’s characteristics, such as its maximum refresh rate, panel type (IPS or TN), and voltage. Even when your frame rate gets super low or super high, G-Sync can keep your game looking smooth.
AMD’s FreeSync requires no such module. In 2014, VESA announced Adaptive-Sync as an optional component of the DisplayPort 1.2a specification, and FreeSync uses the DisplayPort Adaptive-Sync protocol to let the GPU take control of the refresh rate. AMD later extended FreeSync support to HDMI as well, making it appealing to a larger number of consumers.
Ghosting
In the context of displays, ghosting describes an artifact caused by slow response time. As the screen refreshes, the human eye still perceives the previously displayed image, causing a smearing or blurring effect. Response time is a measure of how fast a given pixel can change state from one color to another. If your display’s response time can’t keep up with the frames the GPU is pushing, you are likely to experience ghosting. The effect is prominent on many LCD and flat-screen panels. While it isn’t screen tearing as such, ghosting is not far from the concept, since new frames are overlaid on previous frames before those have completely disappeared from the screen.
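To make that concrete, here’s a toy model (purely illustrative numbers and a simplified decay assumption, not a real panel measurement) of how much of the old image still lingers when the next frame lands:

```python
import math

def residual(response_time_ms, frame_interval_ms):
    """Fraction of the old image still visible when the next frame arrives,
    modeling the pixel transition as a simple exponential decay."""
    return math.exp(-frame_interval_ms / response_time_ms)

frame_interval = 1000 / 144   # ~6.9 ms between frames at 144 fps
for rt in (1, 5, 10):         # panel response times in ms
    print(f"{rt} ms panel: {residual(rt, frame_interval):.0%} of the old frame lingers")
# The slower the panel, the more of the previous frame trails behind: ghosting.
```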
Since G-Sync operates with the help of an add-on hardware module, NVIDIA can prevent ghosting by customizing how that module drives each and every monitor. With AMD’s FreeSync, these adjustments are made within the Radeon driver itself, taking the task away from the monitor. As you can see, it is hardware versus software control here, and NVIDIA wins comfortably. While ghosting isn’t common on FreeSync monitors, it does still occur. G-Sync panels, on the other hand, show no ghosting, since each monitor is physically tweaked and tuned.
Flexibility
In the pursuit of solving screen tearing, the solution has been to give ultimate control to the GPU. But as Uncle Ben once said, “With great power comes great responsibility”. In this case, the GPU more or less takes all the power away from the monitor. For example, most monitors, aside from normal brightness and contrast adjustments, also come with their own features that let the display dynamically adjust its settings based on the input being supplied to it.
Since NVIDIA’s G-Sync makes use of an extra proprietary module, it takes this function away from the display, handing dynamic adjustment over to the GPU. AMD’s FreeSync, on the other hand, makes no such changes and lets the screen keep a dynamic color adjustment feature of its own. Offering such custom features matters to monitor manufacturers, as it helps them gain an edge over the competition. That is why many manufacturers prefer FreeSync over G-Sync.
G-Sync vs FreeSync: Compatible Devices
For any monitor to be compatible with NVIDIA’s G-Sync, it must embed NVIDIA’s proprietary chip within its display. On the other hand, AMD’s FreeSync can be used by any monitor that has a variable refresh rate and either a DisplayPort or an HDMI port.
That being said, your GPU also needs to be compatible with the respective technology (yes, you cannot mix and match one manufacturer’s GPU with the other’s sync technique). Having been introduced almost two years before its competitor, NVIDIA’s G-Sync has a rather long list of supported GPUs. All the mid-to-high-end GPUs from the 600 through 1000 series carry the G-Sync mark.
Comparatively, at the time of this writing, AMD supports only 9 GPUs that can make use of FreeSync, compared to NVIDIA’s 33. Furthermore, NVIDIA has extended G-Sync support to laptops and notebooks, a feature currently missing from AMD’s FreeSync.
NVIDIA G-Sync Compatible Devices
| GTX 600 Series | GTX 700 Series | GTX 900 Series | GTX 1000 Series | Titan Series |
|---|---|---|---|---|
| GeForce GTX 650 Ti Boost | GeForce GTX 745 | GeForce GTX 950 | GeForce GTX 1050 | GeForce GTX Titan |
| GeForce GTX 660 | GeForce GTX 750 | GeForce GTX 960 | GeForce GTX 1050 Ti | GeForce GTX Titan Black |
| GeForce GTX 660 Ti | GeForce GTX 750 Ti | GeForce GTX 965M | GeForce GTX 1060 | GeForce GTX Titan X |
| GeForce GTX 670 | GeForce GTX 760 | GeForce GTX 970 | GeForce GTX 1070 | GeForce GTX Titan Xp |
| GeForce GTX 680 | GeForce GTX 770 | GeForce GTX 970M | GeForce GTX 1080 | GeForce GTX Titan Z |
| GeForce GTX 690 | GeForce GTX 780 | GeForce GTX 980 | GeForce GTX 1080 Ti | |
| | GeForce GTX 780 Ti | GeForce GTX 980M | | |
| | | GeForce GTX 980 Ti | | |
AMD FreeSync Compatible Devices
| GPUs | APUs |
|---|---|
| Radeon R7 260X | Kaveri |
| Radeon R7 360 | Kabini |
| Radeon R9 285 | Temash |
| Radeon R9 290 | Beema |
| Radeon R9 290X | Mullins |
| Radeon R9 380 | Carrizo |
| Radeon R9 390 | Bristol Ridge |
| Radeon R9 390X | Raven Ridge |
| Radeon R9 Fury X | |
Design Cost and Availability
NVIDIA’s G-Sync makes use of an extra proprietary hardware module, which basically means display makers have to make more room inside the monitor enclosure. While that may not seem like a big deal, creating a custom product design for one type of monitor raises development costs considerably. AMD’s approach, on the other hand, is much more open: display makers can include the technology in their existing designs.
To show you a bigger picture (no pun intended), LG’s 34-inch ultrawide monitor with FreeSync support will cost you only $397, whereas LG’s 34-inch alternative with G-Sync support, one of the cheapest G-Sync ultrawides currently available, will set you back $997. That’s almost a $600 difference, which can easily be the deciding factor in your next purchase.
G-Sync vs FreeSync: The Best Variable Refresh Rate Solution?
Both NVIDIA G-Sync and AMD FreeSync successfully eradicate the problem of screen tearing. While G-Sync is definitely the more expensive technology, it is supported on a wider range of GPUs and offers zero ghosting as well. AMD’s FreeSync, on the other hand, aims to provide a cheaper alternative, and while the number of monitors that support it is quite high, not many mainstream GPUs are supported as of now. Ultimately, the choice is in your hands, though you can’t go wrong with either of the two. Tell us about any other queries you may have in the comments section below, and we’ll try our best to help you out.