NVIDIA Just Killed The Need For Expensive G-Sync Monitors

January 08, 2019

G-Sync, a name synonymous with overpriced, proprietary, miniaturized-Death-Star-tech monitors, is now a dead meme. Nobody needs them anymore. NVIDIA just killed them off, or more specifically, the need for the expensive proprietary chips inside them. At CES today, the company announced that much cheaper FreeSync monitors, with nothing proprietary inside them, will now work exactly the same way with no mods needed.

Amid other GPU-specific news at CES today, like RTX GPUs for laptops and the RTX 2060 for desktops, NVIDIA also dropped this truth bomb: monitors with Variable Refresh Rate (VRR) capabilities using the DisplayPort Adaptive-Sync protocol will soon be able to turn G-Sync on straight out of the box.

By definition, those FreeSync monitors you see just about everywhere, which cost companies virtually nothing to implement, will all be compatible because they’re all based on Adaptive-Sync. FreeSync is royalty-free and free for anyone to use, as opposed to NVIDIA’s G-Sync, which forces manufacturers to buy an expensive proprietary chip to run.

For years NVIDIA has told us that the hefty “G-Sync tax” we had to pay, around US$200 on average versus an equivalent FreeSync monitor, was essential to the performance and image quality of these displays. But now they’ve pretty much admitted we’ve been drinking too much of their Kool-Aid, because FreeSync could do the same all along.

That tax is now completely optional, for anyone who hasn’t already paid it: you no longer need the proprietary G-Sync chip, so you no longer have to pay for it.

Only one word could describe G-Sync Monitors: EXPENSIVE

What’s more, FreeSync was developed by AMD, NVIDIA’s primary competitor in the graphics card market. And until now, you weren’t allowed to mix and match the two: if you had an NVIDIA GPU, only a G-Sync monitor would give you any sync technology, and with an AMD GPU you’d want a FreeSync monitor for compatibility.

All three of these sync technologies coordinate the rate at which your monitor refreshes the screen with the rate at which your PC renders game frames. If your game runs slower than the display’s refresh rate, the display slows down to match it, preventing uneven pacing between refreshes. If your game runs faster than the display can go, adaptive sync caps it at the panel’s maximum rate, so no processing power is wasted and no half-drawn frames overlap on screen as tearing.
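To make that behavior concrete, here’s a minimal sketch in Python. This isn’t any real driver or monitor API, just an illustration of the pacing rule described above; the 144 Hz panel figure and all the names in it are assumptions for the example.

```python
# A minimal, purely illustrative sketch (not a real driver or display API)
# of the scheduling idea behind adaptive sync: the display refreshes when
# the GPU finishes a frame, instead of on a fixed clock.

DISPLAY_MAX_HZ = 144                      # assumed panel maximum refresh rate
MIN_REFRESH_INTERVAL = 1.0 / DISPLAY_MAX_HZ

def refresh_interval(frame_time_s: float) -> float:
    """Return how long the display waits between refreshes for a frame
    that took `frame_time_s` seconds to render."""
    if frame_time_s >= MIN_REFRESH_INTERVAL:
        # Game is slower than the panel's max rate: the display waits for
        # the finished frame, so nothing tears and no refresh lands at an
        # uneven interval.
        return frame_time_s
    # Game is faster than the panel: refreshes are capped at the panel's
    # maximum rate, so extra frames aren't drawn halfway mid-refresh.
    return MIN_REFRESH_INTERVAL

# Example: a 90 fps game on a 144 Hz panel refreshes at 90 Hz,
# while a 200 fps game is held to the panel's 144 Hz ceiling.
for fps in (90, 200):
    hz = 1.0 / refresh_interval(1.0 / fps)
    print(f"{fps} fps game -> display refreshes at {hz:.0f} Hz")
```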

Now, because of the extremely high cost of G-Sync displays, people were forced to either pay the G-Sync tax, go the all-AMD route, or opt out of adaptive sync altogether. I like to call that last choice “NoSync”.

FreeSync-compatible monitors are much more affordable and offer far more variety

This was a big deal for consumers, especially once NVIDIA’s dominance in the graphics card market became apparent with its 10-series Pascal GPUs, something AMD arguably still can’t match, though it hopes to soon.

This bit of news essentially makes all of these tangentially AMD-powered FreeSync monitors G-Sync compatible. NVIDIA says it has tested 400 of them and claims that only 12 are good enough to certify; those will have G-Sync turned on by default. For everything else, you can still enable G-Sync manually in the options once the NVIDIA graphics driver update arrives on January 15th. The feature will work on 10-series GTX cards and, of course, the newer 20-series RTX lineup.

Admittedly we can’t immediately tell if this is a good thing or a bad thing for either brand.

NVIDIA users now get their pick of the lot of FreeSync monitors to accompany their green-team GPU. This disincentivizes people from building AMD systems, because you no longer need an AMD GPU paired with a FreeSync monitor to get adaptive sync.

For AMD, I would call this a moral victory. They developed FreeSync and never asked for money to use it. For a while, one could argue that those G-Sync chips justified the monitors’ price by adding image quality and performance. But now we know we could have gotten by just fine had NVIDIA simply let us use G-Sync without the chip and the certification.

It’s clear this is a win for the consumer, though, unless that consumer bought a G-Sync monitor recently. If so, I imagine they’re rightfully pissed at the moment.

NVIDIA, however, isn’t really killing the G-Sync brand. Admittedly, they do maintain stringent quality control for displays that carry those chips, and they’re doing something more in that space by certifying higher-end G-Sync technologies. There are currently three tiers: G-Sync Ultimate, standard G-Sync, and G-Sync Compatible.

Compatible means just that: you get no flicker, no blanking, and no artifacts. Standard G-Sync adds the 300+ image quality tests the monitor has supposedly passed, and Ultimate means it meets their 1,000-nit HDR standard with advanced image processing.

Honestly, though, most of us just need the Compatible tag so we can build our affordable 144 Hz FreeSync rig around an NVIDIA GPU in peace, without paying an exorbitant tax.

As if we didn’t have enough taxes to pay these days already.

Source: NVIDIA
