Between Nvidia's G-Sync and AMD's FreeSync, variable refresh monitors--which sync the refresh rate of your monitor and your graphics card, to eliminate stuttering and screen tearing--are finally here in full force, and let me tell you, the results are absolutely glorious.
Well, unless you're the owner of a system powered by an AMD Radeon Crossfire setup with multiple graphics cards, that is.
When AMD and its hardware partners launched the first FreeSync monitors in March, Crossfire setups weren't supported, but AMD promised that support would arrive by the end of April. Well, the end of April has come and gone, and Crossfire support still isn't here.
"After vigorous QA testing, however, it's now clear to us that support for AMD FreeSync monitors on a multi-GPU system is not quite ready for release," AMD stated in a forum post first noticed by AnandTech. "As it is our ultimate goal to give AMD customers an ideal experience when using our products, we must announce a delay of the AMD Catalyst driver that would offer this support. We will continue to develop and test this solution in accordance with our stringent quality standards, and we will provide another update when it is ready for release."
The impact on you at home: Without Crossfire support, enabling FreeSync means you're limited to using a single Radeon graphics card, rendering the others inert. Getting multi-GPU setups working correctly is always tricky from a technical standpoint, but it must be disheartening for Crossfire users--who, by their very nature, are some of AMD's most loyal customers--to be kneecapped by the AMD-exclusive FreeSync technology right out of the gate.
Nevertheless, it's better to wait for a solid Crossfire support implementation than for AMD to rush out something borked and buggy.
One of these things is not like the other
AMD and Nvidia's respective approaches to variable refresh technology differ.
While AMD's FreeSync is an optional protocol built atop DisplayPort 1.2a's Adaptive-Sync specification, Nvidia's G-Sync requires manufacturers to plop a proprietary hardware module into their displays. Those modules cost monitor manufacturers an extra $100 to $150 or so, though official pricing has never been released.
Both work like a charm as long as your graphics card is pumping out frame rates within the monitor's specified variable refresh rate window--for example, 48Hz to 75Hz for LG's 34-inch 34UM67 IPS display--though superb, extensive testing by PC Perspective has shown that G-Sync's implementation is superior when frame rates drop below the minimum. Dropping below the minimum shouldn't be an issue with multi-GPU Crossfire setups, but depending on your graphics card, consistently staying above a 48-fps floor could be difficult in some modern games when you're running a single Radeon with FreeSync.
But none of those technical details address my true worry about these variable refresh rate monitors: the lack of a common standard. Unlike graphics cards, which PC gaming enthusiasts swap out every few years, monitor purchases are for the long haul. When you're dropping dough on a display--especially a pricier FreeSync or G-Sync monitor--you're expecting to use it for 5 to 10 years in most cases. Given FreeSync's and G-Sync's reliance on different underlying technologies, ponying up for one of these displays essentially locks you into Radeon- or GeForce-brand graphics cards for the life of the screen.
That's troubling. And that's why, as deeply enthusiastic as I am about variable refresh rate monitors--using one is an utter revelation, and far more beneficial to everyday gamers than 4K displays--I'm reluctant to recommend a FreeSync or G-Sync display to anyone except true Nvidia or AMD diehards, even once all these bugs are worked out.