Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, when it comes to games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows the previous frame while the bottom portion shows the next frame (or frames, in some cases).
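
To make the trade-off concrete, here’s a minimal sketch – our own illustration with hypothetical frame times, not anything from AMD or NVIDIA – of how a fixed 60Hz refresh turns variable frame times into uneven on-screen frame durations (stutter), while an adaptive refresh simply shows each frame for as long as it took to render:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60Hz display: ~16.7ms per refresh

# Hypothetical GPU frame render times in milliseconds
frame_times_ms = [14, 18, 15, 21, 16]

def fixed_refresh(frame_times):
    """With v-sync on a fixed 60Hz panel, a frame that misses the ~16.7ms
    deadline waits for the next refresh, so the previous frame stays on
    screen for a whole extra interval (perceived as stutter)."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            for t in frame_times]

def adaptive_refresh(frame_times):
    """With adaptive refresh, the display refreshes the moment a frame is
    ready (within the panel's supported range), so each frame is shown
    for exactly as long as it took to render."""
    return list(frame_times)

print("fixed 60Hz :", [round(ms, 1) for ms in fixed_refresh(frame_times_ms)])
print("adaptive   :", [round(ms, 1) for ms in adaptive_refresh(frame_times_ms)])
# fixed 60Hz : [16.7, 33.3, 16.7, 33.3, 16.7]  <- the 18ms and 21ms frames double the wait
# adaptive   : [14, 18, 15, 21, 16]
```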

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher-spec displays like the 1440p144 ASUS ROG Swift cost $759, compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500, whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology, DisplayPort Adaptive-Sync, as an open part of DisplayPort 1.2a, and AMD isn’t collecting any royalties from the technology. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.

Comments

  • JarredWalton - Friday, March 20, 2015 - link

    FYI, ghosting is a factor of the display and firmware, not of the inherent technology. So while it's valid to say, "The LG FreeSync display has ghosting..." you shouldn't by extension imply FreeSync in and of itself is the cause of ghosting.
  • chizow - Friday, March 20, 2015 - link

    So are you saying a firmware flash is going to fix this, essentially for free? Yes that is a bit of a troll but you get the picture. Stop making excuses for AMD and ask these questions to them and panel makers, on record, for real answers. All this conjecture and excuse-making is honestly a disservice to your readers who are going to make some massive investment (not really) into a panel that I would consider completely unusable.

    You remember that Gateway FPD2485W that you did a fantastic review of a few years ago? Would you go back to that as your primary gaming monitor today? Then why dismiss this problem with FreeSync circa 2015?
  • chizow - Friday, March 20, 2015 - link

    Who said no ghosting? lol. There's lots of ghosting, on the FreeSync panels.
  • TheJian - Sunday, March 22, 2015 - link

    You're assuming gsync stays the same price forever. So scalers can improve pricing (in your mind) to zero over time, but NV's will never shrink, get better revs etc...LOL. OK. Also you assume they can't just lower the price any day of the week if desired. Microsoft just decided to give away Windows 10 (only to slow android but still). This is the kind of thing a company can do when they have 3.7B in the bank and no debt (NV, they have debt but if paid off, they'd have ~3.7b left). They could certainly put out a better rev that is cheaper, or subsidize $50-100 of it for a while until they can put out a cheaper version just to slow AMD down.

    They are not equal. See other site reviews besides an AMD portal site like anandtech ;)

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...
    There is no lic fee from NV according to PCper.
    "It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
    Which basically shows VENDORS must be marking things up quite a lot. But that is to be expected with ZERO competition until this week.

    "For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself."
    Not the case on the AMD side as he says. So again not so free if you don't own a card. NV people that own a card already are basically covered, just buy a monitor.

    The specs on this are misleading too, which anandtech just blows by:
    "The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."

    Again, read a site that doesn't lean so heavily to AMD. Don't forget to read about the GHOSTING on AMD. One more point, PCper's conclusion:
    "My time with today’s version of FreeSync definitely show it as a step in the right direction but I think it is far from perfect."
    "But there is room for improvement: ghosting concerns, improving the transition experience between VRR windows and non-VRR frame rates and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate."
    "FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet."

    Ok then...Maybe Freesync rev2 gets it right ;)
  • soccerballtux - Friday, March 20, 2015 - link

    you must be a headcase or, more likely, paid by NVidia to publicly shill. Gsync requires a proprietary NVidia chip installed in the monitor that comes from, and only from, NVidia.

    It's much easier to simply set a flag-byte in the DisplayPort data stream that says "ok render everything since the last render you rendered, now". There's nothing closed about that.
  • chizow - Friday, March 20, 2015 - link

    And? Who cares if it results in a better solution? LOL only a headcase or a paid AMD shill would say removing hardware for a cheaper solution that results in a worse solution is actually better.
  • soccerballtux - Friday, March 20, 2015 - link

    wellll, if it's cheaper and a better solution, then the market cares.
  • chizow - Friday, March 20, 2015 - link

    Except it's cheaper and worse, therefore it should be cheaper
  • bloodypulp - Friday, March 20, 2015 - link

    Oh darn... so what you're saying is that I have to purchase the card that costs less, then I have to purchase the monitor that costs less too? Sounds like a raw deal... ROFL!!

    And as far as your bogus openness argument goes: There is nothing preventing Nvidia from supporting Adaptive Sync. NOTHING. Well, unless you count hubris and greed. In fact, Mobile G-Sync already doesn't even require the module! I guess that expensive module really wasn't necessary after all...

    And lastly, Nvidia has no x86 APU to offer, so they can't offer what AMD can with their Freesync-supporting APUs. Nvidia simply has nothing to compete with there. Even gamers on a tight budget can enjoy Freesync! The same simply cannot be said for GSync.
  • Denithor - Friday, March 20, 2015 - link

    ONLY closed because NVIDIA refuses to support freesync. Much like OpenCL. And PhysX.
