So if your screen runs at 60Hz but you’re playing a graphics-intensive game that your graphics card can only render at around 45 frames per second, FreeSync will adjust the monitor’s refresh rate down to 45Hz. With the two rates matched, the monitor never refreshes partway through receiving a new frame, which is what causes visible screen tearing. Newer, premium tiers of FreeSync also adjust upward through Low Framerate Compensation (LFC), repeating frames when the framerate drops below the monitor’s minimum supported refresh rate. It’s worth noting that in any scenario, FreeSync has almost no performance cost, since most of the related processing happens on the monitor rather than on the device generating the graphics.
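To make the “adjust downward” and “adjust upward” behavior concrete, here is a small illustrative sketch of the decision a variable-refresh monitor effectively makes for each frame. The function name and the 30–60Hz range are assumptions chosen for the example, not values from AMD’s specification.

```python
# Illustrative sketch only: a toy model of how a FreeSync-style monitor could
# pick its refresh rate. The 30-60Hz window below is a hypothetical example
# range, not a figure from AMD documentation.

VRR_MIN_HZ = 30   # hypothetical lowest refresh rate the panel supports
VRR_MAX_HZ = 60   # hypothetical highest refresh rate the panel supports

def pick_refresh_rate(gpu_fps: float) -> float:
    """Match the panel's refresh rate to the GPU's frame rate.

    If the GPU runs slower than the panel's minimum, Low Framerate
    Compensation (LFC) repeats each frame so the effective refresh rate
    lands back inside the supported window.
    """
    if gpu_fps > VRR_MAX_HZ:
        # GPU is faster than the panel can refresh: cap at the maximum.
        return VRR_MAX_HZ
    if gpu_fps >= VRR_MIN_HZ:
        # Inside the variable-refresh window: refresh once per new frame,
        # so the panel never shows parts of two different frames (tearing).
        return gpu_fps
    # Below the window: show each frame two or more times until the
    # repeat rate is back inside the window (this is the "adjust upward" case).
    multiplier = 2
    while gpu_fps * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return gpu_fps * multiplier

print(pick_refresh_rate(45))  # -> 45: panel simply follows the GPU
print(pick_refresh_rate(20))  # -> 40: each frame is shown twice (LFC)
```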
Back on track, AMD updated FreeSync in early 2017 with an additional tier known as FreeSync 2. The biggest difference was FreeSync 2’s support for HDR and wide color gamut, which made sense at a time when 4K and HDR were at long last going mainstream.
To sum this part up, the basic, original 2014/2015 FreeSync focused almost entirely on adaptive sync to prevent screen tearing. It also tended to reduce input lag slightly by shifting work from the PC or console onto the monitor, though that difference is hard to notice. FreeSync 2 handled screen tearing the same way but added HDR support, meaning it could adapt refresh rates for HDR content as well as regular (SDR) feeds. Note that we use the past tense in this paragraph: FreeSync 2 is now a legacy term and no longer applies.