FreeSync is an open standard for PC games, devised by AMD to provide smoother motion and eliminate image tearing without reducing frame rates or impacting gameplay. It does this by synchronising the monitor's refresh rate with the graphics adapter's output, so that frames are displayed on the screen as soon as they are ready.

It's similar to the v-sync setting that can normally be enabled when gaming to prevent tearing, but v-sync caps a game's frame rate at the monitor's refresh rate (typically 60 frames per second) and can impact smoothness. FreeSync aims to provide the benefits of v-sync without capping the number of frames per second or overall gaming performance: with FreeSync, the frame rate remains dynamic.
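The frame-pacing difference can be sketched as a toy model. Everything here is illustrative (the function names and the simple "wait for the next tick" rule are assumptions for the sketch, not AMD's or NVIDIA's implementation), but it shows why a frame that narrowly misses a fixed refresh tick is delayed under v-sync yet appears immediately under an adaptive refresh:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # ~16.7 ms per tick on a 60 Hz panel

def vsync_display_time(render_done_ms):
    """Under v-sync, a finished frame waits for the next fixed refresh tick."""
    ticks = math.ceil(render_done_ms / REFRESH_INTERVAL_MS)
    return ticks * REFRESH_INTERVAL_MS

def adaptive_display_time(render_done_ms):
    """Under adaptive sync, the panel refreshes when the frame is ready
    (the panel's minimum/maximum refresh range is ignored in this toy model)."""
    return render_done_ms

# A frame that takes 20 ms to render misses the ~16.7 ms tick, so v-sync
# holds it until the next tick, while adaptive sync shows it straight away.
print(round(vsync_display_time(20), 1))   # ≈ 33.3 ms
print(adaptive_display_time(20))          # 20 ms
```

In the toy model the v-synced frame is delayed by roughly 13 ms, which is the stutter and input lag FreeSync is designed to avoid.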


A few things are required to take advantage of FreeSync: an AMD graphics adapter that supports DisplayPort Adaptive-Sync (part of the DisplayPort 1.2a specification), the latest Catalyst driver (15.2 Beta or later), and a monitor that supports the FreeSync standard.

FreeSync is an open standard that makes use of the Adaptive-Sync feature in the DisplayPort specification, and any manufacturer can implement it without charge. So far, a handful of displays are listed as available from the likes of Acer, BenQ, LG, Samsung, and Viewsonic. AMD claims up to 20 monitors will have the technology by the end of 2015.

AMD maintains a current list of the graphics adapters and APUs that support FreeSync.

Is there a competing standard?

NVIDIA has its own standard called G-Sync, but this is a proprietary standard that relies on hardware being installed in a monitor. To make use of it, you need both an NVIDIA graphics card and a monitor that has the G-Sync circuit board installed.

Who is it for?


Technologies like AMD's FreeSync are primarily for gamers who require the highest frame rates, smoothest image quality, and minimal latency from their input peripherals.
