Playing games at a higher refresh rate can have a substantial impact on your gaming experience. This is especially relevant with fast-paced, competitive games where every frame counts. However, simply buying a 144Hz or 240Hz display is not enough to see the benefits.
A higher refresh rate means the display updates the onscreen image more often. The refresh rate of the display is measured in hertz (Hz), while the time between these updates is measured in milliseconds (ms).
The refresh rate of your display refers to how many times per second the display can draw a new image, measured in hertz (Hz). For example, if your display has a refresh rate of 144Hz, it is refreshing the image 144 times per second. When paired with the high frame rates produced by a GPU and CPU working together, this can result in a noticeably smoother, more responsive experience.
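The relationship between refresh rate and frame time is simple arithmetic: a display refreshing n times per second spends 1000/n milliseconds on each frame. A small illustrative snippet (the function name is ours, not from any monitor tooling):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time the display spends on each frame, in milliseconds."""
    return 1000.0 / refresh_hz

# A 60Hz panel redraws every ~16.67 ms; 144Hz cuts that to ~6.94 ms.
print(round(frame_time_ms(60), 2))   # 16.67
print(round(frame_time_ms(144), 2))  # 6.94
```

This is why the jump from 60Hz to 144Hz feels large while 144Hz to 240Hz feels smaller: each step shaves fewer milliseconds off the frame time.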
The graphics settings used will also impact how hardware-intensive the experience ends up being. Lowering the resolution to 1080p will result in a higher frame rate, as will turning off or lowering demanding graphics settings. As with attaining higher resolutions, the less taxing the gameplay experience is on the hardware, the easier it will be to push the frame rate high enough to see the benefits of a high refresh rate display.
Adaptive Sync

If your system is struggling to achieve your desired FPS, a display with adaptive sync might be useful. Many modern displays incorporate this technology. Adaptive sync enables a display to communicate directly with the GPU so that the refresh rate of the display is synchronized with each frame as it is produced, even if the FPS is inconsistent.
Choosing the Right Monitor

High refresh rate monitors are available at many different refresh rates, with 144Hz being a considerable improvement over standard 60Hz monitors and 240Hz being a popular high-end option. Check out our breakdown of gaming monitors by refresh rate and resolution to learn more.
Upgrading Your System for Smoother Gameplay

A high refresh rate display can have a substantial impact on your gaming experience, assuming your hardware is powerful enough to meet the higher requirements.
Variable refresh rate (VRR) is a feature some displays, including monitors and TVs, use to match their refresh rate to the frame rate of the incoming signal as it changes. It's especially useful for gamers, since game frame rates fluctuate on the fly, and it helps reduce screen tearing. VRR support used to be limited to higher-end TVs, but it's becoming the norm on mid-range models too.
Variable refresh rate support matters most for gamers. Both the source and the TV must support the same VRR format, or it won't work. The TV matches its refresh rate to the frame rate of the game, even when the game drops frames, which is why it's a variable refresh rate rather than a static one. Most PC games and graphics cards support VRR, and while Xbox is currently the only console family to have implemented it, the PlayStation 5 should get it in a future firmware update, so VRR support on TVs is important for most gamers.
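To make the matching behavior concrete, here's a rough sketch of how a VRR display could track the source frame rate. The range limits and the frame-repeating trick below the range (FreeSync calls this Low Framerate Compensation) are illustrative assumptions on our part, not behavior the article specifies for any particular TV:

```python
import math

def effective_refresh_hz(fps, vrr_min=48, vrr_max=120):
    """Refresh rate a hypothetical VRR display would run at
    for a given game frame rate (all values illustrative)."""
    if vrr_min <= fps <= vrr_max:
        return fps  # inside the range: refresh tracks the source exactly
    if fps < vrr_min:
        # Below the range, some displays repeat each frame enough
        # times to land back inside the supported window.
        return fps * math.ceil(vrr_min / fps)
    return vrr_max  # above the range: the display caps at its maximum

print(effective_refresh_hz(60))   # 60 - synced, no tearing
print(effective_refresh_hz(30))   # 60 - each frame shown twice
print(effective_refresh_hz(200))  # 120 - capped at the panel maximum
```

The key point is the middle branch of the first check: as long as the game's frame rate stays inside the supported window, the display simply follows it, which is what prevents tearing.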
We test the variable refresh rate support of TVs after checking the supported resolutions, so we know which resolutions to check. Most of the time, we'll know if a TV supports VRR solely based on the marketing, but we always check to see if it works and which formats it supports. We use a PC with an NVIDIA GeForce RTX 3070 graphics card, another with a Radeon RX 580 card, and also our Xbox Series X.
Before performing any VRR tests, we list the native refresh rate of the TV, which we find out by using the supported resolutions test. Currently, the native refresh rate of TVs is always either 60Hz or 120Hz.
First, we check whether the TV supports G-SYNC; if it does, we use the RTX 3070 PC and the NVIDIA Pendulum demo to find the VRR range. If it doesn't support G-SYNC, we use the Radeon PC instead. With the RTX 3070 PC, we make sure V-SYNC is disabled, open the test program, set the resolution we want, and set the frame rate to a value that shouldn't tear, like 55 fps. We then adjust the frame rate of the signal and the refresh rate of the display together, stepping them higher and checking each time whether the image tears. For most TVs, the VRR maximum matches the panel's maximum refresh rate.
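The sweep can be summarized in a short Python sketch. Here `shows_tearing` stands in for the human visual check, and the 5 fps step size is our own choice; this is a simplification of the procedure, not the actual test software:

```python
def find_vrr_maximum(start_fps, panel_max_hz, shows_tearing, step=5):
    """Step the signal's frame rate upward and return the highest
    rate at which no tearing was observed (None if it always tears)."""
    best = None
    fps = start_fps
    while fps <= panel_max_hz:
        if not shows_tearing(fps):  # in practice, a visual check
            best = fps
        fps += step
    return best

# Example: a panel whose VRR works all the way up to its 120Hz maximum.
print(find_vrr_maximum(55, 120, lambda fps: fps > 120))  # 120
```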
Variable refresh rate is a feature that allows the TV to display frames as they are sent, without requiring a constant fixed frame rate, and adjust the refresh rate on the fly. This results in reduced screen tearing. We test TVs for variable refresh rate support, including the maximum and minimum frequencies at which the TV can stay synchronized without screen tearing. If you want to avoid frame skipping or screen tearing while gaming, look for a TV that supports a variable refresh rate standard.
Now, let's talk about how we measure input lag. It's a rather simple test because everything is done by our dedicated photodiode tool and special software. We use this same tool for our response time tests, though it measures something different there. For input lag, we place the photodiode at the center of the screen so that it records data in the middle of the refresh cycle, which keeps the results from being skewed toward the beginning or end of the cycle. We connect our test PC to the tool and the TV. The tool flashes a white square on the screen and records the time from when the signal is sent until the screen starts to draw the white square; this is the input lag measurement. It stops the measurement the moment the pixels start to change color, so response time isn't included in the result. It records multiple data points, and our software reports the average of all the measurements, discarding any outliers.
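The article doesn't say which outlier filter the software applies, so as one simple illustration, here is a trimmed mean that drops the single highest and lowest readings before averaging (our own stand-in, not the real implementation):

```python
import statistics

def trimmed_mean(samples):
    """Average the readings after discarding the single lowest and
    highest values - a simple stand-in for outlier rejection."""
    if len(samples) < 3:
        raise ValueError("need at least three samples to trim")
    ordered = sorted(samples)
    return statistics.mean(ordered[1:-1])

# One wild 25 ms reading among consistent ~5 ms measurements
# no longer drags the average up.
print(round(trimmed_mean([5.1, 5.2, 5.0, 5.3, 25.0]), 2))  # 5.2
```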
When a TV displays a new image, it draws it progressively from top to bottom, so the image first appears at the top. With the photodiode placed in the middle of the screen, it records the image halfway through the refresh cycle. A 120Hz TV displays 120 images every second, so each image takes 8.33 ms to be drawn; since the tool is in the middle of the screen, we're measuring halfway through that cycle, which takes 4.17 ms. This is the minimum input lag we can measure on a 120Hz TV: if we measure 5.17 ms, the TV is really only adding one extra millisecond of lag before the image appears on the screen. For a 60Hz TV, the minimum is 8.33 ms.
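Those minimums fall straight out of the scanout arithmetic: a sensor a fraction of the way down the screen can't register a change until the top-to-bottom scanout reaches it. A quick worked version (function and parameter names are ours):

```python
def min_measurable_lag_ms(refresh_hz, sensor_position=0.5):
    """Earliest a photodiode at sensor_position (0 = top of screen,
    1 = bottom) can see a new frame, given top-to-bottom scanout."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * sensor_position

print(round(min_measurable_lag_ms(120), 2))  # 4.17 - the 120Hz floor
print(round(min_measurable_lag_ms(60), 2))   # 8.33 - the 60Hz floor
```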
This test measures the input lag of 1080p signals with a 60Hz refresh rate. This is especially important for older console games (like the PS4 or Xbox One) or PC gamers who play with a lower resolution at 60Hz. As with other tests, this is done in Game Mode, and unless otherwise stated, our tests are done in SDR.
The 4k @ 60Hz input lag is probably the most important result for most console gamers. Along with 1080p @ 60Hz input lag, it carries the most weight in the final scoring since most gamers are playing at this resolution. We expect this input lag to be lower than the 4k @ 60Hz with HDR, chroma 4:4:4, or motion interpolation results because it requires the least amount of image processing.
Like with 1080p @ 60Hz Outside Game Mode, we measure the input lag outside of Game Mode in 4k. Since most TVs have a native 4k resolution, this number matters more than the 1080p result for everyday use, like scrolling through menus.
We also measure the input lag with any variable refresh rate (VRR) support enabled, if the TV has it. VRR is a feature gamers use to match the TV's refresh rate with the frame rate of the game, even if the frame rate drops. Enabling VRR could potentially add lag, so that's why we measure it, but most TVs don't have any issue with this. We measure this test by setting the TV to its maximum refresh rate and enabling one of its VRR formats, like FreeSync or G-SYNC.
Input lag is the time it takes a TV to display an image on the screen from when it first receives the signal. Low input lag is important for gaming; high input lag may be noticeable while scrolling through Netflix or other apps, but it's not as important for that use. We test for input lag using a special tool, and we measure it at different resolutions, refresh rates, and with different settings enabled to see how changing the signal type can affect the input lag.
Gaming monitor options keep growing, with new brands, features, resolutions and display sizes hitting the market quickly. It's an exciting time to be a PC gamer, but that also means that selecting the best gaming monitor for your rig is growing more complicated. The array of specs to consider is dizzying, from adaptive-sync technologies (Nvidia G-Sync or AMD FreeSync) to refresh rates, panel types, screen curvatures and HDR support.
The S3222DGM delivers an enviable contrast ratio thanks to its 1800R curved VA panel. While IPS competitors often struggle to break much past 1,000:1, the S3222DGM's VA panel hit 4,209:1 in our tests. The display also reproduced 122 percent of the sRGB color gamut and 85 percent of the DCI-P3 gamut in our tests, along with an impressively low color error of 2.07dE.
The Dark Matter 42770 offers a 1ms GTG response time and tops out with a 144 Hz refresh rate. Another feather in its cap is that the monitor supports both AMD FreeSync and NVIDIA G-Sync Adaptive-Sync technologies.
The Dell G3223Q is a stellar entry in the 4K gaming monitor segment, offering a 32-inch panel size, low total input lag (measured at just 30ms) and an excellent balance between response and motion resolution. As you might expect for a 4K gaming monitor, we have a 144Hz refresh rate with support for both AMD FreeSync and Nvidia G-Sync Adaptive Sync technologies.
The HyperX Armada 27 is one of those standout monitors that offers the whole package. This is a 27-inch QHD display with a 2560 x 1440 resolution, a 165Hz refresh rate, HDR support, and the premium build quality we expect from HyperX.