Nvidia has made a concerted effort of late to expand the GeForce feature set, so that raw performance isn’t the only deciding factor when buying a new GPU. Features like DLSS and ray tracing have been heavily pushed, and today we’ll be checking out whether Reflex is something you should care about. Reflex is currently split into two similar, but separate features. One is just called Nvidia Reflex, and it’s the feature you’ll find added to games with the goal of improving latency. If you fire up a game’s settings and see an option to turn Nvidia Reflex on or off, this is what we’re talking about.

The second is Nvidia’s Reflex Latency Analyzer, which is a collection of hardware and software tools you can use to analyze game and total system latency. The goal is to provide latency information to gamers, so they can optimize their system for the best responsiveness. In this article we plan to cover both. First we’ll look at how well Reflex works across some of the supported games so far - Fortnite, Valorant, and Call of Duty: Modern Warfare - in a variety of conditions. Then we’ll take a look at some of Nvidia’s Reflex Latency Analyzer tools, including the GeForce Experience overlay and the new monitors with built-in latency tools. We have the Asus ROG Swift PG259QNR on hand, so we’ll take a brief look at what’s on offer.

Latency is a complicated matter, so we’re going to break it down into the simplest terms possible. Existing low latency modes are driver based, including Nvidia’s Ultra Low Latency mode (otherwise known as NULL), as well as the regular low latency mode. They work by adjusting the way the GPU buffers frames, usually reducing the number of frames in the buffer, and by modifying the render queue. However, as they are driver based, features like NULL only work with older DirectX 11 titles. DirectX 12 games, which are becoming more popular, have full control over most aspects of queuing and buffering, so driver modes are not compatible.

Reflex is the next step for low latency modes. This is a feature built into the game itself with the goal of further reducing latency, beyond just modifying queues and buffers. Nvidia is being a bit cagey about exactly how it works, but it can be summarized like this: through Reflex, your Nvidia GPU tells the game engine what it is doing, and the game engine responds by looking at this info and doing its work just before the GPU is ready to render. This means the game engine is doing just-in-time processing, which allows it to grab the freshest inputs from your system and deliver them to your display with the least latency.

It’s only possible to get this latency reduction when there is good communication between the GPU, CPU and game engine, which is why something like Reflex is required. You can think of this a bit like a kitchen at a restaurant: you don’t want your steak cooked well before the chips are ready or it’ll get cold; good communication in the kitchen allows both to be ready at the same time for the freshest, hottest and tastiest meal. That’s what Reflex is doing, but with your game.
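As a rough illustration of the idea, here’s a back-of-the-envelope sketch of why just-in-time processing helps. The numbers and the simple queue model are our own simplification for illustration, not Nvidia’s actual implementation:

```python
# Conceptual sketch (not Nvidia's actual implementation): compare input-to-display
# latency when the CPU simulates a frame as early as possible versus "just in time",
# right before the GPU is ready to accept it. All timings are illustrative.

GPU_FRAME_TIME_MS = 22.0   # GPU bound: ~45 FPS, as in the 4K Fortnite test
CPU_SIM_TIME_MS = 4.0      # time the CPU needs to sample input and simulate a frame
RENDER_QUEUE_DEPTH = 2     # frames buffered ahead of the GPU without Reflex

def latency_without_reflex():
    # Input is sampled when the frame enters the queue, then the frame waits
    # behind already-queued frames before the GPU even starts rendering it.
    queue_wait = RENDER_QUEUE_DEPTH * GPU_FRAME_TIME_MS
    return CPU_SIM_TIME_MS + queue_wait + GPU_FRAME_TIME_MS

def latency_with_reflex():
    # The game engine delays its work so the frame is ready just as the GPU
    # finishes the previous one: no queue wait, input sampled as late as possible.
    return CPU_SIM_TIME_MS + GPU_FRAME_TIME_MS

print(latency_without_reflex())  # 70.0 ms
print(latency_with_reflex())     # 26.0 ms
```

Note how the model also predicts the behavior we measure later: when the game is CPU bound, there is no render queue backlog to eliminate, so there is little for Reflex to do.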

Given all this, you can see why Reflex requires game integration, and why it’s exclusive to Nvidia’s GeForce 900 series GPUs and newer. It won’t work as a driver-only solution because it needs deep integration with the game engine, and it also requires knowledge of how the GPU is operating.

The Nvidia GPU driver provides that information, and apparently it goes beyond the regular telemetry that performance overlays can access. So this isn’t an open solution: it doesn’t work with AMD GPUs, and it requires game-level integration. But it doesn’t require a new RTX 30 series GPU either, so older GeForce cards from the past few generations can also access Reflex.

Time to look at how much of a benefit we actually get from Reflex. Some of today’s testing at 1080p will be CPU limited, so we’re using a Core i9-10900K test rig for all benchmarking. The system has been equipped with 16GB of DDR4-3200 memory and the MSI Z490 Unify motherboard.

For Reflex testing we’re using Nvidia’s LDAT, or Latency Display Analysis Tool, which is essentially a photodetector you can place on the screen to measure game latency. This is a better version of the tools we were already using for latency measurements, which were also based on a photodetector, and we’ve confirmed it to be very accurate. Later we’ll be exploring the Reflex Latency Analyzer, but for now, we’re measuring total system latency from mouse click to action on screen, using LDAT.
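To make the measurement concrete, here’s a simplified sketch of the kind of calculation a photodetector-based tool performs: timestamp the click, then find when the sensor sees the on-screen flash. The threshold and sample data are hypothetical, and LDAT’s real signal processing is more sophisticated:

```python
# Illustrative sketch of click-to-photon latency measurement: the mouse click is
# timestamped, then we look for the first photodetector sample where brightness
# crosses a threshold (e.g. a muzzle flash appearing on screen).

def click_to_photon_ms(click_time_ms, samples, threshold=0.5):
    """samples: list of (timestamp_ms, brightness 0..1) from the photodetector."""
    for t, brightness in samples:
        if t >= click_time_ms and brightness >= threshold:
            return t - click_time_ms
    return None  # no flash detected within the capture window

# Made-up capture: screen stays dark until a flash appears at the 48ms mark.
samples = [(0, 0.10), (10, 0.10), (20, 0.12), (48, 0.90), (60, 0.95)]
print(click_to_photon_ms(click_time_ms=5, samples=samples))  # 43
```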

Benchmarks

We wanted to start off by looking at how Reflex improves latency when gaming with maximum in-game quality settings. This is where Nvidia says we should see the most benefit; the more GPU bound you are, the more likely Reflex will deliver a latency improvement. First up we are testing in Fortnite’s Creative mode, using DirectX 12, the Epic preset and maximum ray tracing. We’re also using the DLSS Performance mode and a GeForce RTX 3090 FE GPU, with the display being a 4K 144Hz panel with adaptive sync enabled, the LG 27GN950 in this instance.

At 4K we saw the most benefit from using Reflex, and in fact the gains are quite impressive. Total system latency was around 104ms with Reflex disabled, but when turning on the feature, we saw latency halved to under 50ms. There wasn’t much difference between Reflex On and Reflex On + Boost - which keeps the GPU clocked high to further improve latency - but with either Reflex mode we saw a significant latency improvement. In contrast, Nvidia’s Ultra Low Latency Mode doesn’t improve the experience, as NULL doesn’t work with DirectX 12 titles.

At 1440p, gains were still present, but not as significant as at 4K. While at 4K we were gaming at around 45 FPS, reducing the resolution to 1440p saw us jump up to 85 FPS. We are no longer seeing latency halved with Reflex enabled, but a 12ms reduction is still decent and something we could notice while gaming. This is all with G-Sync enabled and Vsync disabled, so we’re getting the optimal Vsync-related latency.

There are also gains to be had at 1080p, with about a 10ms improvement in our testing. Given the RTX 3090 is quite powerful, still being able to achieve better latency at 1080p with Reflex enabled is decent. But the largest gains, and the most notable improvement in responsiveness, were observed when we were more GPU bound at below 60 FPS.

However, you aren’t always going to be GPU bound. The RTX 3090 makes easy work of Valorant, for example, running at over 600 FPS in our test area at all resolutions with the Core i9-10900K. It’s here that, even when playing on the maximum quality settings, Reflex delivers next to no improvement. With latency around the 16ms mark in all situations, which is very responsive, there isn’t much else Reflex can do in terms of optimization.

We also saw less of a gain when playing Call of Duty: Modern Warfare’s Warzone mode. Even when playing at 4K, using the highest settings with ray tracing enabled, Reflex only provided about a 3ms reduction to input latency at all of the resolutions we tested. There was a consistent gain, but with this sort of GPU power at hand, the improvement wasn’t in the same league as with Fortnite.

While you clearly need to be GPU bound to see large improvements with Reflex, and that’s harder to achieve with an RTX 3090, if you have an entry-level GPU like the GeForce GTX 1650 Super, you are much more likely to get a latency improvement using the Reflex mode…

In Fortnite, with the Epic preset but without ray tracing, as the 1650 Super doesn’t support it, Reflex provided a similar benefit to what we saw with the 3090 at 4K, but this time at 1440p and 1080p. At both of these resolutions, the game is pretty GPU bound and runs below 100 FPS on our test island. Here we see Reflex halving input latency, which makes quite a significant difference to gameplay on these systems where you still want all the visual effects cranked up.

We also saw a latency improvement in Valorant using this class of GPU. We’re no longer seeing over 600 FPS; in fact, at 4K the 1650 Super is more like an 80 FPS GPU using the highest quality settings. In this situation, we saw a healthy 19ms latency improvement, which is a significant jump in this sort of title. Gains were more modest at 1440p, with a 9ms drop in latency, and at 1080p, with just a 6ms drop. Still, in these conditions the game doesn’t appear to be fully CPU bound, so we don’t see a situation like with the 3090 where Reflex is useless.

Call of Duty: Warzone consistently delivered the least impressive latency improvements. Even with a GTX 1650 Super, which runs below 60 FPS at 1440p with high quality settings, we only saw about a 10-12ms improvement to system latency when enabling the feature, at both 1440p and 1080p. For us this was pretty hard to notice, especially with a frame rate around 50 FPS, which isn’t the most responsive, but for more highly tuned competitive gamers this might be a significant difference.

In general though, our testing shows that when you are playing on high to ultra quality settings, and you’re mostly GPU bound, Reflex will provide a latency improvement, especially in titles like Fortnite. The gains are going to be more pronounced on lower-end cards as your system in general will be more GPU limited, and that seems to be key for Reflex: the more your CPU is sitting idle, the more potential there is for a latency improvement. On systems with a weaker CPU, like say a Ryzen 5 1600, we’d expect the reverse effect, with less of a gain on offer.

With this in mind we wanted to test Reflex with competitive esports settings. Most of the time, if you’re a serious competitive gamer, you’ll be playing using mostly low settings. This allows you to achieve a higher frame rate, which inherently lowers total latency, but it can also often make spotting enemies easier without the distractions of shadows and other effects. So if you’ve already done a lot to optimize your system for latency, you’re getting very high frame rates and are playing at 1080p, what can Reflex do for you?

The answer to that is… practically nothing. We’re not even using a high-end GPU for this testing. This is a system with a Core i9-10900K to give us the least amount of CPU bottlenecking and an RTX 2060 GPU. We’ve hooked it up to the Asus PG259QNR as well, to get that sweet 360Hz goodness.

In Fortnite, using the lowest settings, no ray tracing, the DLSS Performance mode and epic draw distance, we were able to achieve around 400 FPS in the test area. At this sort of super high frame rate, latency was already very low without Reflex, at just 14ms of total system latency. Reflex did not improve this result, as we are fully CPU bound.

Same story in Valorant: we were achieving between 600 and 1000 FPS using the game’s lowest settings, and latency was consistently around the 13ms mark whether Reflex was enabled or disabled. The “Boost” mode hasn’t done much so far, even though it’s supposed to help a bit with latency in more CPU bound situations.

And then we also have Warzone, where there was a small latency improvement of around 4ms when using Reflex with the lowest settings. However, in this title we did appear to be GPU bound rather than CPU bound even at those settings, so we’d expect results to fall more in line with the other titles if we tested the lowest possible settings with a higher-end GPU like an RTX 3090.

What’s also important to note is that the gains you see from Reflex are independent of your monitor’s refresh rate when gaming with Vsync disabled, as you should be for the lowest latency possible. For example, in Fortnite on a 1080p display with the RTX 2060 using Epic settings, Reflex consistently gave a 9ms improvement across the board at the same frame rate, regardless of whether the monitor was set to 360Hz or 60Hz. The main difference is that you’ll see additional latency added at lower refresh rates, so it’s always better to be gaming at the highest possible refresh rate.

A lot of these latency results fall in line with what we’ve seen previously from latency-reduction modes. When GPU bound, especially when heavily GPU bound, there is often scope to reduce system latency. The extent of the reduction will depend on the game and how it works, but when CPU bound, your system is basically flat out and you won’t see much, if any, latency improvement. The closer you are to being CPU bound, which generally correlates with higher frame rates, the less useful Reflex is.
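On that refresh rate point, the extra latency a slower panel adds is easy to estimate. This is a deliberate back-of-the-envelope simplification (it ignores scanout position and display processing, and assumes a finished frame waits on average half a refresh interval before reaching the panel):

```python
# Rough sketch of the extra latency a slower refresh rate adds on top of the
# (refresh-rate-independent) Reflex gain. Simplifying assumption: with Vsync off,
# a finished frame waits on average half a refresh interval to hit the screen.

def avg_refresh_wait_ms(refresh_hz):
    refresh_interval_ms = 1000.0 / refresh_hz
    return refresh_interval_ms / 2

for hz in (60, 144, 360):
    print(f"{hz}Hz adds roughly {avg_refresh_wait_ms(hz):.1f}ms on average")
```

Under this model a 60Hz panel adds roughly 7ms more average waiting than a 360Hz one, which is why the same Reflex gain feels different at different refresh rates even though the gain itself doesn’t change.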

Reflex Latency Analyzer

Before giving our final thoughts on Nvidia’s Reflex, it’s worth going through the tools Nvidia is providing through the Reflex Latency Analyzer. While the LDAT tool we used for our Reflex analysis isn’t available to end users, every aspect of the Reflex Latency Analyzer will be, though some elements have a cost.

Some of the Reflex tools can be used without any specific latency hardware. In some Reflex-enabled games like Fortnite and Valorant, you can enable a latency overlay in the game which shows various metrics. In both cases we see what’s known as “game to render” latency – in other words, the time it takes from when the game receives an input to when the frame is output to the display. This is not total system latency - which also includes the latencies from your display and input peripherals - but for a lot of people this game to render latency will be a good enough metric for optimizing latency.

You can also use GeForce Experience’s new latency performance overlay, however this only shows GPU-side latencies such as render latency, not the full game to render latency, so its usefulness without additional Reflex Latency Analyzer hardware is limited. In games like Fortnite though, you can use the built in tool, mess around with your settings, and see how that impacts the total latency number presented.

For hardcore enthusiasts, Nvidia is offering a hardware ecosystem in conjunction with various partners that allows for further, more in-depth latency analysis. This comes in the form of two components: the Reflex Latency Analyzer built into some displays, and Reflex Latency Analyzer Compatible mice. We could spend a ton of time detailing how all of this works, but honestly it wouldn’t be that interesting, so if you want the nitty gritty details you can read Nvidia’s documentation.

It basically boils down to this: when you have both a Reflex Latency Analyzer equipped monitor like the Asus ROG Swift PG259QNR and a Reflex Latency Analyzer Compatible mouse, like the Asus ROG Chakram Core, you can measure total system latency. This includes not just the game to render latency (like Fortnite reports), but mouse latency and display latency as well. This gives us the most valuable and accurate latency information possible.
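Putting the three reporting tiers together, the breakdown looks something like this. All of the component values below are hypothetical, purely to illustrate how the pieces add up:

```python
# Illustrative breakdown of the latency components the Reflex tools expose.
# A Reflex game overlay can report game-to-render latency; adding a compatible
# mouse and an analyzer-equipped monitor extends that to total system latency.
# All numbers here are made up for illustration.

peripheral_ms = 3.0   # mouse click to USB event (needs a compatible mouse)
game_ms = 8.0         # input received by the game engine to render submit
render_ms = 6.0       # render submit to frame scanned out to the display
display_ms = 4.0      # display processing and pixel response (needs analyzer monitor)

game_to_render = game_ms + render_ms                         # what Reflex games report
total_system = peripheral_ms + game_to_render + display_ms   # full click-to-photon
print(game_to_render, total_system)  # 14.0 21.0
```

The gap between the two numbers is exactly the peripheral and display latency that the mouse and monitor hardware exists to capture.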

It takes a bit of time to set up and it’s really only suitable for benchmark conditions - it won’t give you good readouts in general gameplay as you have to position the capture area for the latency analyzer over a muzzle flash, for example, and that could change in dynamic gameplay. But with this hardware in conjunction with the GeForce Experience Latency overlay, you can benchmark total system latency and optimize your configuration around that metric.

Generally speaking we found the Reflex Latency Analyzer to be within 1ms of LDAT, so it’s a pretty accurate tool, although it pulls data from the G-Sync module’s frame buffer as opposed to getting a reading directly from the display’s pixels. In that sense LDAT is the more complete tool. But for most people, the Reflex Latency Analyzer will be accurate enough and certainly much better than any other tool currently available that’s this easy to use.

The Asus PG259QNR is simply a Reflex Latency Analyzer version of the PG259QN we reviewed recently, an excellent 1080p 360Hz display. We believe other monitors supporting the tool are coming from Acer and Alienware.

What We Learned

Nvidia’s Reflex ecosystem is a tale of two separate, but related entities. On the one hand you have the Reflex mode integrated into games, and on the other you have the Reflex Latency Analyzer. You could argue these are designed for two different groups of people, with little crossover. The Reflex mode found in supported games is most useful in GPU limited situations. In these competitive titles, most of the time you’ll be GPU limited when you’re either playing on ultra quality settings, using a lower-end GPU, or some combination of both. The more CPU limited your system becomes, and the higher the frame rate pushes, the less useful Reflex is for enhancing system latency.

This makes Reflex a useful feature primarily for casual competitive gamers: the people who want to play Fortnite but don’t want to sacrifice visual quality, don’t want to play at 1080p, or just don’t have the most powerful hardware. Turn on Reflex, your system latency will drop, and that improvement in responsiveness might make you a bit better at the game. On the flip side, Reflex is next to useless for serious competitive gamers. If you’re the sort of person who already has a latency-optimized setup - playing at 1080p on powerful hardware with low settings to ensure the highest possible frame rate - Reflex will have no benefit whatsoever. That’s because you’re almost certainly CPU limited, and Reflex is ineffective in those situations. This held true even with a mid-range GPU like the RTX 2060 in titles like Fortnite and Valorant when gaming at low quality settings.

There’s no downside to having Reflex enabled in those situations, but don’t expect even lower latencies when you are already in a fully optimized environment. This is really for the casual gamers who don’t want to sacrifice ray tracing or 4K resolutions while gaming, but still want a nice, responsive experience. And that’s fine. The Reflex Latency Analyzer, on the other hand, is for serious competitive gamers who want to do everything possible to reduce system latency. Having the tools available to do that is neat and potentially useful for tweaking hardware setups and game configurations, but realistically this isn’t something a casual gamer will be doing.

We can see the Reflex feature in games being widely used, but the Reflex Latency Analyzer and compatible mice seem like a niche feature that probably won’t get much traction outside a very small user base. With that in mind, we have question marks over how long Nvidia will support something like this, which in turn raises doubts over whether it’s worth investing in Latency Analyzer hardware. It’s certainly neat and works well, but without a broad user base, we can see it getting the chop after just a few years.

The other question we’re sure some of you will be asking is: is it worth buying an Nvidia GPU over the competition specifically for Reflex? After all, this is being pushed heavily as the next best thing, similar to how Nvidia positioned DLSS and ray tracing with Turing GPUs. We guess that’s part of the good news: if you’re already in the Nvidia ecosystem, Reflex works with GTX 900 series GPUs and newer, so don’t feel you have to upgrade to Ampere to get it.

At this point, we don’t think it’s worth factoring Reflex into your buying decision. Clearly, this isn’t a DLSS 1.0 situation where the feature is kind of useless. Reflex works properly and can give you a latency improvement, but it’s restricted to a narrow set of situations right now. You have to be playing one of the few supported titles, and be in a GPU limited situation, to see the benefit. Also, not all games benefit in the same way, with some titles giving large gains and others less noticeable ones. If the only thing you play is Fortnite, or Valorant, or Apex Legends, then by all means, buying a GeForce GPU might be the best way to go. But if you’re one of the many gamers who plays a variety of titles, or other competitive games for that matter, then like DLSS or ray tracing we think it’s better to view this as a neat bonus feature for now. If the ecosystem continues to grow and Reflex becomes a key feature in most competitive titles, then maybe that will change, but for now the ecosystem is too small for this to be a must-have feature.