About two years ago, NVIDIA started talking about a shader rendering method that applies varying levels of quality to different objects in the game world. That method is called Variable Rate Shading, or VRS. When applied skillfully to areas of a scene that are less prominent in the gamer's field of view, the resulting rendered image wouldn't look all that different from a fully shaded scene, while the savings in graphics workload would improve performance significantly.
Fast forward a year or so, and benchmarks became available for measuring the performance differences with and without VRS enabled. Indeed, there was performance to be had without making the resulting scene look like crayon scribbles. Variable Rate Shading affords developers much finer-grained control of shading rates in any given scene. Each pixel region can now have a different level of shading detail, and therefore a different shading rate, allowing shading precision to increase in some areas while conserving shader processing resources in others. The feature made its way into DirectX 12, and GPUs from all three major graphics vendors (AMD, NVIDIA, and Intel, for those playing along at home) now support VRS somewhere in their respective line-ups. We should point out, however, that the integrated Vega GPU in the current-gen Ryzen 4000 CPU we tested does not support it.
Blizzard Adopts VRS For WoW, A Boon For Integrated Graphics
At the end of October when AMD announced the Radeon RX 6000 series, Blizzard was on hand to talk about the improvements to World of Warcraft's rendering engine. The headlining feature at the time was DirectX Raytracing (DXR), but VRS got a mention as well, and was then largely forgotten. After all, VRS is really only useful when the GPU is rendering a game with settings too ambitious for its shading horsepower. That's rarely a concern for Radeon RX 6000 and GeForce RTX cards, but it's a big deal for integrated graphics.
And then Intel entered the fray. Its Iris Xe graphics processor is an integral engine on board most of the Tiger Lake family of CPUs, and it supports VRS as well. So does the Gen 11 GPU tech found in the Ice Lake 10th-generation Core processors, for that matter. Here's a scenario where variable rate shading could really pay off in a game that still has millions of active accounts. Since the latest WoW 9.0.2 patch enables VRS, we thought this was a great opportunity to see how the game runs on the latest integrated graphics from Intel and AMD. We wanted to test performance, but also to examine image quality. We're mostly looking for what we don't notice here: if we don't see artifacts as VRS turns on and off, then it would seem Blizzard did its job.
Blizzard uses VRS in a couple of ways within World of Warcraft: Shadowlands. First, it's applied to world geometry, and by default it operates on tiles of pixels up to 4x4 each. That means shading calculations for the terrain and the world around us, including houses, buildings, mountains, and water, could be just 1/16 of what they would normally be. That's just one part of the rendering pipeline operating on one part of the image, so we don't want to get our hopes up too much, but it should gain back some performance, especially on integrated graphics. VRS can also be applied to particle effects. We left the game settings alone and let the engine decide whether the full scene needed to be shaded.
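To put that 1/16 figure in perspective, here's a rough back-of-the-envelope sketch (our own illustration, not Blizzard's code) of how coarse 4x4 shading cuts pixel-shader invocations at 1080p. The 60% terrain coverage is a hypothetical number chosen purely for the example:

```python
# Rough illustration (not Blizzard's code): estimate pixel-shader
# invocations saved when part of a 1080p frame is shaded at a
# coarse 4x4 rate, i.e. one shader invocation per 4x4 pixel tile.

WIDTH, HEIGHT = 1920, 1080
TOTAL_PIXELS = WIDTH * HEIGHT  # 2,073,600 pixels per frame

def shader_invocations(coarse_fraction, rate=(4, 4)):
    """Invocations for a frame where `coarse_fraction` of the pixels
    use one invocation per rate[0] x rate[1] tile and the rest are
    shaded at the full 1x1 rate."""
    coarse_pixels = TOTAL_PIXELS * coarse_fraction
    full_pixels = TOTAL_PIXELS - coarse_pixels
    return full_pixels + coarse_pixels / (rate[0] * rate[1])

# Hypothetical scene: 60% of the frame is world geometry shaded at 4x4.
baseline = shader_invocations(0.0)
with_vrs = shader_invocations(0.6)
savings = 1 - with_vrs / baseline
print(f"{savings:.1%} fewer invocations")  # 56.2% fewer
```

Even with only part of the frame eligible for coarse shading, the per-pixel savings add up quickly, which is why the technique is so attractive on shader-limited integrated GPUs.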
With the default settings, there's not really any hint of lower image quality.
We did force VRS on to get an idea of what our worst-case scenario might look like, though. Using the console to force VRS into an always-on state is not what Blizzard recommends, saying that the game should just manage it as it sees fit. With that caveat in mind, let's take a look at the most aggressive form of variable-rate shading in the game.
The game will never look like this, unless you force VRS on all the time.
This screenshot goes with the one at the top of this VRS section and shows, again, a worst-case scenario where VRS is applied to the entirety of special effects and world geometry. WoW will never look like this; it only serves to illustrate what could be affected. First of all, that fire is just gross, and the grass in the background is all blocky. It's like we stumbled out of WoW and slipped into Minecraft. When we played the game, we never saw anything like this. In fact, we couldn't tell whether VRS was being applied at all, which is the point: it's meant to be invisible. These screenshots do serve, however, to prove that Intel's hardware supports the technique, and that's all we needed to verify.
Our WoW With VRS Testing Methods
We tested WoW in a Human starting area, Exile's Reach. Since it's a starting area, it's not crowded or full of complex geometry like Ironforge, so performance would drop once we got into a big city. On the other hand, this allowed for repeatable tests; WoW is a massively multiplayer experience, after all, and staying out of crowds keeps runs consistent. We wandered around for a few minutes slaying Murlocs and capturing performance data with CapFrameX. That gives us a raw output of the time it took to draw each frame, and thanks to some spreadsheet wizardry we can compute averages and look at frame time variance.
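For readers who want to reproduce the spreadsheet wizardry themselves, here's a minimal sketch (our own helper with made-up sample numbers, not our actual capture data) of turning a list of per-frame render times in milliseconds into an average frame rate, a variance figure, and a count of slow frames:

```python
from statistics import mean, pstdev

def summarize(frame_times_ms, slow_threshold_ms=30.0):
    """Summarize a list of per-frame render times (milliseconds),
    as exported by a frame-time capture tool such as CapFrameX."""
    avg_ms = mean(frame_times_ms)
    return {
        # Average fps is derived from the mean frame time, not
        # from averaging instantaneous fps values.
        "avg_fps": 1000.0 / avg_ms,
        "frametime_stdev_ms": pstdev(frame_times_ms),
        "frames_over_threshold": sum(t > slow_threshold_ms
                                     for t in frame_times_ms),
    }

# Made-up sample: mostly ~12.5 ms frames (~80 fps) plus one slow frame.
sample = [12.5] * 99 + [35.0]
stats = summarize(sample)
print(stats["avg_fps"], stats["frames_over_threshold"])
```

Deriving fps from the mean frame time (rather than averaging per-frame fps values) is the standard approach, since it weights slow frames by the time they actually occupy on screen.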
We tested World of Warcraft with the settings slider set to 5 out of 10 (up from the default of 3), but we changed the spell density to "Half" rather than "Dynamic" so that we could control the effects. We tested with anti-aliasing disabled and then again with FXAA set to High. The game is pretty forgiving in these sparsely populated areas, and it has a separate set of toggles for more performance-sensitive areas. We think using the 5 slider setting most of the time, and dropping down when needed, will provide a good experience.
The laptop models we tested were the following:
The resolution was 1920x1080, and we turned off all of the performance targets; the maximum and desired frame rate settings were both disabled. When we played the game outside of testing, however, we left those targets on, since we're not entirely sure whether WoW needs a target frame rate to enable variable rate shading. For testing, though, being able to predict what will be rendered and how is important, so the targets stayed off.
First up let's look at frame rates...
We can see that Tiger Lake's Iris Xe GPU, found in the late 2020 edition of Dell's XPS 13 2-in-1 9310, not only provided the best experience, but also hit over 80 fps in the Exile's Reach starting area without FXAA, and landed just a couple of percentage points below that with FXAA set to High. The Ryzen 7 4800U also turned in very smooth frame rates with a tighter spread, but averaged around 12-15% lower than the Tiger Lake machine.
Meanwhile, both of those systems were more than 50% faster than Ice Lake's Iris Plus graphics solution found in the Core i7-1065G7, regardless of AA setting. Enabling FXAA cost the Iris Plus integrated GPU around 10% of its overall performance, compared to single-digit percentage drops for the other two. Tiger Lake's Iris Xe was particularly impressive, giving up almost nothing for some effectively free AA.
The Core i7-1165G7 and Ryzen 7 4800U turned in very similar results, though the Intel system had a few more peaks and valleys. Overall the Ryzen system, while a hair slower with FXAA enabled, delivered a slightly more consistent experience; just a single frame took longer than 30 milliseconds, our threshold for "badness." Because WoW is an online-only game with no canned benchmark, there's no way to account for run-to-run variance other than capturing plenty of data. In any case, the frame rate inconsistencies we observed would be effectively invisible to players, since none of the systems we tested had a high refresh rate panel.
WoW With VRS Experience Take-Aways
Variable Rate Shading is still a new technology for most developers. The fact of the matter is that game studios are still tinkering with it, trying to figure out the best way to apply the technique to areas of a game world. The best application is one that has no dramatic effect on visual quality, but still gains back some of the performance otherwise spent fully shading areas players won't even look at. Blizzard approached this in two ways: by only enabling VRS when performance suffered, and by limiting the areas it affects.
The result in our experience was pretty solid across a range of integrated hardware. The Ryzen's integrated Vega graphics did a nice job of drawing WoW and performed relatively well, even without VRS support. The Iris Plus integrated GPU in the Core i7-1065G7 offered fairly consistent performance, too. However, performance leapt into another world on the Core i7-1165G7 and its Iris Xe graphics. It would seem that Intel's newest Xe graphics architecture is pretty potent, at least in this configuration, and frame rates responded in kind.
It makes sense that a game with millions of subscribers would be playable on integrated graphics, but it surprised us just how good WoW could look. It seems Blizzard has done a great job optimizing the engine, and the game's cartoony art direction makes sure that nothing falls into the uncanny valley. It seems that no matter how big or small your system is, Blizzard has put together a game and an engine that can make Warcraft run great, especially on current-gen hardware.