
One question that's been popping up with increasing frequency when we talk about high-end graphics cards is whether 4GB of RAM is enough to power current and next-generation gaming. When we initially covered AMD's Fury X launch, I promised to return to this topic and cover it in more detail. Before we can hit the data, however, we need to talk about how VRAM management works and what tools are available to evaluate it in DirectX 11.


While it might seem straightforward to test whether or not any given title uses more than 4GB of RAM, the tools for doing this are rather inexact. The GPU itself does not control which data is loaded into memory. Instead, memory management is handled by the operating system and the GPU driver. The GPU tells the OS how much memory it has, but it doesn't make any decisions about how data is loaded or which data is loaded first.
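To make that division of labor concrete, here's a minimal sketch (not from the article) of how an application reads the driver-reported VRAM pool through DXGI, the interface DirectX 11 titles sit on top of. The enumeration loop and output are illustrative; the point is that DedicatedVideoMemory is a reported capacity, not a record of which data is resident.

```cpp
// Minimal sketch: query the VRAM pool each adapter reports to the OS via DXGI.
// Build as a Windows console app; link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory is the driver-reported pool size. It tells the
        // application nothing about how, or in what order, data gets loaded.
        wprintf(L"%s: %llu MB dedicated VRAM\n", desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```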

One way that game developers handle memory management in software is by creating game presets that assume a particular amount of VRAM is present on the card. Low detail might be tuned to run on 512MB cards, while ultra detail assumes you have at least 4GB of VRAM. If you choose a detail level that calls for more VRAM than is present on your card, you'll likely see a heavy performance hit as the system is forced to load data from main memory.
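As a hypothetical illustration of that preset logic (the tier names and VRAM thresholds below are invented, not taken from any shipping game), a developer might map the reported pool to the most demanding preset expected to fit:

```cpp
// Hypothetical preset table: names and thresholds are for illustration only.
#include <cstdint>

struct DetailPreset {
    const char* name;
    uint64_t assumedVramBytes;  // VRAM budget the preset is tuned for
};

constexpr uint64_t MB = 1024ull * 1024;
const DetailPreset kPresets[] = {
    { "Low",    512  * MB },  // tuned to run on 512MB cards
    { "Medium", 1024 * MB },
    { "High",   2048 * MB },
    { "Ultra",  4096 * MB },  // assumes at least 4GB of VRAM
};

// Pick the most demanding preset that still fits in the reported pool.
// Forcing a higher tier than this makes the driver page assets through
// main memory, which is where the heavy performance hit comes from.
const DetailPreset& ChooseDefaultPreset(uint64_t dedicatedVramBytes) {
    const DetailPreset* best = &kPresets[0];
    for (const DetailPreset& p : kPresets)
        if (p.assumedVramBytes <= dedicatedVramBytes)
            best = &p;
    return *best;
}
```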

[Image: Memory usage in Shadow of Mordor]

[Chart: Shadow of Mordor frame rate scaling by resolution between all three cards]

Some games won't use much VRAM, no matter how much you offer them, while others are more opportunistic. This is critically important for our purposes, because there's no automatic link between the amount of VRAM a game is using and the amount of VRAM it actually requires to run. Our first article on the Fury X showed how Shadow of Mordor actually used dramatically more VRAM on the GTX Titan X as compared with the GTX 980 Ti, without offering a higher frame rate. Until we hit 8K, there was no performance advantage to the huge memory buffer in the GTX Titan X — and the game ran so slowly at that resolution, it was impossible to play on any card.

GPU-Z: An imperfect tool

GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric. GPU-Z doesn't really report how much VRAM the GPU is using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."


There are other tools, like Process Explorer, which can also capture GPU memory requests — but they don't confirm actual memory usage, either.

Our own testing backed up this claim; VRAM monitoring is subject to a number of constraints. Resolution switching or visiting more than one area before beginning testing can significantly increase total memory "in use" without actually impacting performance at all. There's also a moderate amount of variance between test runs. We can say that the GPU requested around 4.5GB of RAM, for example, but one test run might show a GPU topping out at 4.3GB while the next showed a maximum RAM consumption of 4.5GB. Reported VRAM consumption can also vary during the game; logging and playthroughs must be carefully managed.
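For readers who want to experiment on newer systems: DXGI 1.4 on Windows 10 (not the Windows 8.1 testbed used here) exposes QueryVideoMemoryInfo, a per-process counter with the same fundamental caveat: it reports memory the OS video memory manager has committed for the calling process, not what the GPU actively touches. A minimal polling sketch, assuming it runs inside the application being measured:

```cpp
// Sketch: poll committed video memory once per second and track the peak.
// Requires DXGI 1.4 (Windows 10); must run inside the process being measured,
// since QueryVideoMemoryInfo reports per-process figures.
#include <dxgi1_4.h>
#include <windows.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter1 = nullptr;
    if (factory->EnumAdapters1(0, &adapter1) != S_OK)  // adapter 0: primary GPU
        return 1;

    IDXGIAdapter3* adapter = nullptr;
    if (FAILED(adapter1->QueryInterface(__uuidof(IDXGIAdapter3), (void**)&adapter)))
        return 1;

    UINT64 peakBytes = 0;
    for (int i = 0; i < 60; ++i) {  // one sample per second for a minute
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
        if (info.CurrentUsage > peakBytes) peakBytes = info.CurrentUsage;
        // CurrentUsage is what the OS has committed for this process in local
        // (on-card) memory; it is a commitment figure, not proof the GPU is
        // actively touching all of it.
        printf("committed: %llu MB (peak %llu MB)\n",
               (unsigned long long)(info.CurrentUsage / (1024 * 1024)),
               (unsigned long long)(peakBytes / (1024 * 1024)));
        Sleep(1000);
    }
    adapter->Release(); adapter1->Release(); factory->Release();
    return 0;
}
```

Tracking the peak across several complete runs, rather than trusting a single sample, is one way to cope with the run-to-run variance described above.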

Finding >4GB games

When we started this process, I assumed that a number of high-end titles could readily be provoked into using more than 4GB of VRAM. In reality, this proved a tough nut to crack. Plenty of titles top out around 4GB, but most don't exceed it. Given the lack of precision in VRAM testing, we needed games that could unambiguously break the 4GB limit.

We tested Assassin's Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro Last Light (original), Rome: Total War 2, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt. Out of those 15 titles, only four of them could be coaxed into significantly exceeding the 4GB limit: Shadow of Mordor, Assassin's Creed: Unity, Far Cry 4, and Grand Theft Auto V. Even in these games, we had to use extremely high detail settings to ensure that the GPUs would regularly report well over 4GB of RAM in use.

Our testbed for this project was an Intel Core i7-5960X with 16GB of DDR4-2667 running Windows 8.1 with all patches and updates installed. While Windows 10 has only recently launched, we began this project on Windows 8.1 and wanted to finish it there. Transitioning operating systems would have necessitated a complete retest of our titles. We tested AMD's Radeon R9 Fury X, Nvidia's GTX 980 Ti, and the top-end GTX Titan X. With Titan X, we were curious to see if we'd see any benefits to running with a 12GB RAM buffer over and above the 6GB buffer on the GTX 980 Ti.

In every case, the games in question were pushed to maximum detail levels. MSAA was not used, since that incurs its own performance penalty and could warp results, but the highest non-GameWorks settings were used in all standard menus. GameWorks-specific implementations available only on Nvidia hardware were left disabled to create a level testing field. The one exception to this was Grand Theft Auto V, where we used Nvidia's PCSS shadows for its cards, and AMD's preferred CHS shadows for the Fury X.

GameWorks, performance, and 4GB VRAM

There's one common factor that ties three of our four >4GB titles together — GameWorks. Three of the four games that we've tested (Far Cry 4, Assassin's Creed Unity, and Grand Theft Auto V) are Nvidia GameWorks titles, meaning they make use of Nvidia-provided libraries to provide key DirectX 11 functions like ambient occlusion, soft shadows, and tessellation. AMD cannot optimize for these games to the same degree, and AMD GPUs tend to perform significantly worse in GameWorks titles than in other games. We've discussed GameWorks, its implications, and its impact on various titles multiple times in the past few years.

One thing I want to stress is that while we'll be looking at performance data in this article, its primary purpose isn't to compare how the Fury X stacks up, performance-wise, against Nvidia's highest-end GPUs. Such comparisons are inevitable, to some extent, but this isn't a standard review. We've created specialized test cases designed to test a theory, and used settings that significantly depart from what we consider playable or appropriate for 4K testing. As such, the 4K performance results in this story should not be treated as typical results for Nvidia or AMD. The goal of these tests is to create a worst-case scenario for a GPU with 4GB of VRAM and see what happens as a result.

We covered Shadow of Mordor in our initial Fury X coverage, so this article will concern itself with the three new games we tested. Let's kick things off with Far Cry 4.
