How much memory do you really need? With the introduction of NVIDIA's GTX 900 series graphics cards, we've seen a leap in two things: performance and, especially on the mobile side, memory size. Simply put, the higher your resolution, the more memory you're going to need on top of high raw processing power. But how much?
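To make the resolution-to-memory relationship concrete, here is a back-of-the-envelope sketch. It only counts the raw color buffer (4 bytes per pixel, an assumption); in real games, textures, depth buffers, and other render targets dominate VRAM use. The point is the pixel count: 4K has four times as many pixels as 1080P, so everything scaled per-pixel grows with it.

```python
# Back-of-the-envelope framebuffer math: assumes a 32-bit (4-byte) color
# value per pixel and ignores textures, depth buffers, and driver
# overhead, which dominate real-world VRAM use.

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Raw size of a single color buffer, in megabytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

full_hd = framebuffer_mb(1920, 1080)  # 1080P
uhd_4k = framebuffer_mb(3840, 2160)   # 4K

print(f"1080P buffer: {full_hd:.1f} MB")         # ~7.9 MB
print(f"4K buffer:    {uhd_4k:.1f} MB")          # ~31.6 MB
print(f"Pixel ratio:  {uhd_4k / full_hd:.0f}x")  # 4x the pixels
```

The buffers themselves look small, but every per-pixel resource a game allocates scales the same way, which is why stepping up to 4K multiplies memory pressure so quickly.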
For example, Futuremark recently updated its latest 3DMark benchmark to support 4K testing. Try running it on a graphics card with what is now the de facto standard amount of graphics memory, 2GB, and you're in for a rough ride: FPS figures generally fall below 2 (yes, two) FPS in Game Test 2. Even 3GB is barely enough to get by.
This is with 4K (3840×2160) of course, and just one benchmark, but it gives you an indication of where things are heading. That’s why we’ve chosen to equip our GTX 970M and 980M graphics cards with double the amount you’d expect – future proofing, and for high-res external monitors.
A real-world example was written up by the Swedish site NordicHardware on Nov 7th. While testing Call of Duty: Advanced Warfare, they discovered that the game, even at a perfectly normal 1080P (1920×1080) resolution, would use all the graphics memory at its disposal: all 6GB of the NVIDIA GeForce Titan Black.
Other games are heading in the same direction. Shadow of Mordor likewise demands a whole lot of VRAM, breaking the 4GB barrier at 1080P (tests once more courtesy of NordicHardware):
Even with regular games, 2GB of VRAM is barely enough right now, and it simply won't be enough in a future that is closer than you might think. The high-resolution revolution is closing in fast. Do you have enough graphics memory?