Taking on the Dark Lord, Mobile Style

There have been a few recent product launches, with more to come in the near future, from AMD, Intel, and NVIDIA. On the CPU side we have Intel’s Ivy Bridge and AMD’s Trinity, both arguably more important for laptop users than for desktops—and in the case of Trinity, it’s currently laptops only! Both products tout improved performance relative to the last-generation Sandy Bridge and Llano offerings, and in our testing both appear to deliver. Besides the CPU/APU updates, NVIDIA has also launched their Kepler GK107 for laptops, and we’re starting to see hardware in house; AMD likewise has Southern Islands available, but we haven’t had a chance to test any of those parts on laptops just yet. With all this new hardware available, there’s also new software going around; one of the latest time sinks is Blizzard’s Diablo III, and that raises a question in the minds of many laptop owners: is my laptop sufficient to repel the forces of Hell yet again? That’s what we’re here to investigate.

Before we get to the benchmarks, let’s get a few things out of the way. First, Diablo III, for all its newness, is not a particularly demanding game when it comes to graphics. Coming from the same company as World of Warcraft and StarCraft II, that shouldn’t be too surprising: Blizzard has generally done a good job of ensuring their games will run on the widest array of hardware possible. What that means is that cutting-edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of Warcraft (note: I'm not counting the DX11 update that came out with Cataclysm), DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings.

The second thing to bring up is the design of the game itself. In a first-person shooter, your input is generally linked to the frame rate of the game. If the frame rate drops below 30 FPS, things can get choppy, and many even consider 60 FPS to be the minimum desired frame rate. Other types of games may not be so demanding—strategy games like Civilization V and the Total War series, for instance, can be played even with frame rates in the teens. One of the reasons is that in those two titles, mouse updates happen at the screen refresh rate (typically 60 Hz), so you don’t feel like the mouse cursor is constantly lagging behind your input. We wouldn’t necessarily recommend <20 FPS as enjoyable for such games, but it can be tolerable. Diablo III takes a similar approach, and as a game played from a top-down isometric viewpoint, 30 FPS certainly isn’t required; I have personally played through entire sections at frame rates in the low to mid teens (in the course of testing for this article), so it can be done. Is it enjoyable, though? That’s a different matter; I’d say 30 FPS is still the desirable minimum, and 20 FPS is the bare minimum you need in order to not feel like the game is laggy. Certain parts of the game (e.g. interacting with your inventory) also feel substantially worse at lower frame rates.
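
To put those thresholds in perspective, a frame rate target is really a per-frame time budget: frame time in milliseconds is simply 1000 divided by FPS. A quick Python sketch of that arithmetic (just our illustration, nothing from the game itself):

    # Convert frame rate targets into per-frame time budgets (ms).
    for fps in (60, 30, 20, 15):
        print(f"{fps:>2} FPS -> {1000 / fps:.1f} ms per frame")

That works out to 16.7 ms per frame at 60 FPS, 33.3 ms at 30 FPS, 50.0 ms at 20 FPS, and 66.7 ms at 15 FPS; by the bottom of that range, each frame lingers on screen long enough that the game starts to feel laggy.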

Finally, there’s the problem of repeatability in our benchmarks. Like its predecessors, Diablo III randomizes most levels and areas, so finding a section of the game you can benchmark and compare results between systems and test runs is going to be a bit difficult. You could use a portion of the game that’s not randomized (e.g. a town) to get around this issue, but then the frame rates may be higher than what you’d experience in the wilderness slaying beasties. What’s more, all games are hosted on Blizzard’s Battle.net servers, which means even when you’re the only player in a game, lag is still a potential issue. We had problems crop up a few times during testing where lag appeared to be compromising gameplay, and in such cases we retested until we felt the results were representative of the hardware, but there’s still plenty of potential for variance. Ultimately, we settled on testing an early section of the game in New Tristram and in the Old Ruins; the former gives us a 100% repeatable sequence but with no combat or monsters (and Internet lag is still a potential concern), while the latter gives us an area that is largely the same each time with some combat. We’ll be reporting average frame rates as well as providing some FRAPS run charts to give an overall indication of the gaming experience.
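
For those who want to run similar tests at home, FRAPS' benchmark mode can write a "frametimes" CSV, assuming the standard two-column log format: frame number plus a cumulative timestamp in milliseconds. Here's a minimal Python sketch of how such a log can be reduced to an average frame rate and a worst-case frame time (the filename is just a placeholder):

    import csv

    def load_frame_durations(path):
        """Return per-frame durations (ms) from a FRAPS frametimes CSV."""
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the "Frame, Time (ms)" header row
            stamps = [float(row[1]) for row in reader]
        # Consecutive cumulative timestamps -> individual frame times
        return [b - a for a, b in zip(stamps, stamps[1:])]

    frame_ms = load_frame_durations("old_ruins_frametimes.csv")  # placeholder name
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    print(f"Average: {avg_fps:.1f} FPS, worst frame: {max(frame_ms):.1f} ms")

Average FPS hides brief hitches, which is exactly why we also provide the FRAPS run charts; the worst-case frame time above is a crude stand-in for that.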

And one last disclaimer: I haven’t actually played through most of Diablo III. Given what I’ve seen so far, it would appear that most areas will not be significantly more taxing later in the game than they are early in the game, but that may be incorrect. If we find that later areas (and combat sequences) are substantially more demanding, we’ll revisit this subject—or if you’ve done some informal testing (e.g. using FRAPS or some other frame rate utility while playing) and you know of an area that is more stressful on hardware, let us know. And with that out of the way, let’s move on to our graphics settings and some image quality comparisons.

Update: Quite a few people have pointed out that later levels (e.g. Act IV) and, even more so, higher difficulty levels (Hell) are significantly more demanding than the early going. That's not too surprising, but unfortunately I don't have a way of testing later areas in the game other than playing through to that point. If performance scales equally across all GPUs, it sounds like you can expect Act IV on Hell to run at half the performance of what I've shown in the charts. Give me a few weeks and I'll see if I can get to that point in the game and provide some additional results from the later stages.

Comments

  • DanNeely - Monday, May 28, 2012

    All joking about account sharing notwithstanding, would AT buying a new D3 account for testing and letting a volunteer (not me) level it up for late-game/Hell testing be a viable option?
  • JarredWalton - Monday, May 28, 2012

    We do have a couple people playing the game, so at some point we'll be able to test later levels. Give me a chance to: A) have a holiday (today), B) write a few other articles, C) enjoy the game without more benchmarking of it (which totally kills the fun!). Probably in a week or so I can come back with results from Act II or later.

    Unless you can talk Anand into your idea? ;-)
  • DanNeely - Tuesday, May 29, 2012

    My diplomacy skills are of the Europe 1914 level; the odds of my being able to sweet-talk someone I don't know well into anything are slim to none.

    Better results in a week or so isn't that bad a delay. I'm just mildly frustrated since I've had a few people ask what sort of hardware they need to play the game, and it seems that all the numbers I can find are from very early in the game and thus not representative of what's really needed.
  • damianrobertjones - Saturday, May 26, 2012

    I'm hoping that Windows 8 Metro games bring a stable platform, and for once I'm glad that we'll at least have HD 4000 as a base platform.
  • dagamer34 - Saturday, May 26, 2012

    I'm wondering how the HD 4000 compares to the GPUs in ARM SoCs, as that will be the actual low mark if it's slower.
  • tipoo - Saturday, May 26, 2012

    I'm curious about that as well; SoC GPUs like the SGX 543MP4 are getting pretty complex, and Intel themselves used to use integrated PowerVR GPUs in their chipsets.
  • tipoo - Saturday, May 26, 2012

    The GMA 3600 is based on the PowerVR SGX545.
  • JarredWalton - Saturday, May 26, 2012

    And the Windows drivers for it are crap right now. I'm just saying....
  • Penti - Saturday, May 26, 2012

    Actually, Metro/WinRT won't be used much for gaming. If you want a restricted environment, there's already XNA. Porting games to the WinRT framework (or Windows Runtime, as they call it) is too difficult, with too little incentive or anything to gain. Game developers will never target Metro/WinRT if they don't have to, and on x86 machines they don't: the desktop is still there, and you can still build for Windows 7, where most users are. Not much will happen here until the next-gen consoles either. Plus Macs have gotten a whole lot better in that department, and plenty of game engines are available now. Taking those kinds of engines and porting them to C++/WinRT isn't something undertaken lightly; it probably won't be possible without a rewrite, which defeats the purpose. The performance wouldn't be good, and the sandbox is probably too restrictive. In practice it's a more restrictive environment than the sandboxed mobile OSes; several mobile OSes run Firefox, for example, and WinRT never will. WinRT won't even run IE.
  • oopyseohs - Saturday, May 26, 2012

    Did I miss the part where you talk about using an external monitor, or how else were you able to run all of these GPUs at all three resolutions? I'm not saying the data isn't important, as it could be relevant to different notebooks that use the same or similar hardware, just with higher-res screens.

    Also, I've played this game on an old desktop with a GTX 285 @ 1080p and everything turned up. While that is fairly smooth and playable, I still get quite a few moments of "stuttering" in Hell difficulty. I also play on basically the same Acer notebook with the GT 540M, and even at the lowest possible graphics settings and resolution in Normal mode, it's hard for me to characterize that performance as anything other than horrible in comparison to the desktop.
