Taking on the Dark Lord, Mobile Style

There have been a few recent product launches, with more to come in the near future, from AMD, Intel, and NVIDIA. On the CPU side we have Intel's Ivy Bridge and AMD's Trinity, both arguably more important for laptop users than for desktop users (and in the case of Trinity, it's currently laptops only!). Both tout improved performance relative to the last-generation Sandy Bridge and Llano offerings, and in our testing both appear to deliver. Besides the CPU/APU updates, NVIDIA has also launched their Kepler GK107 for laptops, and we're starting to see hardware in house; AMD likewise has Southern Islands available, but we haven't had a chance to test any of those parts in laptops just yet. With all this new hardware available, there's also new software going around; one of the latest time sinks is Blizzard's Diablo III, and that raises a question in the minds of many laptop owners: is my laptop sufficient to repel the forces of Hell yet again? That's what we're here to investigate.

Before we get to the benchmarks, let's get a few things out of the way. First, Diablo III, for all its newness, is not a particularly demanding game when it comes to graphics. Given that it comes from the same company as World of WarCraft and StarCraft II, that shouldn't be too surprising: Blizzard has generally done a good job of ensuring their games will run on the widest array of hardware possible. That means cutting-edge technologies like DirectX 11 aren't part of the game plan; in fact, just like StarCraft II and World of WarCraft (note: I'm not counting the DX11 update that came out with Cataclysm), DirectX 10 isn't in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings.

The second thing to bring up is the design of the game itself. In a first-person shooter, your input is generally linked to the frame rate of the game. If the frame rate drops below 30 FPS, things can get choppy, and many even consider 60 FPS to be the minimum desired frame rate. Other types of games aren't so demanding; strategy games like Civilization V and the Total War series, for instance, can be played even with frame rates in the teens. One of the reasons is that in those two titles, mouse updates happen at the screen refresh rate (typically 60Hz) rather than at the rendering frame rate, so you don't feel like the mouse cursor is constantly lagging behind your input. We wouldn't necessarily recommend <20 FPS as enjoyable for such games, but it can be tolerable. Diablo III takes a similar approach, and as a game played from a top-down isometric viewpoint, 30 FPS certainly isn't required; I have personally played through entire sections at frame rates in the low to mid teens (in the course of testing for this article), so it can be done. Is it enjoyable, though? That's a different matter; I'd say 30 FPS is still the desirable minimum, and 20 FPS is the bare minimum you need in order to not feel like the game is laggy. Certain parts of the game (e.g. interacting with your inventory) also feel substantially worse at lower frame rates.
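
To illustrate the idea, here's a minimal sketch in Python of a loop that decouples cursor updates from world rendering. This is not Blizzard's implementation; in practice games usually lean on the OS/hardware cursor rather than a thread, and the callback names (poll_input, update_cursor, render_world) are hypothetical. The pattern of redrawing the cursor at the display refresh rate while the expensive world render runs as slowly as it must is what keeps the game feeling responsive at low frame rates.

    import threading
    import time

    REFRESH_HZ = 60  # assumed display refresh rate

    def cursor_loop(poll_input, update_cursor, stop):
        # Runs on its own thread at the display refresh rate, so the
        # cursor keeps tracking the mouse even while a world frame is
        # still being rendered.
        while not stop.is_set():
            update_cursor(poll_input())
            time.sleep(1.0 / REFRESH_HZ)

    def run_game(poll_input, update_cursor, render_world):
        stop = threading.Event()
        t = threading.Thread(target=cursor_loop,
                             args=(poll_input, update_cursor, stop),
                             daemon=True)
        t.start()
        try:
            while True:
                # The expensive part: on a slow GPU a frame may take
                # 60-100ms (10-15 FPS), but the cursor thread above
                # keeps redrawing the cursor every ~16ms regardless.
                render_world()
        finally:
            stop.set()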

Finally, there’s the problem of repeatability in our benchmarks. Like its predecessors, Diablo III randomizes most levels and areas, so finding a section of the game you can benchmark and compare results between systems and test runs is going to be a bit difficult. You could use a portion of the game that’s not randomized (e.g. a town) to get around this issue, but then the frame rates may be higher than what you’d experience in the wilderness slaying beasties. What’s more, all games are hosted on Blizzard’s Battle.net servers, which means even when you’re the only player in a game, lag is still a potential issue. We had problems crop up a few times during testing where lag appeared to be compromising gameplay, and in such cases we retested until we felt the results were representative of the hardware, but there’s still plenty of potential for variance. Ultimately, we settled on testing an early section of the game in New Tristram and in the Old Ruins; the former gives us a 100% repeatable sequence but with no combat or monsters (and Internet lag is still a potential concern), while the latter gives us an area that is largely the same each time with some combat. We’ll be reporting average frame rates as well as providing some FRAPS run charts to give an overall indication of the gaming experience.
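
For readers who want to run their own informal tests, here's a rough sketch of how average FPS and frame-to-frame spikes can be pulled out of a FRAPS-style frametimes log. The file name and column layout are assumptions, so check the CSV your FRAPS version actually writes; the point is simply that an average hides lag spikes, which is why we also show run charts.

    import csv

    def summarize_frametimes(path):
        # Assumed format: a header row, then one row per frame with a
        # cumulative timestamp in milliseconds in the second column.
        # Verify against your own FRAPS output before trusting this.
        with open(path, newline="") as f:
            rows = csv.reader(f)
            next(rows)  # skip header
            stamps_ms = [float(row[1]) for row in rows]
        duration_s = (stamps_ms[-1] - stamps_ms[0]) / 1000.0
        avg_fps = (len(stamps_ms) - 1) / duration_s
        # Per-frame deltas are what a run chart plots; spikes here are
        # the stutters (or network lag) that an average frame rate hides.
        deltas_ms = [b - a for a, b in zip(stamps_ms, stamps_ms[1:])]
        return avg_fps, max(deltas_ms)

    # Hypothetical usage:
    # avg, worst = summarize_frametimes("diablo3 frametimes.csv")
    # print("%.1f FPS average, worst frame %.0f ms" % (avg, worst))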

And one last disclaimer: I haven’t actually played through most of Diablo III. Given what I’ve seen so far, it would appear that most areas will not be significantly more taxing later in the game than they are early in the game, but that may be incorrect. If we find that later areas (and combat sequences) are substantially more demanding, we’ll revisit this subject—or if you’ve done some informal testing (e.g. using FRAPS or some other frame rate utility while playing) and you know of an area that is more stressful on hardware, let us know. And with that out of the way, let’s move on to our graphics settings and some image quality comparisons.

Update: Quite a few people have pointed out that later levels (e.g. Act IV), and even more so the higher difficulty levels (Hell), are significantly more demanding than the early going. That's not too surprising, but unfortunately I don't have a way of testing later areas in the game other than playing through to that point. If performance scales equally across all GPUs, you can expect Act IV on Hell to run at roughly half the performance shown in the charts; a laptop averaging 40 FPS in our Act I tests, for example, would drop to around 20 FPS. Give me a few weeks and I'll see if I can get to that point in the game and provide some additional results from the later stages.

Diablo III Graphics Settings and Image Quality

Comments

  • JarredWalton - Sunday, May 27, 2012 - link

    Yes, all of the higher-than-1366x768 results were done on an external LCD where required (which was the case for Llano, Trinity, VAIO C, TimelineX, and Vostro; the other laptops had 1080p displays, except for the quad-core SNB, which has a 1600x900 LCD, so I didn't run the 1080p tests).
  • PolarisOrbit - Saturday, May 26, 2012 - link

    Good review for what it is, but I think it could have been a little more complete with some additional information:

    1) Use Act 3 Bastion's Keep for the "intensive" case instead of Act 1 Old Town. I think this would be more representative of the game's peak demand. (Probably just a run-through of the signal fires quest, since it's easy to get to.)

    2) Include a brief section on how much of an impact additional players have on the game. I find it can actually be quite significant. This doesn't have to be a full-depth review, just a quick look.

    Overall, I'm using an A8-3500M + 6750M CrossFire (overclocked to 2.2GHz) @1366x768, and my framerates during battles (i.e. when it counts) average about 1/2 to 1/3 of what the reviewer posts, because the game gets much more intensive than Act 1, and having a party also slows it down significantly compared to solo.

    Just some ideas to expand the review if you want =)
  • drkrieger - Saturday, May 26, 2012 - link

    Hey folks, I've got an older Asus G71Gx which has an NVIDIA GTX 260M; I can play it on medium/low at about 40 fps @ 1920x1200.

    Hope this gives some idea of how older mobile graphics stack up.
  • waldojim42 - Saturday, May 26, 2012 - link

    I have been testing this out on my W520 to see what I can do to play Diablo III and maintain decent battery life.

    For what it's worth, turning off shadows and playing @ 1366x768 on the HD 3000 results in roughly 28 fps, more than enough to play the game through the first difficulty anyhow. I have been using this for some time now with 4 players in game. When running @ 1080p, it dips down into the low 20s and occasionally becomes a problem in Act 3, so I wouldn't suggest it.

    The point, though, is that anyone who has a notebook with SB and no video card CAN still play this game, even if it isn't ideal.
  • Zoolookuk - Saturday, May 26, 2012 - link

    Given this is a cross-platform game, it would have been interesting to provide Mac results with similar hardware. I play using a GT 330M and a dual-core i7, and it runs pretty well. I'd like to see how it stacks up against the latest AMD chips and HD 3000 on a Mac.
  • egtx - Saturday, May 26, 2012 - link

    Yes I am interested in Mac results as well.
  • ananduser - Saturday, May 26, 2012 - link

    Provided the testing is done on a dual-booting Apple machine, D3 under Windows will always run better.
  • JarredWalton - Saturday, May 26, 2012 - link

    Anecdotally, Brian and Anand have both commented that Diablo 3 on a MacBook Pro under OS X runs like crap. I'm not sure if they're running on the latest-generation MBP13 or something else, though, so that's about all I can pass along.
  • ananduser - Sunday, May 27, 2012 - link

    Was there any doubt? OS X is severely lacking in graphics driver support. Apple never gave a rat's rear about this crucial aspect of gaming support. They are always late with drivers and with the latest OpenGL spec.
  • Penti - Thursday, May 31, 2012 - link

    The recommendations / minimum requirements on Macs call for discrete graphics with good drivers, though; i.e. no NVIDIA 7300 / 7600 or ATI X1600 / X1900, with the 8600 GT as the starting point. Obviously no integrated Intel graphics is enough there. OpenGL 3.2, or OpenGL 2.1 with extensions, should be fine for game developers, and the drivers handle it; NVIDIA and AMD can put in performance improvements if they get the feedback. They could even launch their own "game edition" card for the Mac Pro with their own drivers outside of Apple's distribution channel; NVIDIA releases drivers on their site from time to time. That said, both the game engine port and the drivers are a bit less optimized than their Windows and Direct3D counterparts. The drivers are quite robust and work well, but they might not be that fast. It's mainly a problem for developers today, though, as most Macs have somewhat decent graphics with maintained drivers and support pretty much all the features you need anyway.

    The OS is very dependent on OpenGL, so the support itself is decent and fairly up to date, even if it is not at OpenGL 4.2/3.3 yet. The latest OpenGL 4.2 isn't supported by much of the hardware Apple uses anyway: the R700, R600, GF 8M, and GF 9M parts (and their desktop versions) don't support more than OpenGL 3.3, which is itself a backport of as much of the 4.x feature set as possible, so 3.2 is a decent level there. Apple always supports the whole API in the software renderer too, so they have no need to chase the latest features, though vendors can use any extensions they wish to add those features, and all the supported GPUs support the API as well. For comparison, Intel's drivers on Windows don't support OpenGL 3.2 yet, let alone 4.1/4.2. Overall it's much better driver support than, say, Intel graphics on Linux, and in some regards even better than on Windows.
