Designing Our Next GPU Test Suite

by Ryan Smith on 3/14/2010 12:00 AM EST

  • Sniffet - Saturday, March 27, 2010 - link

    I don't play many games, but I use the GPU for computing, and it would be very interesting to see a benchmark of the upcoming cards with renderers like Octane, Iray (not released yet) and Arion.

    Octane's unbiased renderer would be a great indication of how the graphics cards perform in a workstation: http://www.refractivesoftware.com

    Also some Arion render times would be nice: http://www.randomcontrol.com

    Since all these renderers are based on CUDA, it leaves AMD out in the cold, and this, I think, is a very important reason to choose your graphics card well. Some benchmarks for Adobe Premiere CS5 (also CUDA-based) would be interesting as well.


  • Werelds - Thursday, March 25, 2010 - link

    I'll just do this in bullet points for clarity.

    - OpenGL games; as said earlier in the comments, there are non-Windows gamers out there, and there are gradually more games supporting OpenGL again.

    - OpenCL/DirectCompute? I know most of it is still in beta, but it's something that will hopefully see more use in games in the near future.

    - Despite being synthetic, Unigine's engine is actually going to be used for a number of games, some of which are already in development.
  • Werelds - Thursday, March 25, 2010 - link

    Forgot to mention another thing.

    Mention which games are based on which engine (and create a shortlist of other games based on it), and make small notes on whether they're optimised for one vendor or feature. Even Crysis, for example, has a couple of CFG settings that are only enabled for NVIDIA by default but should be available for ATI as well (fixing these settings fixes the FPS drop caused by caching that ATI gets at the start - TWIMTBP, eh?).
  • kevith - Wednesday, March 24, 2010 - link

    Is it on purpose that there are these weird commercials, disguised as posts, in the forum?
  • kevith - Wednesday, March 24, 2010 - link

    One thing that seems to be missing from almost every review of graphics cards is the quality of the rendered image. Is that because there really isn't any difference?

    Because as I see it, I would gladly sacrifice a few FPS if the picture quality is better on one card than on another. Of course such a thing is very hard to measure or describe, as it is a matter of personal taste.

    But - given there actually are differences - it would be something NEW in graphics card tests, and not just another truckload of - admittedly necessary - games and synthetics.

    Kent Thomsen, DK,EU.
  • Hadenman - Monday, March 22, 2010 - link

    I would like to see Metro 2033 and Crysis 2. I also like the idea of testing some MMOs at max settings. I have this setup (http://www.overclock.net/system.php?i=50333), which ain't too bad imo, and I play Aion. I can turn everything all the way up except for AA. Once I push AA past 2x, it starts to turn into a slide show.
  • jiulemoigt - Monday, March 22, 2010 - link

    Well, an RTS with lots of particles like DoW II punishes some fairly up-to-date systems, so that is useful for seeing how much the CPU and how much the GPU are being pushed.

    As far as RPGs go, Dragon Age has some real-time cutscenes you might be able to use. With a fast enough connection you could test several machines in an MMO by logging in and walking around the same area, then rotate the cards and repeat the test.

    Last, if you want a synthetic test that has relevance, download UE3, create a sample level and test that; many companies license the Unreal Engine and then add in their own twists, so this gives you an idea of the base point they are starting from.
  • BigMoosey74 - Saturday, March 20, 2010 - link

    I want to make a plug for Battlefield: Bad Company 2. The Frostbite engine is pretty GPU-heavy and should be tapping into DX10.1 and DX11 to help test out the newer cards. Although the BF series has never been too graphically intensive, BC2 seems to be quite the opposite.

    The CPU discussion came up, but I think it is an interesting subject because it does affect how a game runs. Usually, reviews just get around this by overclocking to a point where the CPU isn't a bottleneck.

    For a total 180, have you considered doing a total game benchmark that would tackle both CPU and VGA performance? Because reviews usually use uber top-end systems and OC their CPUs to extreme speeds, game performance often doesn't translate for many people. Having targeted performance numbers for a specific game (even if disabling cores and downclocking CPUs is the means of getting there) is really helpful information. Thanks!
  • DarkUltra - Saturday, March 20, 2010 - link

    Hi!
    I just read that the GF100 / Fermi will have multiple setup / geometry engines. This is a huge step forward. All previous and current 3D cards I know of, except the GTX 460 and 480, have a single setup engine.

    Let me quote the gf100 anandtech preview:
    - To put things in perspective, between NV30 (GeForce FX 5800) and GT200 (GeForce GTX 280), the geometry performance of NVIDIA’s hardware only increases roughly 3x in performance. Meanwhile the shader performance of their cards increased by over 150x. Compared just to GT200, GF100 has 8x the geometry performance of GT200, and NVIDIA tells us this is something they have measured in their labs.
    http://www.anandtech.com/video/showdoc.aspx?i=3721...

    Now the question I wonder about is: can we test whether the older cards and the new GF100 are bottlenecked by the geometry engine? I have never heard a review mention the performance of the setup / geometry engine or whether it can be a bottleneck. To test this you would have to run at a low resolution with a fast CPU and the GPU at different clock rates.
  • Azethoth - Friday, March 19, 2010 - link

    I pretty much only play WoW these days, and with a 30" Apple monitor I am interested in performance at max resolution.

    As it is, this means no max settings even on my Radeon 5970, so it is not a meaningless benchmark. Enabling everything at max brings it to an unplayable crawl.

    PS: There is something evil about the stairs down to Blood Queen Lanathel's room. 60fps -> 17fps or less. (2-3fps on my old 4870 X2 at minimum settings). Each time I look at them I hear "Stare into the abyss..."
  • mindless1 - Thursday, March 18, 2010 - link

    I do not want to see benchmarks that only compare what FPS each GPU gets at the same game settings.

    NO, what is useful to the typical reader or potential buyer is what quality settings, if any, are compromised to attain a playable framerate.

    If it just isn't possible to play at 20 FPS (min) and at least 45 FPS average, I WANT to know that, but I don't care to know that a $200 card is faster than a $130 card at the same settings... I think we are past stating the obvious.

    So I want benchmarks that, if the above is too laborious, set an average framerate threshold of 35 FPS, then note what needs to be disabled to attain that, if it is possible at all.

    If it is not possible, then the price of the card is pretty irrelevant, isn't it? I mean, no matter how cheap it is, if it can't do the job at all there is no value in it for real-world applications.
  • chadkellycolorado1 - Thursday, March 18, 2010 - link

    Don't leave out the growing crowd of non-Microsoft gamers. Consider adding an OpenGL-based game (open source with multi-platform support).
  • T2k - Thursday, March 18, 2010 - link

    It's a pretty well-written benchmark tool, with loads of options and customization (e.g. switching between DX10/10.1/11), capable of stressing any high-end machine.
    It uses GSC Game World's latest X-Ray 1.6 engine.
  • Browser - Wednesday, March 17, 2010 - link

    Engines not games.

    I think it would be a better idea to focus on engines rather than games - id Tech, CryEngine, Unreal, Gamebryo, Essence Engine - trying to get at least one of each game genre (RTS/FPS/etc.) while covering the big boys.

    Also, there are newer versions of some great engines due out in the near-ish future which you might want to consider waiting for, or be ready to update to, specifically id Tech 5/6 and CryEngine 2/3.

    Personally, we don't need a gamut of resolutions tested but a good range focusing on the commonly used ones, even just 3 or 4 resolutions such as 1920x and below. Unrealistic or uncommon resolutions should be left to specific reviews, such as when comparing multi-monitor setups. The following might not be a bad idea: 1920x1080, 1680x1050 and 1280x1024, as they are either common or high end.
  • Browser - Wednesday, March 17, 2010 - link

    (I don't see an edit button)

    Someone mentioned OpenCL physics, which I have to say is a great idea; AnandTech could help publicise an open alternative to NVIDIA's closed, NVIDIA-only PhysX. Since physics/OpenCL is both important to games and affected by the graphics card, I think it's a great idea.

    Something else I've wondered about is a raw benchmark for the GPUs, showing FLOPS etc. It would give a nice idea of the general power the GPUs have.

    These things could give you an edge over the other sites also benchmarking graphics cards.
  • tynopik - Tuesday, March 16, 2010 - link

    Why? For the lulz . . .
  • SofS - Tuesday, March 16, 2010 - link

    Firstly, a Linux overview is long overdue - anything from an overall compatibility or feature list to real CAD and maybe even some Wine-based tests (not to mention VDPAU). The Wine tests could be helpful to the Wine developers, as I am sure yours already are to the NVIDIA and ATI driver developers. In this regard, maybe all open-source developers could profit, from the X server (xorg-edgers) to the interfaces (KDE, GNOME, Xfce, etc.) and even the drivers (nouveau). Not to mention those nostalgic memories of 3ddesktop (nowadays replaced by Compiz).

    Secondly, GPGPU - my most pressing topic of interest right now, as I am researching in this area myself, though you might need to dig around in order to find proper tests. Follow-up comments with suggestions would be appreciated. My personal one would be the Folding@home GPU client, as it is widely used and also runs under Wine. By the way, it would be good to take double-precision floating point (IEEE 754) into account. From NVIDIA, as of now, only the GTX 200-series models offer such support, and it is incomplete; it would be good to know how future ATI models and Fermi compare.

    Thirdly, some CPU overhead tests. An example would be ForceWare running some game at x% CPU while Catalyst runs the same game at y%.

    Finally, testing filters like AA and AF on and off is at least as important as how they scale at all levels (2x, 4x, 8x, 16x, transparency, et al.). However, for those performance levels to make sense, the quality must be taken into account. Instead of treating the numbers as equivalent, use your quality analysis so that filters of nearly equivalent quality are compared for performance.
  • marraco - Tuesday, March 16, 2010 - link

    I want an OpenCL benchmark.

    If it's a physics simulation, even better.
  • PR3ACH3R - Tuesday, March 16, 2010 - link

    Games? Games!?

    How about you test the current ATI DPC situation: the DPC PowerPlay problems and the DPC problems with software-mode playback!

    http://www.avsforum.com/avs-vb/showthread.php?t=12...

    http://www.avsforum.com/avs-vb/showthread.php?t=12...

    How about you test which DXVA cards work with which formats, without BSODing like some current 5xxx-series cards do.

    How about you test the horrible 2D performance with current ATI hardware.

    How about you make real articles like these people:

    http://translate.googleusercontent.com/translate_c...

    http://translate.googleusercontent.com/translate_c...
  • jive - Wednesday, March 17, 2010 - link

    Chill out - the link you posted clearly states the solutions to the problems, and for HTPC use ATI seems to be the least bad choice anyway. No need to bash ATI on this front; the 2D issue has already been raised in previous posts.

    You're preaching to the choir here.
  • CharonPDX - Tuesday, March 16, 2010 - link

    Since Microsoft has abandoned Flight Sim, I'd say move to X-Plane.

    Which can also provide cross-platform testing, since it's available on Windows, Linux, and OS X. (And it has built-in benchmarking.)
  • qnetjoe - Tuesday, March 16, 2010 - link

    Please include some professional applications like Solidworks, Pro/E, and maybe some FEA applications that use GPGPU.
  • Urbanos - Tuesday, March 16, 2010 - link

    If it's possible, incorporate a notebook GPU to compare against desktop variants, i.e. one of the new i5 or i7 platforms with an NVIDIA 9800 GTS or 9600 GT, or whatever their refreshed mobile counterparts are.

    A couple of older titles:
    World of Warcraft - latest to current patches
    Oblivion - latest edition/patches
    Team Fortress 2

    Newer stuff, at typical resolutions, i.e. 1280x1024, 1680x1050 & 1920x1200 or their widescreen alternatives:

    Dawn of War 2 (+ expansion if necessary)
    Mass Effect 2
    Bad Company 2
    BioShock 2
    Supreme Commander 2


    Having GPGPU stuff as a later project, in a separate bench, would be incredibly valuable.
  • darkos - Tuesday, March 16, 2010 - link

    Yes, OK, I know it's 'old', but it's the latest version of this game/simulator. I and many others still use this application, and we would benefit from knowing what to expect with a given set of hardware. Also, since your site would be one of the few to include it, it would increase readership of your reviews.
  • pehrs - Tuesday, March 16, 2010 - link

    Please do a test with some older versions of DX as well as at least one OpenGL-based game. I have been burned badly by drivers and cards that can't run old games (an 8800 GTS can't manage over 5 fps in a game using DX7...).

    I am also very interested in more measurements of the sound levels produced by the cards.
  • fshaharyar - Tuesday, March 16, 2010 - link

    How about trying out the Unigine Heaven benchmark?

    The Unigine engine is comparable to Epic's Unreal and id Software's Doom architectures. Unigine delivers, for example, DirectX 11 and tessellation, one of the API's most important features; with DirectX 11 tessellation, polygons are subdivided. Unigine currently has support for OpenGL 3.2 as well. It supports hardware tessellation, Screen Space Ambient Occlusion, DirectCompute and Shader Model 5.0.

    With tessellation active, objects like walls or bricks are modeled with a multiple of their original number of polygons. This way furrows and bumps become physically "noticeable". The result looks excellent and has an advantage over Parallax Occlusion Mapping (POM) and similar techniques: it is affected by anisotropic filtering and multisample anti-aliasing.

    * To see the real potential of DirectX 11 cards: on a Radeon HD 5870 in DX11 mode, the frames per second drop by 39 percent as soon as tessellation is activated, while a Radeon HD 5770 deals a little better with the additional work and drops only 31 percent.
  • T2k - Thursday, March 18, 2010 - link

    Correct me if I'm wrong, but AFAIK the Unigine engine powers zero games on the market so far, which means that under the "no synthetic benchmarks" policy the Heaven benchmark has no place in the basket.
  • fshaharyar - Saturday, March 20, 2010 - link

    A very good question, T2k, but if you want to see the relevance of Unigine, just take a look at this:

    http://www.techpowerup.com/117934/NVIDIA_Claims_Up...

    This news will definitely rev things up.
  • poohbear - Tuesday, March 16, 2010 - link

    Hey, I NEVER look at any of your synthetic benchmarks; I like real-world results.

    I currently like how you bench a variety of different types of games. E.g. you have a CPU-centric RTS like Dawn of War 2 (which of course is still very much reliant on the GPU, but it DOES see huge gains from different CPUs as well), Crysis as a GPU-centric FPS, and you include some games that are famous for their multi-core & DX10 support (Far Cry 2). With DX11, there are really only two FPSes out: STALKER: Call of Pripyat and AvP. I don't see it necessary to include UE3 games because we all know they run like gravy on all hardware. Just my 2 cents.
  • - Tuesday, March 16, 2010 - link

    I agree that categorizing/rating games by their system stress levels, i.e. 'stressability hogs' or 'elegant resource users', will give game developers as well as potential computer buyers useful information.

    asH
  • NT78stonewobble - Tuesday, March 16, 2010 - link

    Regarding the selection of games: I'd like to see a nice broad spectrum of graphics engines - if I remember correctly, Unreal Engine, Far Cry 2, Crysis and others - at least that way we can ourselves correlate a bit to older games using the same engine.

    Minimum FPS values, if at all possible. Especially to discover whether these are lower on the dual-chip cards and/or SLI/CrossFire setups.

    You can still keep running the synthetic tests. There are those who like them. Just don't focus on them in the conclusion.

    Keep the focus in graphics card reviews on the graphics.

    But maybe you could do a roundup once in a while with different processor / graphics card combinations to discover what processors are needed to drive which graphics cards at which resolutions. Personally I have an E8500 @ 3.8 GHz, 4 GB of RAM and a GTX 260 (216 core), but even though my processor is only an old dual core, I'm still graphics card bottlenecked, since I game at 1920x1200 with as much eye candy on as possible. AFAIK I'd only gain a few extra fps by making a major upgrade to the higher-clocked i7s. It would be nice to get confirmation that I am correct about this.

    Also, an additional article with both your old and new games plus popular ones, and the performance and bug notes for, e.g., the last 3 driver sets from both ATI and NVIDIA. I know this is a big SOB article with lots of work. But as a consumer I am sick and tired of having to switch drivers all the time to get a specific game working and then losing 30 percent performance in other games. An article to discover who has the best drivers here and now would be nice. From a consumer perspective it doesn't really matter that a card is the world's best at game xxxxx if it has a lot of errors in others.

    So basically your graphics card reviews are good (they need those minimum fps numbers), but additional articles to help buyers get the best performance for their setup (processor vs. graphics bottleneck) and the best service/ease of use (state of drivers) would be really, really appreciated.
  • XiZeL - Tuesday, March 16, 2010 - link

    As many replies state, I would also like to see Battlefield: Bad Company 2 included; it's one of the best-selling games and actually a very demanding game for PCs, unlike COD:MW2.
  • XiZeL - Tuesday, March 16, 2010 - link

    Sorry for the double post, but it would also be good if you showed how each game scales in CrossFire and SLI.
  • jive - Tuesday, March 16, 2010 - link

    Whatever games you end up choosing, please note the engine the game is based on. This gives a rough estimate of how other titles based on the same engine will fare on a similar HW platform.

    You've probably done this before, but don't test two games based on the same engine unless for some strange reason they scale differently. If that is the case, an in-depth analysis of why would be interesting.
  • Drazick - Tuesday, March 16, 2010 - link

    Please include some Direct Compute / Open CL tests.

    Thanks.
  • BoFox - Tuesday, March 16, 2010 - link

    Hey, I think that the games tested are great. The more games, the better! (As long as they are popular games, not crappy games, heh!)

    Anyway, I think it would be very helpful to include minimum frame rates like Xbitlabs does. Even more useful would be time-based graphs of frame rates like those shown at HardOCP. Some games that average 30 fps on a certain card would be pretty consistent, staying within 2-3 fps of the average, while on another card (of a different make, be it NV or ATI) they would dip to half the average or worse. Would it dip just once throughout a couple of minutes of benching, or would it dip several times?

    About the Steam poll that shows most of the user base still gaming at 1280x1024 or lower - I do not think we should be alarmed at the statistics. Those who play at, say, 1024x768 are most likely using a rig that is around 4-5 years old on average, or older. They are most likely the ones who only play games and do not give the slightest amount of feces about computer hardware or benchmarks. They do not even know what display settings like anisotropic filtering or anti-aliasing mean. I have a few friends and a couple of brothers who play computer games but do not even care about turning up the resolution. They are most likely never going to bother reading AnandTech articles.

    Those who bother reading the articles are definitely going to be interested in 1680x1050 as the minimum resolution, trust me.

    I guess that's all I have to say here. Just make sure to keep a couple of DX9 games around... there will continue to be hugely popular DX9 games until the Xbox 360 and PS3 are finally replaced by the next-gen consoles (since most PC games are ported from the consoles).

  • fepple - Tuesday, March 16, 2010 - link

    Yeah, no point looking at the steam polls when they can check their web stats to see what people who actually visit the site use :)
    Though having said that I mainly read the site at work hehe
  • phuzi0n - Tuesday, March 16, 2010 - link

    There is one yet to be released game (~2 months until release) that you absolutely must include: STARCRAFT 2

    If you don't include SC2 then you will lose merit with a HUGE portion of gamers. You did good by finally including World of Warcraft in the tests, albeit too late to matter, but for SC2 you need to include it from the moment it is released. There will be many people looking to upgrade their cards to handle SC2 so make sure that you have the data for them!
  • glockjs - Tuesday, March 16, 2010 - link

    I really don't care what you use, tbh - it just has to be somewhat current. The only thing I would say NOT to use is any console port or any game made by Blizzard/Activision.

    BTW, MMORPGs are a bad idea to use... they're made to run on crappy hardware so they can capture a wider audience.
  • HillBeast - Monday, March 15, 2010 - link

    You should probably try to have as few NVIDIA "The Way It's Meant To Be Played" and ATI-oriented games as possible, if any. I know this will be hard, but there are a lot of games out there that favour specific brands, and it makes things kind of unfair. Also, seeing as you will be comparing NVIDIA and ATI, show results with and without PhysX enabled, and when games start using DirectCompute, show the same thing, with and without.
  • mmatis - Monday, March 15, 2010 - link

    Pong:
    http://www.play.vg/games/130-Flash%20Pong.html
    -or-
    http://www.falstad.com/pong/
    ???
    }:-]
  • Froboz - Monday, March 15, 2010 - link

    I find that I often need to adjust the settings so that the game plays well during intense scenes where the system is taxed. Measuring average FPS doesn't give the best indication of what resolutions and quality settings can actually be used for regular gameplay.

    It would be nice to use some of the more intense scenes in games and to report the minimum frame rate or 5th percentile of frame rates.
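
    A minimal sketch of how such numbers could be computed, assuming a plain log of per-frame render times in milliseconds (the input file name here is hypothetical):

        # Average, minimum and 5th-percentile frame rates from a per-frame time log.
        # Assumes "frametimes.csv" holds one frame time in milliseconds per line (hypothetical input).
        import numpy as np

        frame_times_ms = np.loadtxt("frametimes.csv")    # per-frame render times
        fps_per_frame = 1000.0 / frame_times_ms          # instantaneous FPS of each frame

        avg_fps = 1000.0 * frame_times_ms.size / frame_times_ms.sum()
        print("average FPS:        %.1f" % avg_fps)
        print("minimum FPS:        %.1f" % fps_per_frame.min())
        print("5th percentile FPS: %.1f" % np.percentile(fps_per_frame, 5))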
  • Rogerdodge1 - Monday, March 15, 2010 - link

    I've seen a few good suggestions but had some of my own...
    Most people agree EE CPUs are no good for most of us. I think you need 3 tiers for your reviews.

    Sub-$100 cards get sub-$100 CPUs, maybe an Athlon X2 at 2.7-3.2 GHz. Cards from $100-$250 get either a Phenom II 945 or maybe a C2Q 8400. Cards over $250 get something like an i7 870.

    Resolutions scale with the cards as well.
    Sub-$100 cards are more likely to be used with lower resolutions like 1280x1024, 1440x900 and maybe 1680x1050.
    $100-$250 cards will probably have at least a 1680x1050 screen with them and possibly 1920x1080 or 1200.
    Cards over $250 will usually have 1920x1080 or higher, so use that, 1920x1200 and 2560x1600, or go with dual or triple 1080p displays instead of 2560x1600.
    When doing CrossFire or SLI you should definitely use 2560x1600 or dual/triple 1080p monitor setups (maybe even dual/triple 2560x1600 if you have the money to piss away, or can get them as freebies :P ).

    As for the games?
    There are several game engines out there, and I think you should try to find the two most demanding games (graphically, of course) for each of the top 5 modern engines. If performance is very close between the top two, then only use one in your benchmarking suite, but if performance is significantly different, use both. Also include one title each from the five most popular older engines. That gives you a max of 15 games, and more likely only 10. Also, if you include a chart showing which other games use the same engines as the ones you tested, then people can get some idea of where those would fall performance-wise without you having to test 30 or 40 games every time you review a card. If a certain game is particularly popular but uses a less widely used engine, some consideration should be given to including it (maybe it could take the place of a popular but not graphically intense engine like Source?), but we all understand that you can only do so much in the time you have.

    Many people seem to want GPGPU testing on these cards, and I agree that having that information would be nice, but there really aren't many programs that use the GPU and are hardware agnostic, so... if you can find some useful ones that are hardware agnostic (OpenCL, DirectCompute, whatever), great, let us know what they are. If not, oh well.
    As far as 2D performance goes, I don't believe that we need it for every card you test. Maybe one mid-range card from each series?
    Thank you for all your hard work keeping us up to date on the things that matter in the world of computers, and I look forward to seeing what you guys decide to go with.
  • Tech on Napkins - Monday, March 15, 2010 - link

    1. Use a poll to find the median platform being purchased by your readers. If no one has a Core i7-980, using it as your video card test station doesn't give the buyer a true sense of what to expect. If you find a game IS CPU-limited, mention it, which brings me to point...

    2. Have Battlefield: Bad Company 2 as a benchmark. I see it as an example of things to come, where physics takes a much stronger role in the experience. It pins my overclocked E8500 near 100% on both cores. While generally smooth, the occasional chaotic scene can get a bit choppy, which brings me to point...

    3. Minimum FPS is key. It was my CPU that was most likely the bottleneck in those scenes above, but I've seen video cards that benchmarked just fine yet had severe performance issues when you looked at minimum FPS. Is it running out of RAM? Out of RAM bandwidth? A CPU bottleneck?
  • Wixman666 - Monday, March 15, 2010 - link

    A lot of the people here that read your articles (including myself) are system builders.

    Run benchmarks on a couple of mid-range CPUs, preferably overclocked - such as an AMD 550/555 BE OC'd to 3.6/3.8 and an i5 750 OC'd to 3.6. Something easily attainable by anyone using a stock cooler with minimal effort.

    Software wise, I'd like to see a World of Warcraft benchmark. Why? Because 12 million people play, and I sell more computers to WoW players than any other single segment.

    I know someone is going to say WoW will run on a toothpick and a rubber band, but I'd still like to see it included. 12 million people is a big reader base.

    Not just a running-around-in-the-middle-of-nowhere benchmark; I'm talking a 25-man ICC raid measurement.
  • MrSpadge - Monday, March 15, 2010 - link

    I'd like to see the GPU power consumption measured directly as XBit-Labs does it.

    It might also be interesting to check the power consumption under full load at auto fan speed (stock) and at maximum fan speed (for 24/7 GP-GPU crunchers). For GF100 rumors say that half of its power consumption is due to leakage, which increases at higher temperatures. For GT200 I've already seen load differences of ~20W just due to fan speed.
  • Ananke - Monday, March 15, 2010 - link

    Hello,

    I played Crysis when it came out. Since then I have had a GTX 260 896MB, HD 4850 1GB, HD 4570 1GB, GTX 285 OC, and currently an HD 5850. Crysis is a beautiful game, and even if it sounds retarded, for my own needs I always play Crysis first with my new card - just to have the perception and be able to compare with the previous card I owned.
    I play at 1920*1280 (because that's the monitor's native resolution) with all settings on highest. This test has worked for me and allowed me to see a trend.

    My idea is simple - use a graphically sophisticated game at several resolutions, 720p and above, with average eye candy first and maximum settings second. Get the popular display resolutions from the Steam usage statistics if you like. Use a game that heavily taxes the GPU, maybe a DirectX 11 one (if available). The point is not to give me a benchmark in different classes of games, but ONE or a few games on many cards - so I have consistency and the ability to compare.

    IMPORTANT - please add some GPGPU applications. Do the cards have any computational value besides gaming? If, for example, two cards have similar performance but one handily accelerates a bunch of useful applications, that is a value for some users. Bench that, and mention which software can use the GPU to its advantage, so users comparing prices later can draw their own purchase decisions.

    I personally like that I can transcode a 4.37 GB H.264 file to PSP or phone video with the HD 5850 in 17 minutes! This has value for me over the GTX 285. I also find value in a card running at 42 C at load versus another working at 80 C at load and burning up all my PC internals.

  • Ananke - Monday, March 15, 2010 - link

    my bad, the third card was HD4870, not 4570
  • isaac12345 - Monday, March 15, 2010 - link

    It would be good to have different configurations at different price points. Not everyone has an extremely high-end system, and it would be good to get an idea of how the GPUs fare with different components at different price points. You could have 3 or 4 configs: low-range, mid-range, high-end and a no-bottleneck config.
  • Ardemus - Monday, March 15, 2010 - link

    My main request isn't a particular game. I want some particular stats.

    1) Obtain a moving average of FPS/Second
    2) Find %time over 10, 20, 30, and 60 FPS
    3) Find average and standard deviation of the duration of sequential time spent under each rate.

    Data for one game would look something like this, and would make a nice pair of graphs in aggregate.

    GameX            >10fps    >20fps    >30fps    >60fps
    % time over      94%       82%       75%       12%
    +                3s        11s       2m        14m
    Ave span under   0.1s      12s       6m        8m
    -                0s        11s       5m        7m
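
    A minimal sketch of how stats 2) and 3) could be computed, assuming a plain list of per-frame render times in seconds (the sample input below is made up):

        import statistics

        def threshold_stats(frame_times, thresholds=(10, 20, 30, 60)):
            # Percentage of time spent at or above each FPS threshold, plus the average
            # and standard deviation of the contiguous spans spent below it.
            total_time = sum(frame_times)
            results = {}
            for fps in thresholds:
                limit = 1.0 / fps                       # frame time matching the FPS threshold
                time_over = sum(t for t in frame_times if t <= limit)
                spans, current = [], 0.0                # contiguous stretches below the threshold
                for t in frame_times:
                    if t > limit:
                        current += t
                    elif current:
                        spans.append(current)
                        current = 0.0
                if current:
                    spans.append(current)
                results[fps] = {
                    "pct_time_over": 100.0 * time_over / total_time,
                    "avg_span_under": statistics.mean(spans) if spans else 0.0,
                    "stdev_span_under": statistics.stdev(spans) if len(spans) > 1 else 0.0,
                }
            return results

        # Made-up run: ten seconds at 60 fps, a two-second dip to 12 fps, ten more seconds at 60 fps.
        sample = [1 / 60.0] * 600 + [1 / 12.0] * 24 + [1 / 60.0] * 600
        for fps, stats in threshold_stats(sample).items():
            print(fps, stats)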

  • Hardin - Monday, March 15, 2010 - link

    I'd like to see a lot of modern DX11 games, like Call of Pripyat, Bad Company 2, Metro 2033, Aliens vs Predator and BattleForge. I know most of those are first-person shooters, but they are usually the most graphically intensive.
  • Hardin - Monday, March 15, 2010 - link

    Oops, I forgot to mention DiRT 2, sorry folks.
  • doittoit - Monday, March 15, 2010 - link

    I'd like to see video encoding benchmarks via CUDA/Stream, ideally baselined against a particular CPU. Also, you could use some portion of SETI@home as well (they have CUDA support).
  • bnajbert - Monday, March 15, 2010 - link

    Stop testing with ridiculous processors that no one can afford. I understand that you don't want the CPU to become a bottleneck, but honestly, I don't need to read a video card review built around a processor that no one in their right mind can afford.

    Just my $.02
  • MrSpadge - Monday, March 15, 2010 - link

    I'm fine with using very fast CPUs for these tests. I want to know what the GPU is capable of, and I don't care how much the CPU used for testing costs - this amount of power could easily be accessible via OC or could be affordable in 1 or 2 years. And there's no point in seeing all GPUs being capped at the same fps due to a CPU limit - other than "get a faster CPU". Showing CPU limits belongs in CPU reviews.
  • CaptNKILL - Tuesday, March 16, 2010 - link

    Agreed.

    My Q9550 at 4 GHz is faster than any stock i7 (unless Hyper-Threading is fully utilized) and it only cost me $175. I don't think it's unreasonable to show what a $500 graphics card is capable of with a top-of-the-line CPU when a good CPU can be had for under $200 and overclocked beyond the speed of anything available at retail.
  • SloppyG - Monday, March 15, 2010 - link

    If they also included:
    -tools to easily compare picture quality between cards at different AA/AF levels
    -tests that might identify SLI/Crossfire setups that experience micro-stutter (if that's even possible...)
    -Automatically show performance changes with CF/SLI off and enabled at different resolutions/AA levels. While the end number presented to the user is pretty pointless, the difference between 2x AA and 32x CFAA might be interesting.

    Presenting a user with some largely irrelevant 5 digit number doesn't really benefit anyone.
  • tynopik - Monday, March 15, 2010 - link

    I know Flight Simulator X is not the newest, but it's still the top of its genre and quite demanding, especially at higher resolutions.

    Also, it gives some variety, testing the card's ability in other types of games.
  • OnTheWeb - Monday, March 15, 2010 - link

    Software like PowerDirector v8 utilizes the GPU to perform HD video encoding via CUDA or ATI Stream. This GPU-assisted encoding can take a process of hours and reduce it to minutes.

    This is an important aspect of aggressively expanding web video now that YouTube and Vimeo have begun supporting HD and Full HD formats.
  • Skiprudder - Monday, March 15, 2010 - link

    I've been pleased you've been including some World of Warcraft benchmarks in recent reviews, and I'd urge you to keep doing so for two reasons:

    1) It's just so popular: with something like 12 million current players worldwide, a hell of a lot of folks are playing it and want to know what sort of performance they can get when buying a new CPU/SSD/GPU.

    2) It places different demands on a system than most games out there. WoW isn't particularly graphically demanding by modern standards, and although the graphics will improve with the Cataclysm expansion, I'm pretty sure they'll still be on the low end of things. However, because of the MMO nature of the program, and having to manage tons of other characters around you, it places particular demands on the CPU, memory, and SSD/HDD that other games do not. This is why I think it continues to deserve a place in future benchmarks.
  • tuskers - Tuesday, March 16, 2010 - link

    I'm in full agreement with point 1, but I think you're actually wrong about point 2. WoW is plenty graphically demanding. While the "core game" might be 5-6 years old, the graphics have been refreshed with each expansion; while the engine is aging, lots of new effects are added all the time.

    While individual spell casts aren't that graphically intensive, a raid will often have more textures on-screen than anything. While an individual player model isn't the most complex, you're scaling it to cover 25-50 models at any one time with multiple textures, and that's before the environment. It's different, yes, but the load itself isn't small at all.
  • mountaingoat - Monday, March 15, 2010 - link

    I would be curious to read about the trade-offs, and see the performance of using the (cheaper) gaming GPUs for modeling and simulation software, SolidWorks for example.
  • Pessimism - Monday, March 15, 2010 - link

    Some have been mentioned; I will add a second vote for:

    -frugal gamer resolutions (1366x768 covers bottom-end LCD monitors and gamers using cheap/previous-gen 720P LCD TVs as giant monitors).

    -GPGPU: CUDA, OpenCL etc.

    -Flash-based video playback acceleration

    -2D GUI performance: Win XP, Vista/7 w/aero disabled, Vista/7 w/aero enabled,

    -2D flash game performance
  • Twsmit - Monday, March 15, 2010 - link

    Please bench at resolutions the majority of us play with. Most monitors are 1680x1050, 1920x1200, 1920x1080. How many people really have 30" monitors or triple eyefinity setups? I think super high res is great as a bonus to an article, but please continue to focus on what the rest of us have and can afford.

    I also wouldn't mind a focus on the best bang-for-the-buck CPU, for example a stock-speed i7 920 or an affordable stock-clocked i5 CPU.
  • CaptNKILL - Monday, March 15, 2010 - link

    The most important thing to me, regardless of the game, is whether the card I'm reading about is achieving playable frame rates. For example, a lot of sites these days like to run all of their games at 2560x1600 with 4xAA minimum, even if this puts the average framerates in the 20s or 30s in some games.

    It's a waste of time to even read that benchmark, because I'd never play a game with such terrible performance, especially on cutting-edge hardware. It doesn't matter how high the settings are or how awesome the graphics card is; if the performance is not playable, scrap the benchmark and lower the settings until the results become useful. A simple note of "2560x1600 with 16xAA was not playable in this game on any of the cards tested" will suffice.

    Anandtech doesn't usually have this problem anyway though.
  • pjladyfox - Monday, March 15, 2010 - link

    After looking at your criteria here are a few that would be good:

    Battlefield: Bad Company 2 - FPS - DX11 title
    Dirt 2 - Racing - DX11 title
    Dawn of War II - RTS - DX10 title

    This gives a nice spread of different titles for each genre as well as those that support DX10/11 under Vista and/or Win7 which will be important as NVIDIA launches their DX11 hardware. In the case of Dirt 2 it's already got a built-in benchmarking tool which should also make things a bit easier I would imagine.
  • vshah - Monday, March 15, 2010 - link

    I'd be happy with one or two games based on each major engine. source / tech5 / cryengine / whatever else is being used.
  • Luminair - Monday, March 15, 2010 - link

    It will be easy to forget DX10. I don't like it, but I think you need to verify that it works.

    Consider continuing to use World in Conflict as the Direct X 10 benchmark. It might provide the best diagnostics of a DX10 game.
  • SydneyBlue120d - Monday, March 15, 2010 - link

    Just some examples:

    - Flash -> Youtube, Vimeo, Hulu, Farmville;

    - Silverlight;

    - HD Video Files made via modern camcorder;

    - Once they become available, Direct2D apps (e.g. IE9, FF4) especially running HTML 5;

    - OpenCL apps once they become available;

    I would also add to the review a very old, low-budget machine - I mean something with the first PCIe 16x slots - to see if changing the GPU could be productive in such an environment. (And yes, I understand a person with a P4 631 will not buy a Fermi 480, but maybe a passive 5450 may be a good upgrade with an old SiS chipset, as you can find in many Lenovos, for example.)
  • Negative Energy - Monday, March 15, 2010 - link

    My reasoning is that EVE is one of the online games with the most players, and it has some of the best graphics for online games. It is not a DX11 game, yet, but you often get the situation where the player has 2 or more screens with multiple games running on them at high resolution.

    Mass Effect 2 would also be nice.
  • xeopherith - Monday, March 15, 2010 - link

    I don't exactly think the games matter at all. I think the focus needs to be on running multiple GPU's for comparison, multiple resolutions likely including 1280x1024 and then higher widescreen resolutions, and eliminate the CPU as a bottleneck as much as possible by using games that really push the GPU rather than CPU.

    It would be nice to have a comparison of the same GPU in both an AMD and an Intel based system. Yes, an Intel CPU will get higher performance in most games than AMD, but disregarding the CPU performance, is the card reaching its potential in each situation?
  • Targon - Monday, March 15, 2010 - link

    One thing that you almost never see is a comparison of the same video card on different motherboards. For Intel processors, you have Intel and NVIDIA chipsets, for AMD processors, you have AMD and NVIDIA chipsets available.

    Now, it is very possible that a NVIDIA video card will work better on a NVIDIA chipset for example, or that Intel may tweak things to intentionally or unintentionally hurt the performance of AMD/ATI Radeon cards. It would be nice to see more comparisons that try to expose where a given chipset is the best choice for a given video card.

    So, for Radeon cards, use a Phenom 2 X4 965 processor based system and see if the performance is better with an AMD chipset compared to NVIDIA.
  • Lockson - Monday, March 15, 2010 - link

    Two games that would be very relevant to me to be benchmarked are:

    -Supreme Commander 2
    -Dawn of War 2: Chaos Rising

    I'd prefer these, but other strategy games are also welcome of course! I just post this because I sometimes feel strategy games are somewhat underrepresented.
  • gentlearc - Monday, March 15, 2010 - link

    I think benchmarks are oftentimes misinterpreted by those new to AnandTech. I believe many people look at a review with one game in mind, and often it simply isn't possible to accommodate everyone. However, I believe it would be beneficial to investigate a standard framerate/experience you are trying to achieve. This can be a bit subjective, but I do believe that with the more popular older titles most people aren't asking whether they will be able to run the game; rather, they are asking how well it will run.

    Take the hardly ignorable WoW or Crysis. Benchmarks are just graphs without the knowledge of AnandTech there to interpret them for your new viewers. I believe you would gain a larger viewership if you dedicated a bit more narrative and explanation to these games. Contrary to the popular phrase, "but will it run Crysis," people want to know more than just that. They want to know what tangible benefits they may see from a new card.

    I think it would be beneficial to show a few major games with different options enabled to show how the playing experience is improved - options such as viewing distance, AA, shadows, and resolution.

    On that note, reviews linked to articles describing how to optimize certain games would be helpful. Many gamers have little knowledge of why you get a certain FPS and they don't. Just as you reference many of your articles in other articles, that will really help spread the education on how and why you are conducting your benchmarks as you do.

    Which leads to "optimising games." I don't believe you have to actually review a game or provide an in depth technical overview of each game. Instead, I see this as an opportunity for AnandTech to educate viewers about the many options within most games. Perhaps this is a place to encourage your forum to tackle the task of showing for each game what are the different tiers of gaming options users should select when they want to improve their gaming experience.

  • slaveII - Monday, March 15, 2010 - link


    I would suggest that benchmarking popular games with built in benchmarking tools - such as Resident Evil 5 - would be useful since it is then relatively easy for a reasonable proportion of those reading the article to judge the relative benefit of the reviewed GPU to the GPU they are currently using. Repeatability at home is key to this suggestion.

    Also, I agree that a spread of game types is desirable to cover the interests of the maximum number of readers in a given review.
  • MrJim - Monday, March 15, 2010 - link

    I would love to see a new simulator, perhaps Black Shark or Rise of Flight. Maybe not DX11, perhaps, but interesting for those of us who like sims.
  • Scali - Monday, March 15, 2010 - link

    I think that the next battle of GPUs will partly be fought on the GPGPU-front. Games are starting to use DirectCompute or OpenCL for post-processing effects, and nVidia has been using GPGPU-accelerated physics for a while...

    So I would like to see some GPGPU-related benchmarks in there. SiSoft Sandra has some, GPU Caps Viewer has some... You could probably also use the samples from the DX SDK, NVIDIA CUDA SDK and ATI Stream SDK (they contain OpenCL and DirectCompute samples as well, to keep it on a level playing field).

    If an OpenCL- or DirectCompute-based physics solution arrives (Bullet?), it will be very interesting to do some benchmarking on that as well.

    Aside from that, it will be very interesting to have some tessellation-heavy tests in there, as it seems to be the biggest difference between ATi's and nVidia's upcoming DX11 architecture, graphics-wise.
  • LordSojar - Monday, March 15, 2010 - link

    I agree with prior suggestions of adding Aliens VS Predator (2010) as one of the DX11 test beds. It has excellent use of tessellation, and certainly can stress modern cards enough to get a good gauge of performance.

    STALKER: Call of Pripyat is another great DX11 title to test, as it can really bring cards to their knees at full settings and high resolution.

    Synthetic benchmarks are useless to most users, and I disagree with adding them, as it will only distort some readers' views of actual performance. They are so dependent on system settings, tweaking, and overclocking to push figures up, they don't represent realistic performance in any way to most end users. They simply will make reviews take longer, and skew the data in some cases for most readers.
  • Overmind - Monday, March 15, 2010 - link

    AvP3, because it has tessellation.
    S.T.A.L.K.E.R. CoP - the updated X-Ray engine (with DX11).
    UT3 - old but still relevant.
    CoD: MW2, BFBC2 - two currently heavily played games.
  • Nathelion - Monday, March 15, 2010 - link

    How about Shattered Horizon? Pretty demanding if you turn the eye candy up, and built on DX10 from the ground up.
  • vectorm12 - Sunday, March 14, 2010 - link

    Now I for one would like to see the very compute intensive Football Manager 2010 make it into your suite.

    As it's more or less completely CPU-based, it's not very good for separating one GPU from another, but on the other hand it has been shown to be fairly well threaded in later versions, suggesting it would be an interesting choice when determining CPU performance.

    Other than that, I'm going to agree with most others here: no more synthetic benchmarks.
  • Baron Fel - Sunday, March 14, 2010 - link

    BFBC2
    Napoleon Total War
    Metro 2033

    All intensive, system-stressing games.

    Mass Effect 2 and TF2 would be relevant, but they don't need much power at all.
  • pensuke89 - Monday, March 15, 2010 - link

    I'd suggest that relevant and popular games be benched in feature articles. That way viewers will be able to see the performance of the game and know the minimum GPU required to play it at satisfactory settings.

    Just leave the graphically intensive games to the GPU benchmark suite.
  • pensuke89 - Sunday, March 14, 2010 - link

    3DMark should always be included in the bench. Why? Because it's free and provides a good baseline for graphics card performance.
    1) Sure, graphics card vendors optimize their drivers for those synthetic benchmarks, but if both NVIDIA and ATI optimize for it, then it's fair.
    2) It's available to download for free. If the benchmark is based on games like Crysis and someone doesn't actually own Crysis, then how do they compare their card to the benched cards?

    I second the suggestion to include some function to auto-detect the user's graphics card and display the scores for that card in the benchmark.

    It's also important to keep using a top-end CPU to avoid bottlenecking. If I want to check CPU performance, I'd go to the CPU section. If they included too many processors, that would consume a lot of their time.

    If it's possible, I would like AnandTech to let viewers download the benchmark files (so users can run them themselves using the same suite).

    Yes
    1) S.T.A.L.K.E.R.: Call of Pripyat
    2) Crysis or Crysis 2
    3) Programs that use the GPU for other tasks besides gaming (it's the GPGPU era now, so I think this should be included)
    4) DIRT2 (DX11)
    5) Metro 2033
    6) Graphic Card power usage (not whole system)
    7) Battlefield BC2
    8) Synthetic benches (read above)

    No
    1) Basically any UE3 engine games (OK, maybe keep 1 for representing the much used UE3 engine)
    2) Source games
    3) COD4MW, COD4MW2
    4) API specific programs (like Badaboom CUDA)
    5) Enabling PhysX (unless it's in a dedicated section, say, Physics GPU)
  • Dracusis - Sunday, March 14, 2010 - link

    The beauty of PC gaming is that we get a massive variety of games to play on our video cards. We get a lot of the big budget blockbuster cross-platform titles that also hit the 360/PS3 and we also get lots of nice exclusive RTS, MMOs, RPGs and heaps of Indie titles too.

    2 x MMO
    2 x FPS
    2 x RTS
    2 x RPG

    You'd want games that stress a machine, work with all features if possible (AA etc.), and you'd want to cover some of the more popular 3D engines like Unreal Engine 3, Valve's Source, etc.

    For RTS you could use Supreme Commander II, Napoleon: Total War or StarCraft II if it's released before your benchmark update, and Sins of a Solar Empire to cover the indie aspect of PC gaming.

    For RPGs you'd probably want to mix subgenres: a BioShock/Borderlands action RPG paired with Fallout/Dragon Age.

    For MMOs you'd want to stick to solid players that have been around for a while like WoW, Warhammer and Age of Conan or maybe risk a newer Cryptic game like Star Trek Online (same engine as Champions Online).

    For FPS you'd want to test both ends of the scale in terms of scope, Bad Company 2 and Left 4 Dead 2 would make good choices here.


  • Paulman - Monday, March 15, 2010 - link

    +1 agree

    Although I would weight the FPS-genre a little more heavily, since so many more people play it. Maybe at the expense of RTS and RPG benchmarks, if necessary.
  • OzoZoz - Sunday, March 14, 2010 - link

    I'd like to see more OpenGL benchmarks. As an OpenGL developer, I would like to see how stable and optimized the different GPUs are with respect to OpenGL. If there are not enough good OpenGL games, you could use some real-time simulation software, such as something based on OpenSceneGraph or Gizmo3D.

    Also, I would like to see some OpenGL driver compliance analysis (with respect to the OpenGL specification), as well as performance comparisons for precise OpenGL features that are not typically used by popular games (ex: glReadPixels, glTexSubImage, arb_multisample, complex GLSL shaders, etc.).

    In other words, I would like to see how these graphics cards (and drivers) can be used for OpenGL development of real-time, performance-oriented applications. And at the same time, push these companies into providing better OpenGL support.
  • Luminair - Monday, March 15, 2010 - link

    I endorse the recommendation of including an OpenGL game benchmark.
  • Alastayr - Sunday, March 14, 2010 - link

    I'll try and keep it short.

    Resolutions to bench:
    - 1680x1050
    - 1920x1080
    - 2560x1600

    Don't include Eyefinity / the SLI alternative yet, make it a separate one-off test. At the moment it's way too niche to be useful. Enthusiasts can already gather a lot of easily available data from various forums and other sites if they're interested.


    Games to bench:

    - keep Crysis: Warhead (just for the sake of it)

    - maybe drop Far Cry 2; the engine has had no impact on the market and it's a forgotten game already

    - keep Dragon Age: Origins; it's a bit less taxing than The Witcher but more contemporary

    - keep Dawn of War II; maybe upgrade to Chaos Rising if it has an in-built benchmark especially since DoW II was a TWIMTBP and Chaos Rising is ATi sponsored now (implications for CF?)

    - add Dawn of Discovery (Anno 1404); it's a beautiful game and it's taxing as hell

    - add Battlefield: Bad Company 2; Frostbite is finally on PC and it's here to stay (MoH MP; BF3)

    - keep Batman: Arkham Asylum; it's representative of UE3 games (Mass Effect 2, Mirror's Edge etc.)

    - consider the GTA IV - Episodes; the stand-alone addon should prove quite taxing to many GPUs

    - keep the Source Engine and go with L4D2; especially viable for low-end and mid-range

    - add DiRT 2; you guys need a racing game, the engine is the basis for the new F1 game and it's DX11

    - add Metro 2033 or Stalker: Call of Pripyat; exotic, possibly less polished engines and overall good indicator for new unusual games as they're heavily into DX11

    - consider Just Cause 2 when it comes out; it looks like a benchmark waiting to happen...


    One really nice feature that I'd appreciate would be sound clips of different cooling solutions, maybe in a YouTube video that includes talking so we'd get a good measure of relative loudness and tone. The current dB(A) measurements are not very meaningful in my opinion.

    I really love the idea of a comprehensive overview and I'm happy with the current benchmark suite but only half of the games in there seem indicative to me. And good call on excluding synthetics; I don't want this to turn into the early noughts again. Many people seem to have forgotten that time period already.
  • rjc - Sunday, March 14, 2010 - link

    Hello,

    Sorry for being late to this...

    Can AnandTech please consider improving their measurement of graphics card power usage? Have a look at this German site for an example:
    http://ht4u.net/reviews/2009/power_consumption_gra...

    Also, as well as using more power, cards are now running at higher frequencies. This raises the potential that carelessly designed cards (especially if overclocked) will start emitting excessive EMI, causing errors and/or failures in other components (i.e. they irradiate the other components in the computer). At the moment any failure of the computer is just assumed to be overheating. This needs to be studied more; sadly nobody seems to be checking this at present. If graphics cards are increasingly used for long-running computational tasks that require high accuracy (i.e. CUDA or OpenCL applications), then reliability becomes much more important.

    The above two issues fit into basic requirements; I think they should come ahead of any performance-related testing.

    Thank you for considering the above.
  • Locrian - Sunday, March 14, 2010 - link

    I'd like to see a review, or a section of a review, dedicated to testing mainstream gamer cards with professional art applications: comparing the GeForce 4xx and Radeon 58xx with some of the FirePro and Quadro pro cards. Test things like 3ds Max render times as well as general real-time viewport performance/responsiveness, and the same for Maya. Test real-time shader plugins for said programs to see if one brand handles them better than another. I dunno, maybe this is beyond AnandTech and more for a 3D-specific website, but it's something I am interested in seeing. Then there are the sculpting programs: I believe Mudbox is the one that currently uses the graphics card, while I think ZBrush still does not. Photoshop, After Effects, etc. See if they offer worthwhile value for pro visual applications vs. the ridiculously expensive pro cards.
  • yyrkoon - Sunday, March 14, 2010 - link

    Refreshing your blacklist, so the rest of us can actually read the comments without a crap load of annoying spam.

    Maybe not important to some, but I personally like reading at least some of the comments for things that I may or may not have thought about myself.
  • LyCannon - Sunday, March 14, 2010 - link

    How about using applications that use the GPU for processing, like Photoshop, Badaboom, Boinc, etc?
  • charmedmeat - Sunday, March 14, 2010 - link

    Well, I know they're few & far between, but some OpenCL benchmarks of some type (compression & cryptography come to mind) would be nice. Also, when they come out, some DirectX 11 tessellation-heavy games would be good.
  • Ramon Zarat - Sunday, March 14, 2010 - link

    Hi.

    AnandTech already focuses on the essentials with a very good strategy. Only 3 resolutions that reflect the power of the card and only 2 sets of resolutions: high end and lower end. AA is only used when necessary. It's simple, efficient and covers over 90% of real-life scenarios.

    I want to show why AnandTech's strategy is so spot on, so let's refer to the Steam data, which is probably the best reference source you can get:
    http://store.steampowered.com/hwsurvey/

    Let's start with resolution. The following list represents 70.45% of all resolutions in use:

    1024 X 768 @ 12.93% (Very surprising!)
    1280 x 1024 @ 19.31% (THE most used resolution)
    1440 x 900 @ 10.75%
    1680 x 1050 @ 18.43% (Second most used)
    1920 x 1080 @ 9.03% (The highest resolution used by a SIGNIFICANT amount of users)

    Resolution below 10%, but higher than 5%:

    1280 x 800 @ 6.61%
    1920 x 1200 @ 5.86%

    Progressing and receding:

    All resolutions at or below 1280 x 1024 are receding. The biggest drop is 1024 x 768. All resolutions at or above 1680 x 1050 are progressing. The highest gain is 1680 x 1050.

    Now we can see why AnandTech has chosen those 2 sets of resolutions:

    Low end set:

    1024 X 768 NO AA
    1280 x 1024 With 4X AA
    1680 X 1050 With 4X AA

    High end set:

    1680 X 1050 with 4X AA
    1920 X 1200 with 4X AA
    2560 X 1600 with 4X AA

    I wouldn't change a thing. This two-set strategy is perfect. 1680 x 1050 appears in both sets and will be the most used resolution of 2010-2011, taking over from the receding 1280 x 1024.

    Now moving on to the test platform. Here, I have some comments to make. Let's see what Steam has to say first:

    69.06% of players use Intel, so using Intel for the platform is right.

    Intel speed:

    1.7 to 2.29GHz = 16.53% (combined 1.7 to 1.99 and 2.0 to 2.29); median speed is 2.0GHz
    2.3 to 2.69GHz = 26.72%; median speed is 2.5GHz
    2.7 to 3.29GHz = 19.51% (combined 2.7 to 2.99 and 3.0 to 3.29); median speed is 3.0GHz

    Intel core:

    1 core = 17.97%
    2 core = 55.37%
    4 core = 25.62%

    HyperThreading:

    11.27%

    CPU/platform conclusion:

    In light of this data, I think AnandTech should use 2 platforms, 1 for each resolution set: a dual core running at 2.5GHz with HT off for the low-end resolution set, and the current 3.33GHz quad core kept for the high-resolution set.

    It's either that, or test both sets with an overclocked quad running at 4.2GHz (the highest speed on air) and RAM at 2600MHz to eliminate all platform bottlenecks and isolate GPU performance.

    Which approach is best? With Crossfire / SLI / dual-GPU card solutions, it's clear that they are CPU limited in some scenarios. On the other hand, only a minority overclock their quads to 4.2GHz. Maybe then provide only 1 result @ 4.2GHz @ 2560 x 1600, and only with multiple GPUs?

    Now let's talk about the games. There are 2 approaches: the best selling or the best rated. I would consider both and make a top 10 out of that.

    Top rated:

    http://www.gamespot.com/games.html?type=top_rated&...

    AND

    http://www.ign.com/index/top-reviewed.html?&so...


    Top sale:

    http://kotaku.com/5486293/pc-sales-charts-bringin-...

    AND

    http://www.amazon.com/gp/bestsellers/videogames/22...


    What I would like to see added is a DirectCompute and, more importantly, an OpenCL GPGPU benchmark. SiSoftware's synthetic OpenCL benchmark would be a nice start. I've also found a little DirectCompute benchmark HERE: http://www.ngohq.com/attachments/graphic-cards/250...

    If you insist on non-synthetic GPGPU benchmarks, then try the Folding@home GPU application that runs on both ATI and NVIDIA: http://folding.stanford.edu/English/DownloadWinOth...


    That's it! :)

    Ramon
  • Nimiz99 - Monday, March 15, 2010 - link

    I have to agree with the OP on this: 3 resolutions and no more. I'd even argue for only 2 resolutions, because two things happen:
    1) Running a game that leans on memory as it fills the GPU's RAM
    2) Running a game that pushes the limits of what the chip itself can do

    Higher resolutions generally focus on (1) and lower resolutions focus on (2)... so why bother having any more than that? (Granted, the cutoff point will vary, but having 1280x1024 and a higher one like 1920x1080 or 2560x1600 could accomplish that.)

    Earlier in the thread there was also the argument for more RTS games...most of the gaming I do is RTS/RTT and FPS. From that perspective I'd like to see Starcraft 2 upon release (and I know it doesn't push Graphics as hard as it could) and then any FPS that pushes graphical boundaries.

    thanks and keep up the good work

    Finally, I still like to see synthetics... I enjoy benchmarking, and although I won't buy a card based on a 3DMark score, I still like to see it. So please keep at least 3DMark in y'all's set-up.
  • Jamahl - Monday, March 15, 2010 - link

    69% of Steam gamers use Intel, but most of them are dual-core Pentiums. Should those be used for benchmarking too?

    i7's should not be getting used for gaming, that much is clear.
  • primagen - Sunday, March 14, 2010 - link

    I would like to see the following: Shattered Horizon [a very good looking PC-exclusive game], Stalker: COP, AvP [DX11], Battlefield [very popular - DX11, 3D/Eyefinity], the newest Total War game [Napoleon?]. That about sums up my desire for new games. I would, however, like to see older card/CPU comparisons as well. I do not have an i7, so I would like to know what I can expect out of my CPU coupled with my GPU.
  • deathbycomputer - Sunday, March 14, 2010 - link

    I had a lot of trouble finding CoD: MW2 performance reviews. It didn't make sense to me how a game as popular as that one did not get the attention of hardware testers; it seems like only games that are demanding of high-end hardware get used in reviews, which makes sense of course. However, other popular/mainstream games that virtually everyone plays are not included in the Gaming Performance section of a CPU and/or GPU review.
  • RoseRendeR - Sunday, March 14, 2010 - link

    I dunno why ya just don't test da games in da "Top Rated PC Games"-list compiled by Gamespot?
    Since they're the top rated PC games, OBVIOUSLY!
    I mean, that's da way "I" would work!
    Go figure!

    Then again, I'm just lil' ol' me! YOU're da pro's babe!

    RoseRendeR
  • SinxarKnights - Sunday, March 14, 2010 - link

    Synthetics are important to have an easily comparable benchmark.
  • thurston - Sunday, March 14, 2010 - link

    I would like to see some benchmarks that are not games. I think you should also benchmark some video applications that use gpus.
  • Barneyk - Monday, March 15, 2010 - link

    I agree a lot with this!
  • Patrick Wolf - Sunday, March 14, 2010 - link

    I'm pretty happy with your current line-up, but yeah, you should probably get some newer games in there - though only if they're more demanding than an older benched game.

    I want to see:

    - GTAIV because it's.. GTAIV. I know after all the patches it's still probably not optimized for PC, but it can't be too bad now. And, it's GTAIV!

    - The Witcher Enhanced Edition cause it's pretty demanding I think.

    And whatever happened to the mystery of better performance when ATI cards are running with Intel CPUs and when nVidia cards are running with AMD CPUs (or something like that)? Has this been solved? If not, it's something I'd like to see tested.
  • MikeSz - Sunday, March 14, 2010 - link

    Metro 2033 - the first game that seems to really push DX11. Also, the "optimal" requirements posted on Steam are simply insanely high, so the game has a chance to stay in the suite for years to come

    Shattered Horizon - the first and only name I can remember that was written purely for DX10 and requires a DX10 card. While I don't think well of the game, it should provide a good benchmark of DX10 cards

    PS my first post :) long time reader
  • Troll Trolling - Sunday, March 14, 2010 - link

    I would like to see Team Fortress 2, at least on the low to mid-range end.
    Anno 1404 would also be a nice game to add.

    One other thing: I think the reviews should have the min FPS, and some kind of FPS stability graph.
  • MuParadigm - Sunday, March 14, 2010 - link

    Tom's Hardware recently had an article where they tested 2D performance on recent GPUs and found that it was pretty abysmal.

    I'm not sure what apps outside of a benchmark you could use for testing, but it might help keep AMD and NVidia on their toes with up-to-date 2D implementations in their device drivers if they knew it would be tested.

    So I'm recommending the inclusion of some old-fashioned 2D testing.

    .
  • MuParadigm - Sunday, March 14, 2010 - link

    I see that GrizzledYoungMan got there first. Consider the above post seconding his points.

    .
  • JNo - Sunday, March 14, 2010 - link


    For all mainstream / high-end cards, a MINIMUM resolution of 1680x1050.
    I hate seeing reviews of 295s and 5870s etc. with fps graphs for 1280x1024, as if anyone who could afford those cards would use a cheap 5:4 monitor...

    I also second a value graph / table, i.e. £/fps or $/fps, for you guys.
    For an example, see www.hexus.net
    They 'normalise' the numbers a bit first too, e.g. for fps above 60 they halve the additional fps as, well, they're half as useful above that sort of level (e.g. 200fps vs 100fps in L4D is not nearly as useful as 40fps vs 20fps in Crysis. In the first example both 200 & 100fps are highly playable; in the second, 40fps is much more so than 20fps).
    And then, for icing, they also provide value tables for overclocked results too.

    Just my 2p...
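    Roughly how that sort of normalisation works, as a sketch; the 60fps threshold, the halving factor and the prices are just the example values from above, not hexus' actual method:

    ```python
    # Frames beyond 60 fps count at half weight before computing value per currency unit.
    def normalised_fps(fps, threshold=60.0, excess_weight=0.5):
        """Weight fps above the threshold at half value, since they matter less."""
        if fps <= threshold:
            return fps
        return threshold + (fps - threshold) * excess_weight

    def value_score(fps, price):
        """Normalised frames per currency unit (fps per pound/dollar)."""
        return normalised_fps(fps) / price

    # Example: 200 fps in L4D vs 40 fps in Crysis on a hypothetical 300-unit card.
    print(value_score(200, 300))  # 0.433... (60 + 140*0.5 = 130, then 130/300)
    print(value_score(40, 300))   # 0.133...
    ```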
  • exostrife - Sunday, March 14, 2010 - link

    The change I'd like to see has less to do with specific games, and more to do with how people use your reviews.

    Having 6 games reviewed at 2560 or even higher is pretty pointless. Budget and mid-range cards far outnumber the $600 king cards, and people who are buying mid-range or budget cards probably aren't running huge monitors, so the higher resolutions are less meaningful.

    I'd also suggest trying to keep a mix of new and old games in your reviews, for two reasons: games on PCs have a lot of replay value and their engines form the core of multiple games, and it also helps create a point of comparison when looking at older reviews. I raise this point because many of us are reading reviews of new graphics cards because we are looking to upgrade our older card. If we can compare old reviews of our card, with some of the same games/resolutions, to the new card, then that's a bonus.

    The other thought I'd have is to try to keep a good mid-range card from each manufacturer from the last 1 or 2 generations in your reviews. This would also help the many of us considering an upgrade see how our current card (or at least a representative of the line) stacks up.
  • Nickel020 - Sunday, March 14, 2010 - link

    I would love to see more information on how exactly you benchmark. While I generally trust Anandtech to use meaningful scenarios that somewhat reflect real game performance, you only provide very little info on how you get your data. Or when you do, it's often hard to find, as you only mention it in one article.
    Looking at what other sites use as benchmarks, I know that many of them are using timedemos that have about as much meaning as 3DMark, as they do not reflect actual in-game performance.

    If you say that you don't want to use synthetic benchmarks as they're "meaningless" then you should explain how you benchmark and why your numbers really do reflect actual in-game performance. More transparency is the way to go here imho.

    Maybe you could do a "How we test" article that explains your test methodology.

    Also, it's not always possible to choose a "worst case" scenario as a benchmark, so it would be nice if you said how many FPS are actually necessary for stutter-free gameplay. This is, for example, necessary to give your WoW numbers meaning, as right now we don't know how a system that gets, say, 80 FPS in your test performs under non-repeatable raid circumstances.
    The goal of benchmarks should be to assess relative performance and absolute performance (i.e. how much performance do I need to play X at details Y?), and your articles often lack a bit in the absolute performance part. It's relatively easy to answer "What GPU should I buy for $200?" after reading your reviews, but you can't really be sure about how much you need to spend to get the performance you want.
  • arnavvdesai - Sunday, March 14, 2010 - link

    I am not sure if you have seen the project out of Microsoft Labs called Pivot. It's a way of seeing your data in different ways. If all of your data were compiled together it would be very beneficial to folks. For instance, just adding your normal GPU reviews along with your chipset, processor, etc. reviews would allow someone to combine all of them together in meaningful ways.
    For instance, it could allow someone to build a computer out of the various parts you review and see how it would perform under the various scenarios you test against. This would mean someone could take a combination of various parts (some old, some new) and see exactly how their computer would perform. I might sound like a shill, but I seriously suggest you look at this technology. It's free to download at the moment, I think; I am sure if you Google Pivot + Microsoft Labs
    you would be able to check it out for yourself. I know this has got nothing to do with a new test bed, just a general suggestion on data presentation. I think you should add Metro 2033 when it comes out, as it has DX11 and is supposed to run on its own engine. Also, CryEngine 3 will be out and you should add that as soon as it's available.
  • Nighteye2 - Sunday, March 14, 2010 - link

    I think starcraft 2 would be a good addition, as many people will be playing it.

    Also, I would like to see some synthetic benchmarks included - not their overall scores, but their scores on certain functions. It would help explain differences in performance between cards across games. It helps to know why a card performs the way it does in certain games, and from that you can better predict performance in newer games.

    Also, I'd like to see scores for medium graphics settings - the ones people buying mainstream cards will likely be playing at. Some like all settings at high, but with low AA and AF and a resolution of 1280*720 - which looks excellent on a 1920*1080 monitor but requires a lot less graphics horsepower.
  • wxmanunr - Sunday, March 14, 2010 - link

    I'd love to see CPU Scaling results using several CPUs. A bargain one, mid-range, premium, and no holds barred.
  • Paulman - Monday, March 15, 2010 - link

    +1
  • GrizzledYoungMan - Sunday, March 14, 2010 - link

    I was recently burned pretty badly with my purchase of an ATI Radeon 5770, as part of a new Windows 7 build.

    To make a long story short, ATI's 5000 series is actually MANY TIMES SLOWER than most pre-2004 GPUs in 2D applications that rely heavily on 2D drawing, like Photoshop, AutoCAD and Illustrator. In short, programs I and many others use to make a living.

    Evidently a fix is available, but ATI isn't going to build it into their drivers till Catalyst 10.4 or 10.5.... nearly a full year after the release of the 5000 series. Utterly unacceptable.

    I think that part of the reason for this gimpy performance is the excessive focus on framerates and 3D performance in enthusiast sites. The ATI 5000 series was designed to win benchmarks, and it does. But some of us need to also us our PCs to work. And in that case, your options are to either buy a $1000 professional GPU that basically uses last years mid-range chip... or nothing.
  • dubyadubya - Sunday, March 14, 2010 - link

    +1. 2D performance has suffered for years. My old 9800 Pro would wipe the floor with any newer GPUs I have tested. ATI 3xxx series and up are so slow in 2D you can see it. Nvidia cards were much better until the 1xx.xx and higher drivers came out; Nvidia borked something and it's still borked to this day. Now we have Windows 7 with its new graphical subsystem. Sure, it uses less system memory, but 2D hardware acceleration is nearly a thing of the past. Some good reading here: http://www.passmark.com/support/performancetest/2d...
    The trouble with testing 2D performance is the lack of a real-world way of doing it. I use and support PassMark PerformanceTest even though it's synthetic. It would be nice to see 2D results on all cards tested. 90%+ of a PC's time is spent in 2D mode, so 2D is important! http://www.passmark.com/products/pt.htm
  • GrizzledYoungMan - Sunday, March 14, 2010 - link

    Exactly - it's where we spend 90% of our time on PCs, and yet...

    To give all you users an idea of what we're talking about here: if I try to select a bunch of objects in Illustrator, or move a complex shape in AutoCAD LT, my system slows to a halt as redrawing these elements takes forever. With my ancient 7800GTX, this worked beautifully - smooth and crisp and responsive.

    Nuts, I just made myself nostalgic for that 7800GTX. Single slot cooling, silent operation, drivers that WORKED, 4xFSAA at 1600x1200. It was like they had found the Terminator's head or something.

    But I digress. Point is: it might not even be necessary to design a test suite that accurately benches 2D performance, because that's not really what we're after. Rather, we just want a smooth user experience. So a single paragraph describing the subjective experience of those kinds of operations in Illustrator and AutoCAD relative to a GPU that is known to do it well (the 9800 series isn't a bad choice) might be good enough.
  • GrizzledYoungMan - Sunday, March 14, 2010 - link

    EDIT: But some of us need to also us our PCs to work. = But some of us also need to use our PCs to work.

    Evidently a hangover and righteous indignation is a bad combination.
  • superxero044 - Sunday, March 14, 2010 - link

    It's one of the best-looking games around
  • Barneyk - Sunday, March 14, 2010 - link

    I really wanna see old games tested more, especially for mid- and low-range cards.
    I would also like to see a combination graph where you add all the numbers together, or some other sort of graphic that shows the combined results.
    It's not really what you're asking for, but those are things I wanna see. :)

    And also, when it comes to CPU testing, I really feel that the lack of multitasking testing is hurting the overall picture I get.
  • f0d - Sunday, March 14, 2010 - link

    I believe Bad Company 2 would make a great addition to the next GPU test suite

    I would also like to see different CPUs included, not just different clock speeds on the same CPU design,
    like:

    i7 960
    i7 870
    q9550
    x4 965
    x2 255
    e8600

    but that may be more to do with a CPU benchmark

    Maybe a game test suite is what we need? Test a single game over multiple resolutions with multiple video cards and multiple CPUs.
    Maybe it's just me, but I would like to see how the different hardware scales, especially at higher resolutions.

    I've been so fascinated by this since I saw it: http://techreport.com/discussions.x/18095
  • Paulman - Monday, March 15, 2010 - link

    +1 agree

    Although, to be realistic, I'm sure Anandtech would have to carefully choose how many additional CPUs they'd want to test, and for which specific games and GPUs they'd test them, since the possible combinations for testing can easily skyrocket.

    But it would be very valuable if Anandtech tried - especially so we can see the impact of dual-, tri-, and quad-core scaling with particular games and at particular GPU performance levels.
  • deathbycomputer - Sunday, March 14, 2010 - link

    Ah. My thoughts exactly. Different CPUs should be used because not everyone has an i7 975. A lot of the results in GPU benchmarks don't reflect what the consumer will realistically achieve.
  • Rebel44 - Sunday, March 14, 2010 - link

    New AvP
    S.T.A.L.K.E.R.: Call of Pripyat

    Please show us graphs of FPS over time - just like HardOCP.
  • StormyParis - Sunday, March 14, 2010 - link

    Specifically, most tests answer the question "Which card should I buy?", but not the more fundamental question: "Should I upgrade now, and to what?".

    So I'd trade a handful of game tests for a handful of older cards in the tests, and a RAM / CPU impact test.
  • TGressus - Sunday, March 14, 2010 - link

    As others have surmised, average frame rate isn't even half the story. Other sites already use a line graph of their recorded frame rate dataset, and it does give a great overview of that particular benchmark. The problem is that more than 2 cards on one chart becomes a mess to read.

    You could improve upon the idea by using something similar to the Frequency function in Excel and graph the number of times frame rate was recorded at 10fps increments. This would clean up the graph and still reveal trends like, "Average frame rate is 84, but you will be spending most of your firefights 10-20 FPS south of that."

    Example:
    http://img229.imageshack.us/img229/3137/frequencyi...

    At the very least, make sure to report both minimum and MEDIAN frame rate if you have the data.
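    A rough sketch of the binning I mean; the frame rate trace and bin width below are made up purely for illustration:

    ```python
    # Bin a per-second frame rate trace into 10 fps buckets, like Excel's FREQUENCY().
    from collections import Counter

    def fps_histogram(fps_samples, bin_width=10):
        """Count how many samples fall into each bin_width-fps bucket."""
        counts = Counter((int(fps) // bin_width) * bin_width for fps in fps_samples)
        return dict(sorted(counts.items()))

    trace = [84, 88, 91, 62, 65, 70, 68, 72, 85, 90, 66, 64]  # hypothetical trace
    print(sum(trace) / len(trace))   # ~75.4 average, which hides the dips
    print(fps_histogram(trace))      # {60: 5, 70: 2, 80: 3, 90: 2} shows where the time is really spent
    ```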
  • makin moneys - Sunday, March 14, 2010 - link

    Bad Company 2 - It's a new, high production value FPS, many have voted for this already

    Supreme Commander 2 - Should provide a good RTS benchmark; Blizzard is notorious for making their games able to run on low-end hardware, so I think this would be a better benchmark choice than StarCraft 2

    Batman: Arkham Asylum - Another hardware taxing game, like BC2

    Bioshock 2 - A new, impressive looking RPG/FPS

    (when it's out) Crysis 2 - Needs no explanation

    I also roll my eyes whenever I see MMO benchmarks. There are way too many variables to get any reliable runs. Same for other games that don't have a benchmark built in, but they aren't as unpredictable as MMOs.
  • Ryan Smith - Wednesday, March 17, 2010 - link

    I actually had a chance to talk to the GPG guys at GDC last week. At this point we're not planning on using SupCom2; they told us straight-up that it's not going to be graphically intensive (so they're pulling a Blizzard here).
  • Dzban - Sunday, March 14, 2010 - link

    Since when is Batman hardware taxing? It runs great on whatever GPU you have. In full HD it plays great even on a 9600GT. It's a waste of time to test it on newer GPUs.
  • Zappcatt - Sunday, March 14, 2010 - link

    Batman AA not taxing?
    Let me guess, you run AMD?

    I have a 9800GT SLI setup, and I get a minimum framerate of 1 if I enable PhysX.

    Dark Void is similar: yes, if you disable a major feature it will fly, but if you want to see it "in all its glory" it can CRUSH current hardware.
  • Mr Alpha - Sunday, March 14, 2010 - link

    How about Metro 2033? From the screenshots I've seen it has some really fancy graphics. Might even give Crysis a run for its money as the king of graphics.
  • krumme - Sunday, March 14, 2010 - link

    Ryan, total respect for the non-synthetic benchmarks, and even more respect for involving your customers. Let's have more of that. That means more to me than the reviews themselves.

    Is the new methodology going to be used for the Fermi launch?

    If so:
    Don't you find it problematic? And if so, how?
    You must know a lot of people speculated that you would alter your benchmark suite when Fermi was coming. Controlling the anticipation like you do here does not change that. Knowing we anticipated this situation, how do you want us to interpret what is happening?

    I think we had the same problem with the Intel SSD reviews; fortunately it ended with your own excellent benchmark suite, which you made yourself, instead of 4K random writes all over. Happy ending there. A great benefit for your customers. I hope you will get there too with your graphics suite.

    But changing it right now is highly problematic, because the comparison to earlier benchmarks is weak. I think Fermi should stand by itself, like the SSDs should.
  • Ryan Smith - Wednesday, March 17, 2010 - link

    Why now?

    My goal as GPU editor is to refresh our benchmark suite roughly every 6 months. It's been made very clear to us that you guys like to see new games used, and this gives us the opportunity to rotate those games in.

    This is a particularly critical point since our last refresh was for the Evergreen launch, which means we don't have any DX11 games in our suite. With NVIDIA soon to ship DX11 cards, we finally will be able to do some DX11 performance comparisons, so we're going to go ahead and refresh our suite.

    We're not going to throw everything out, and every card is going to have to "stand by itself".
  • Earthmonger - Sunday, March 14, 2010 - link

    I think I'd drop from the list:
    - Any game optimized for the console market, which is hardware limited.
    - Any MMO; games that rely on data transmission beyond the local machine are prone to too many variables to be considered a trustworthy test of performance.

    I'd like to see more serious simulation games, or RPGs with realistic environmental effects.
  • LeeF - Sunday, March 14, 2010 - link

    For GPU and CPU both. Bioshock 2 is the game that's inducing me to upgrade my computer, and it'd be nice to see an Unreal-engine game in your suite.

    I just beat Bioshock 1, and it was like playing Myst on my C2D E6750/Geforce GTS 250. It's the first game I've ever played on this machine that hasn't run silky smooth at 1920x1080. So I'm definitely waiting till I upgrade before I try the sequel.

    Also, some Distributed Computing benchmarks like RC5-72 or Folding@Home might be nice, if you can find a release client that runs on both Stream and CUDA. Your site still has a bit of a DC crowd (though not nearly as much as Ars Technica, judging by your OGR-27 ranking) ;) , and it would also be a good indicator of GPGPU performance.
  • samspqr - Sunday, March 14, 2010 - link

    Hi

    You already use 3ds Max, Cinebench and POV-Ray in your CPU reviews. I know that in older versions the GPU didn't make a difference for those rendering times, but that may have changed: could you check whether GPU power makes any difference in the versions you're currently using? If it does, I definitely want to see those included

    (I know you're supposed to use Quadros and FirePros when working with these apps, but the apps themselves have changed, and it's not really necessary anymore)
  • bdunosk - Sunday, March 14, 2010 - link

    WoW... regardless of it being an older game built on older technology, the player base is gigantic. I'd wager that a lot of us wonder if that newly released GPU will make any difference during peak hours in the major cities or with all the settings cranked in a 25-man raid boss fight. Though I'll also state the obvious that from what I've been able to piece together, the game is much more CPU limited than GPU.
  • samspqr - Sunday, March 14, 2010 - link

    not everybody plays games

    actually, by your reasoning, they should only test Word: nearly everybody uses that
  • samspqr - Sunday, March 14, 2010 - link

    (forget it: now that I re-read it, I don't even understand your post)
  • Paulman - Monday, March 15, 2010 - link

    I agree with bdunosk! WoW has the largest player base in the world of any game. Yes, it's typically more CPU limited than GPU limited, so benchmarks might be more applicable to mid-range and lower-end cards; perhaps WoW benches should only be featured in those performance classes of GPU reviews. However, I think benchmarking something more intensive like boss raids or Eyefinity setups (probably a lot more repeatable than boss raids) at high resolutions and high AA could present a more GPU-limited scenario that WoW players would be interested in.

    I would also like to take this opportunity to reiterate that for performance/mid-range to lower-end cards, it would be BRILLIANT if you also paired them up with mid- to low-end CPUs, since that is a more realistic user rig than pairing a Core i7 Extreme with a Radeon HD 5650 or even a 5770, for example.

    Thanks for listening!
  • moep - Sunday, March 14, 2010 - link

    Call of Duty: Modern Warfare 2 (FPS - everyone seems to play this)
    Team Fortress 2 (FPS - Source engine)
    Batman: Arkham Asylum (Action/Adventure - UE3 engine)
    Starcraft 2 beta (the RTS - bench using a taxing reference replay, use same beta version until release and then switch)
    DiRT 2 (Racing - DX11, tessellation)
    Crysis - Warhead (FPS - …but will it run crysis?)

  • faxon - Sunday, March 14, 2010 - link

    I'm going to start off by posting a link to Tom's Hardware, since they did a review of the game's performance which I found rather puzzling: http://www.tomshardware.com/reviews/star-trek-onli... . In this review, NVIDIA cards literally walked all over ATI cards (it's a TWIMTBP game, btw), but the general overall performance in the game is comparable to Crysis performance across the GPU bench test. The game is definitely heavily GPU limited, and I want to see what AnandTech's expert testers can do to pound out how the game performs. It would be interesting to see separate test sets for space and ground combat as well, since I'm sure the two perform differently (ground combat is definitely laggier than space combat).
  • faxon - Sunday, March 14, 2010 - link

    Ooh, I almost forgot to add: you need to post max, avg, and min FPS for each benchmark. I base my card choice on min, avg, and % min FPS, since I want it to always perform at a minimum of a certain level in my games for framerate stability. I would also be interested in seeing each video card tested on both an AMD and an Intel CPU based platform, as I know it has been proven here and elsewhere that some games behave differently with different CPU/GPU combinations, even in games that weren't CPU limited. It would also be good to test with different speeds of CPU within a certain generation to give a baseline for comparison, since not everyone owns an EE i7 CPU. A lot of the users out there are using dual cores, and while I'm not saying that you should test with them, I'm saying that an i7 Extreme isn't exactly a real-world representation of what most users are playing with. Most users are still playing on stock-clocked dual cores or cheap quads, so if you really wanted real-world data you would at least toss in an AMD and an Intel bench, and at least 1 underperforming CPU, so people can see how their CPU can affect their gaming experience. If anything, at least include the data in the AnandTech Bench at a later date, with a link to Bench in the article. This would get more users to use Bench when looking for information on products, and it would be extremely useful for everyone to make recommendations based on an individual user's level of hardware.
  • iwodo - Sunday, March 14, 2010 - link

    Split GPUs into 5 categories: Enthusiast, Performance, Mainstream, Entry, and IGP.

    Every GPU review would have a representative (or average) from the other categories present. In cases where one would dominate the graph, just mention its results. And other than raw numbers, percentages would be nice as well.

    So in a benchmark I could see, for example, this Performance GPU at 100% and the IGP at 10%, meaning the Performance GPU is 10 times faster than the IGP in general.

    Other than games, a 2D benchmark would be nice as well, along with performance for Windows Aero acceleration, CAD, rendering, OpenCL, Photoshop, etc...

    All in all, I don't think there is anything we need to add to the tests. We just need a different way of presenting the data to consumers and letting them easily find the answer they want.
  • derrida - Sunday, March 14, 2010 - link

    ...because gpus are not only for gaming.
  • Ryan Smith - Sunday, March 14, 2010 - link

    GPGPU benchmarks are something we want to do, but don't expect to see any kind of comprehensive benchmark for it in the near future. We've been poking in to this one looking for an OpenCL/DirectCompute benchmark and thus far have come up empty handed.

    There's a somewhat wider range of programs that can use Stream or CUDA, but that means we have to throw out cross-vendor comparisons. So if we used such benchmarks, it would be sparingly.
  • derrida - Sunday, March 14, 2010 - link

    Have not tried it but it could be a starting point:
    http://www.sisoftware.net/?d=news&f=opencl_rel...

    For me the choice of a new GPU depends solely on the available FLOPS [single|double], as I work on HPC. Don't know if I am a minority here.
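    For reference, the back-of-the-envelope peak-FLOPS arithmetic I mean, sketched below with the commonly quoted HD 5870 specs (1600 ALUs at 850MHz); the figures and the 1/5 double-precision rate are illustrative only and vary by architecture:

    ```python
    def peak_gflops(shader_count, clock_mhz, flops_per_clock=2):
        """ALUs x clock x (typically 2 for a fused multiply-add) in GFLOPS."""
        return shader_count * clock_mhz * flops_per_clock / 1000.0

    sp = peak_gflops(1600, 850)   # ~2720 GFLOPS single precision
    dp = sp / 5                   # ~544 GFLOPS double precision at a 1/5 SP rate
    print(sp, dp)
    ```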
  • falc0ne - Monday, March 15, 2010 - link

    agree with no synthetic benchmarks and pointing out CPU/GPU bottlenecks where they appear
    regards
  • Holly - Sunday, March 14, 2010 - link

    I'd like to see FRAPS-based benchmarks dropped. There are way too many "random" differences brought into one run compared to another run of the test if the game is not running in its own benchmark mode (i.e. always exactly the same scene).
  • ottawanker - Sunday, March 14, 2010 - link

    Racing games are the only games I play on PC, and so they're the only ones that factor into my choice when buying a card. Often reviews don't have any racing games, or if they do it's something from EA with extra motion blur and unrealistic effects.

    I understand that often racing games don't take up as many resources as some of the FPS, but often cards that are great at FPS seem to be crappy at racing games.

    At the moment DiRT 2 would be the obvious choice for a racing benchmark, as it's relatively recent and supports DirectX 11.
  • Computer Bottleneck - Sunday, March 14, 2010 - link

    I would also like to see minimum frame rates measured.
  • arpitnathany - Sunday, March 14, 2010 - link

    Please try to include more RPGs and simulation games along with FPSs.

    Also, a high-end i7 does not represent the real performance the majority are likely to get. I understand this has to be done in fairness to the cards, so as to bring out the max they can deliver, but there's no point if we, the average users, do not realize that benefit. Try using something upper-midrange like an i5 750 or one of the low-end i7s.
  • Paulman - Sunday, March 14, 2010 - link

    #1. I agree. I would especially like to see MMORPGs tested, because some (like WoW) have huge user bases. Even though I don't play WoW, I'm curious because I know player base = potential marketability of the card, so it's important. Someone's suggestion that you try something taxing (like a 25-person boss raid, or maybe a 3-screen Eyefinity-like setup) with WoW is a good one, since it's less GPU demanding.

    #2. Bad Company 2, because it's very popular and adequately GPU intensive.

    #3. PLEASE include benches with an Athlon II X4 or Core i3 class CPU, or at least a mid-range CPU if possible. Theoretical GPU performance isn't always as useful as anticipating the actual performance you'd see when a mid-range GPU is paired up with a low/mid-range quad core CPU, which is a much more realistic rig - especially when most games are GPU limited, particularly with AA and at today's widescreen monitor resolutions. For example, you might see a 20% FPS difference at 1080p 4xAA in Crysis Warhead, let's say, between an Athlon II X4 and a Core i7 depending on the video card, but potentially a 100% difference if you changed the video card. So lower-end CPU configs are valid to pair up with many GPUs, but it would be good to see if there is a significant performance hit, and in which games.

    #4. Lower resolutions and/or no AA would be very useful as well, especially for mid-range and below cards.

    P.S. And Mass Effect 2, just because it's so cool :P EVE-Online, for the same reason. Lol.
  • slickr - Friday, March 19, 2010 - link

    I agree with this post. Having a mid-range CPU to test the latest graphics cards would be great, as it would show just how much (or how little) difference it makes.

    Maybe even a completely new medium system. It's one thing to see an i7 980X ultra-turbo-extra at 4GHz with 6GB of 2200MHz RAM and a GTX 480, but it's a whole different story when you put in a Phenom II X4 955BE or Core i3 CPU with 2GB of RAM and then test from a GTX 480 down to an 8800GT.

    Simply put, only 1% of people have systems like the ones you test games on, so a medium system would be much more real-world and give more realistic results people can identify with.

    I would also like benchmarks to be run on a bare operating system, meaning no software (antivirus, Winamp, firewall, Skype, MSN, etc...) other than the required drivers, DirectX and the game itself.
  • Paulman - Saturday, March 20, 2010 - link

    I agree with you guys agreeing with me, even as I had agreed with the OP.
  • pepe2004x - Tuesday, March 16, 2010 - link

    Totally agree with number 3. I have spent A LOT updating a nice PC to make it the best I could for gaming (going from an Athlon LE-1640, excellent for general use, to an Athlon II X4 620; from 2 gigs of RAM to 3; from a 4850 to a 5850; getting an Xbox 360 controller; changing the mobo to one with a decent sound chip since I couldn't put a sound card near the GPU; etc, etc) and I just don't want to start thinking about getting an i5 or Phenom and spending more. So, putting average CPUs in the benchmark would be cool. :S

    And yeah, I'm a little angry about having spent a lot on PC gaming :(
  • Computer Bottleneck - Sunday, March 14, 2010 - link

    I would like to see triple monitor resolutions tested.

    Unfortunately this would necessitate Anandtech getting two Fermis from Nvidia.

    Oh well, maybe at a later date this could be accomplished?
  • piesquared - Sunday, March 14, 2010 - link

    I agree, we definitely need multi-monitor benchmarks, even if 2 NVIDIA cards are not available. Eyefinity is on the market and is definitely not a gimmick, so it should be tested. Just like Intel's Turbo mode is constantly used because it is in hardware, the situation with Eyefinity is identical. And there's no doubt AT can afford 2 more monitors, so there's no excuse there. Benchmarking shouldn't just be about comparing competitors' hardware.
  • dgthug - Monday, March 15, 2010 - link

    Eyefinity and NVIDIA's multi-monitor solution would be beneficial for everyone (mainly enthusiasts, though), also in multi-GPU set-ups.

    Initially and periodically, it would be beneficial to run multiple tests on multi-GPU configurations to test the effect different port utilization has on framerate.

    I believe two standard configurations should be tested. One consisting of 3 monitors at 1280x1024 (for people using older displays) and then another 3 at 1920x1080.
  • philosofa - Sunday, March 14, 2010 - link

    Good going on the no synthetics policy. I'd suggest BC2 as it's a popular DX11 (ish) game with high demands, Empire: Total War or Napoleon as it's also a fairly demanding game but isn't an FPS.

    I would suggest more but I'm too knackered to put together another coherent sentence... night :)
  • tomppi - Sunday, March 14, 2010 - link

    Bring back Age of Conan please?
  • TonkaTuff - Sunday, March 14, 2010 - link

    There needs to be at least one game that can be used to look back through old GPU benchmarks for "historical" purposes.
    I like being able to see improvements in generations, eg. 8800gt to gtx 280.

    I think Crysis fits the bill nicely and will continue to for another 12 months. It is still capable of bringing the latest cards to their knees at high resolution and high detail.

    My only other recommendation would be Battlefield: Bad Company 2 as your modern FPS game. It's quite demanding on the video card compared to some other *cough* (MW2) console ports that a PC can punch out at over 100 frames on high without breaking a sweat.
  • NA1NSXR - Sunday, March 14, 2010 - link

    Ideally, I'd like to see the best selling title of every major engine on the market and also keep the synthetics. I want to know why certain cards are strong in some applications and I also want to see how chip design translates into raw computing power in different ways.
  • moozoo - Sunday, March 14, 2010 - link

    Ideally a graph over time of fps for the different cards while playing real games.

    My impression is that NVIDIA's Fermi is a grunt that gives solid performance under all kinds of tough conditions (i.e. higher minimum frame rates)
    and that AMD cards can sprint faster on less demanding scenes (higher maximum frame rates).

    Personally higher minimum frame rates matter more to me. A detailed pros and cons article needs to be written and benchmarks need to give useful information about this for it to be valued.

  • Ryan Smith - Wednesday, March 17, 2010 - link

    The big problem with minimum framerates is that they can often be inconsistent. Unlike averages that neatly average out any quirks, minimums quite often can vary for no good reason, which is a problem given that we shoot for consistency and repeatability here.

    For the games we have that do support minimums, I'm going to vet our results and see if it's consistent enough to meet our standards.
  • Headfoot - Sunday, March 14, 2010 - link

    Empire: Total War - heavy on both CPU and GPU, and can easily be made more demanding with larger battles.

    Battlefield Bad Company 2: Popular, DX11, challenging.
    =======================
    I would also love to see an increased focus on minimum frame rates. If the minimum is low or the performance is very up and down, it feels very choppy. I'd like to know if a card gives consistent performance rather than just high averages.
  • kriztophr - Sunday, March 14, 2010 - link

    I'd like to see a return to more tweaking of graphical settings to achieve better framerates, in addition to the preset choices.

    For example, just how big a performance impact using AA or a certain effect has on the tested GPUs, and perhaps a rollover screenshot to show the difference in image quality.
  • GeorgeH - Sunday, March 14, 2010 - link

    These aren't game requests, but:

    1) I would like to see one Eyefinity/NV MM dedicated benchmark geared towards a 3-display ~1920x1200 setup, and not just a dismissive "if you have the money for 3 monitors just buy a 5970 or GTX480 SLI" comment. I don't have a specific title in mind, I'd just like one that both actually works and for which you take "standard" benchmarks to make extrapolation to untested titles easier.

    2) I would like to see an OpenCL benchmark (or whatever GPGPU library is most applicable/relevant.) Nothing fancy, I'd just like something from which to get a general idea of where things are as we move towards shifting FP to the GPU.
  • Brandonr87 - Sunday, March 14, 2010 - link

    I'd love it if you guys did an After Effects bench. (It can utilize OpenGL with graphics cards, although I don't think it works with CF or SLI.)

    But yea, definitely would love to see an After Effects benchmark because I use it very often but can't figure out what would help me the most.
  • Brandonr87 - Sunday, March 14, 2010 - link

    I think synthetic benchmarks are important, mainly because you can't test EVERY game, so for some people, looking at what cards can do on a standard scale may be helpful.

    As for games to test, I think there should be a wide breadth of games, but usually I just look at what games get similar frames to the ones I play, and go from there.

    I'd vote for Modern Warfare 2, and Dawn of Discovery (aka Anno 1404) but I know Anno probably won't make the list.
  • floobit - Sunday, March 14, 2010 - link

    I develop HPC applications for chemistry. I'd like a benchmark that does many matrix operations using very large (10k x 10k) matrices. Another useful application would be 4 threads of iterated floating-point calculations, a la Monte Carlo or an MD simulation. In all of these scenarios the floating-point variables would all need to be double precision.
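    A minimal sketch of the kind of workload I mean, written with NumPy as a CPU stand-in; a real GPU benchmark would push the same math through CUDA or OpenCL, and the sizes are only examples (shrink n if memory is tight):

    ```python
    import time
    import numpy as np

    def time_matmul(n=10_000, dtype=np.float64):
        """Large double-precision matrix multiply, the first workload described above."""
        a = np.random.rand(n, n).astype(dtype)
        b = np.random.rand(n, n).astype(dtype)
        start = time.perf_counter()
        c = a @ b                      # the operation being timed
        return time.perf_counter() - start

    def time_monte_carlo_pi(samples=50_000_000, dtype=np.float64):
        """Iterated floating-point work (Monte Carlo pi) as a stand-in for an MD-style load."""
        x = np.random.rand(samples).astype(dtype)
        y = np.random.rand(samples).astype(dtype)
        start = time.perf_counter()
        inside = np.count_nonzero(x * x + y * y <= 1.0)
        return 4.0 * inside / samples, time.perf_counter() - start
    ```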
  • BitJunkie - Sunday, March 14, 2010 - link

    It would be really great if you could devise some kind of normalised performance rating that indicates performance relative to some reference hardware and can also be normalised for changes in the test suite.

    Something along the lines of:

    [(ave_rpg_perf / ref_rpg_perf) + (ave_fps_perf / ref_fps_perf)] / test_suite_factor

    Every time you devise a new test suite, you run the old suite and the new suite on the same range of hardware, take a statistical look at the delta observed in the performance rating, and then scale the new results back so they're normalised against the old test suite. I might be missing some subtleties in this - I just gave it 2 minutes' thought - but I'm sure you smart fellas can devise something a bit more robust and future-proof...

    You'd then get a sense of absolute performance from direct FPS measurements, and a sense of relative performance between generations of hardware and test suites.

    I guess it would be the ATmark and ultimately not synthetic. You could insert other items into the equation such as render times and even weight these based on your perceptions of what people value.

    Would love to see something like this: you could circulate it to key hardware vendors to gain their buy-in and then keep it confidential to avoid sniping from the sidelines about your choice of weights and the validity of the formulation.

    Just an idea.
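    A rough sketch of what the bookkeeping might look like; all names, weights and the scaling scheme are placeholders, not a worked-out proposal:

    ```python
    # Average of per-game (card fps / reference fps) ratios, optionally weighted per game.
    def normalised_rating(results, reference, weights=None):
        """results / reference: dicts mapping game name -> average fps."""
        weights = weights or {game: 1.0 for game in results}
        total_weight = sum(weights.values())
        return sum(weights[g] * results[g] / reference[g] for g in results) / total_weight

    # To chain suites together, re-run the overlap hardware on both suites and scale
    # the new ratings so those cards keep (on average) the same score as before.
    def suite_scale_factor(old_ratings, new_ratings):
        """Ratio that maps new-suite ratings back onto the old suite's scale."""
        common = set(old_ratings) & set(new_ratings)
        return sum(old_ratings[c] / new_ratings[c] for c in common) / len(common)
    ```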

  • samspqr - Sunday, March 14, 2010 - link

    I agree that a summary rating would be really useful, but I know it opens an old debate that was never truly closed: the results (which card is faster than which other one) will depend on the fine details:

    * simple average vs. geometric average: I'd use geometric, because it's least affected by a big difference in one particular game, but I'm not totally sure it's the best option, and it definitely will affect the comparisons

    * the reference card used: in the old days, it would make one maker look better than the other depending on whether you used a reference that was good for DX or OGL; now I guess it won't matter so much, but I'd pick an ATI one, as they are the ones dominating the market, and a relatively high-end one, otherwise you risk having to redesign your rating system too often

    * the weight of each resolution: does 25x16 really matter? I'd only use 19x12, but 12x10 and 16x10 are still popular...

    * the weight of each setting: again, I'd love to use AA+AF always, but for the review of a mid-to-low-end card it makes no sense

    * the weight of each game: you may think one is more important than others (in the old days that would be quake3)

    Still, I'd love to see AnandTech take a stand on each of these issues, and publish that ATmark in their reviews
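    To illustrate why the choice of mean matters (the ratios below are made up): one outlier game can swing a simple average much more than a geometric one.

    ```python
    from math import prod

    # Per-game performance ratios vs the reference card.
    ratios_card_a = [0.95, 0.95, 0.95, 2.50]   # 2.5x in one outlier (e.g. a vendor-optimised title)
    ratios_card_b = [1.20, 1.20, 1.20, 1.20]   # consistently 20% faster everywhere

    def arithmetic_mean(xs):
        return sum(xs) / len(xs)

    def geometric_mean(xs):
        return prod(xs) ** (1 / len(xs))

    print(arithmetic_mean(ratios_card_a), arithmetic_mean(ratios_card_b))  # ~1.34 vs 1.20: A "wins" on the outlier
    print(geometric_mean(ratios_card_a),  geometric_mean(ratios_card_b))   # ~1.21 vs 1.20: essentially a tie
    ```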
  • NetShroud - Sunday, March 14, 2010 - link

    I would like to see Team Fortress 2 and Mirror's Edge.
  • DigitalFreak - Sunday, March 14, 2010 - link

    Source engine games are useless for benchmarking. Over 60fps is easily achievable even on crappy hardware.
  • Hrel - Sunday, March 14, 2010 - link

    Bioshock 2 - Because I play it, I love it and it's new.
    Mass Effect 2 - Same reasons.
    3D Mark - because it's silly to not have an objective measurement to weigh the subjective games against.
    Then just keep up with the latest CryEngine game, not because I like the games, I don't enjoy them at all, but because it stresses systems.
  • JonnyDough - Wednesday, March 17, 2010 - link

    Bioshock and Mass Effect 2 aren't really new anymore. But then again, neither are any of the games I play. I have yet to find a really good reason to upgrade my system, and it's based on fairly old tech - although it is waning a bit. Good games take a good bit of time to play, and there are so many out there that are old now and will run on my current system.
  • superccs - Sunday, March 14, 2010 - link

    1. performance/$
    2. availability of silent and cool-running heatsink designs
    3. special features that allow for greater usability, GPU video encoding, physics, or 3D type stuff
    4. real in-game detail, color, and overall prettiness differences

    Things you can get rid of: SLI and Xfire setups, no one actually does that.

    Great hardware reviews, as always.

    :D
  • jordanclock - Sunday, March 14, 2010 - link

    I know the issue of repeatable test runs makes it difficult, but I recommend a couple of MMOs to be benchmarked. For instance, Champions Online seems to do pretty well at putting stress on even the most up-to-date systems.

    Maybe you could talk to the MMO developers about setting up a sandbox server just for running benchmarks? It would be very nice to see how a certain video card performs beyond the tutorial zones or main cities.
  • alphacheez - Sunday, March 14, 2010 - link

    I'm not sure this exactly falls into the category of the GPU test suite, but I'd like to see what games can be played well on lower end/older/integrated graphics.

    It's not very interesting to see that something (like integrated graphics) gets 5 FPS in Crysis Warhead; it's more interesting to see what games can be played, maybe Half-life 2, Quake 4, or even back to Doom 3 (or games from that sort of vintage) to give an idea of the age of games that give a good experience. This would especially be interesting to include in reviews of Ion-style notebooks (which are often CPU limited), CULV notebooks (which tend to be GPU limited) and other low performance/integrated systems.
  • therealnickdanger - Monday, March 15, 2010 - link

    +1
    +1
    +1

    As a gamer who's not too interested in the latest and greatest eye-candy, I really like the idea of legacy benchmarks for mainstream and low-end graphics cards/IGPs. One thing that drives me insane when shopping around for a new laptop or netbook is trying to find information on "what games can it play?". In all sincerity, the only PC games I play are Halo, WoW, Unreal Tournament (original and 2004), and some occasional Source-based games (CS, DoD).
  • dgschrei - Sunday, March 14, 2010 - link

    I would like to see Bad Company 2 in this lineup for two reasons.
    1: At this moment it is probably the best-selling game on PC, so there will be a large user base. This will give a lot of people a good comparison point, since they already know what fps their card can do in this game.
    2: It's currently the only popular DX11-enabled game. They only use it for soft shadows, but still.

    I also disagree with the no-synthetic-benchmarks argument. They are important, because they allow us to see which of the 2 GPU companies currently has the better architecture.
    Just take Unigine. Although the reported power of the GTX 480 is not so far away from an HD 5870, it seems to annihilate it there because NVIDIA focused on tessellation power. We wouldn't be able to see this in a simple game bench, where anything from a certain shader to lazy programming to the CPU can limit the fps.
    They should, however, not factor into the final rating.
  • shadow2200 - Sunday, March 14, 2010 - link

    What's really needed is software that stresses the maximum number of cores in a CPU, supports all the features within DX11, and is stressful enough to put even multi-GPU setups to the test - and no software exists that can do all that... at least not yet, unfortunately. (Maybe the next version of 3DMark, supporting DX11?)


    Though as a special section, one thing that can make up for the lack of the above, at least in terms of being GPU demanding, is running benchmarks using 3 displays. A single 30" display still costs a fortune ($1000~1500), while users can easily get three 24-inch LCDs for much less than that (half as much), and it will be much harder to run those 3 at 5760*1200 than it will be to run a single 30" LCD at 2560*1600.


    Direct comparisons between brands in the above scenario should only be made once all GPU makers have a single card that can output to 3 displays, though obviously it would be a moot point in multi-GPU shootouts (2 HD 5870s versus 2 Fermi cards, for instance), as both can do it from what I've seen.
  • grv - Monday, March 15, 2010 - link

    FurMark
  • HillBeast - Monday, March 15, 2010 - link

    Too right. FurMark is a brilliant test. Even if they don't use it to bench, they should at least run a system for, say, a few hours of benchmarks to get a rough idea of how reliable these cards are and how hot they will get under worst-case circumstances.

    Also they need to test it INSIDE a case so we can know how hot it will be in real life rather than on a bench.
  • Ryan Smith - Wednesday, March 17, 2010 - link

    We do exactly that. Our load temperature tests are done inside a full Thermaltake Spedo case.
  • kanabalize - Sunday, March 14, 2010 - link

    Often we see only FPS-type games in the benchmarks. Although I feel they are important, as many GPU-intensive games are made for that genre, let there be some other game genres like RPG, MMORPG, RTS, simulation, etc...

    Also, since graphics cards today come with many other features like physics engines and GPGPU functionality, it would be good if these features were taken into consideration when doing benchmarks, as GPUs are not used only for gaming nowadays.
  • HillBeast - Monday, March 15, 2010 - link

    The main thing is that while MMORPGs, RTS games and the like are popular, they aren't intense enough in my opinion, and you reach a point where the load won't be hard enough on the GPU for the results to be meaningful.
  • HotFoot - Tuesday, March 16, 2010 - link

    But that should illustrate a point - that depending on your needs, a top-of-the-line 5970 may be no better than your current 8800GTS.

    Just like it was kind of silly to suggest the i7 980 was the best CPU yet for playing WoW because it pushed the frame rates from some rate too high for the human eye to pick up to an even higher rate.
  • HillBeast - Tuesday, March 16, 2010 - link

    Yeah, but the thing is, if WoW is working fine for you now and it is the only thing you run, then why would you be looking at reviews? The point of the review is to show what a card is capable of, and if you wanted to know 'Will I be able to play Crysis, because I'm getting bored of WoW?' you can see 'yes I can', rather than 'I don't know, but it sure as hell gets higher frames in WoW.' I'm not trying to bash WoW, but I'm 99% sure it is not good for benchmarking, especially seeing it has an FPS limit of like 160 or something. Most GPUs will reach that easily these days, so all we'll get is 'The GTX 280 and the 9800GTX and the Radeon 5870 are all equally powerful because they got 160FPS in WoW'.

    Again, not bashing WoW. It's a fun game.
  • Exodite - Monday, March 15, 2010 - link

    I concur, please include some strategy and role-playing games as well.

    I know Guru 3D have been using Anno 1404 for testing and it taxed their rig decently enough on max settings. Something like Dragon Age might be interesting as well, certainly for me.

    Civilization V is coming at the end of the year, and that might be a good example of a turn-based strategy game that's bound to be popular.

    I realize that strategy and role-playing games often tax the CPU more so than the GPU but modern incarnations do need a bit of both.

    Indeed, WoW may not be a good benchmark despite having 12 million players but it would be a good idea to keep an eye out for up-and-coming MMOs. Whenever something that's actually viable is outed, that is. *sigh*

    Obviously shooters are going to be the most popular ones still but I'd rather stick to no more than 2-3 different shooters and one game each from other popular genres than 5-6 shooters.
  • Targon - Monday, March 15, 2010 - link

    Dungeons and Dragons Online, as well as Lord of the Rings Online, are two games that are popular enough to deserve some attention. Both have DirectX 10 support currently and, according to an AMD press release, will get DirectX 11 support in 2010.

    My source on this(since some may not have heard):

    http://www.amd.com/us/press-releases/Pages/amd-pre...

    Free to Play has really saved DDO from being doomed to obscurity, even if it would otherwise have stayed up for another ten years.
  • PrinceGaz - Sunday, March 14, 2010 - link

    Different genres is the most important requirement imo.

    There should be at least one turn-based strategy (these tend to often require a very zoomed-out view of what is going on), one RTS (where you tend to focus on individual smaller units closer in), and one RPG with a mainly above view (where you are usually looking at a group of characters in individual rooms or caves etc). Each type of view has its own unique requirements. So that's three non-FPS games.

    I'd add a further two simulation style games: one a flight type (either aircraft or spacecraft) where you're mainly looking at lots of distant objects except when landing or in combat, and one strictly ground level give or take the odd jump over a hill (probably driving but could be something like snowboarding). That makes for five non-FPS games.

    Only once a good graphically-demanding representative of each of those genres has been chosen, should the rest of the list be filled with FPS titles, otherwise we'll end up with the usual "all FPS games except the odd one or two of some other genre thrown in if the author happens to like it".
  • kanabalize - Sunday, March 14, 2010 - link

    maybe do some benchmarks with software that uses these functions
  • codedivine - Sunday, March 14, 2010 - link

    It used to be included in some Anandtech benchmarks and I request that it be re-included. RPGs remain popular with the PC gaming crowd. Not only is this a good and fairly popular game, it is also a good test of GPUs, as it still brings many midrange GPUs to their knees.
  • Ryan Smith - Wednesday, March 17, 2010 - link

    The plan is to include an RPG, so we'll see. RPGs are difficult to meaningfully benchmark, so it's not quite as easy as throwing in an FPS or RTS with a solid benchmark or replay mode.
  • velis - Monday, March 15, 2010 - link

    Heh, I threw my 8800GTS 640MB out the window because I had slideshow in the swamp :)
    Fine game it is.
  • JonnyDough - Wednesday, March 17, 2010 - link

    Mine overheats and shows a few anomalies in ATITool, and when I called EVGA they told me that wasn't necessarily a sign of a bad card. Funny, my X1650XT doesn't do that...
  • Ryan Smith - Sunday, March 14, 2010 - link

    I'm going to start off the comments here with one condition: no synthetic benchmarks. Our editorial policy continues to be that we only want to use real games, as synthetic benchmarks just encourage AMD and NVIDIA to focus on optimizing for something people can't play. So please don't bother asking for 3DMark.
  • T2k - Thursday, March 18, 2010 - link

    [quote]
    I'm going to start off the comments here with one condition: no synthetic benchmarks. Our editorial policy continues to be that we only want to use real games, as synthetic benchmarks just encourage AMD and NVIDIA to focus on optimizing for something people can't play. So please don't bother asking for 3DMark. [/quote]

    EXCELLENT!

    MAY I SUGGEST A SIMPLE RULE?
    For any benchmark to be considered it must be based on an unmodified engine of a published (NA or EU or Asia) GAME.

    That's it.

    This means the end of 3DMark, Unigine Heaven, SiSoft Sandra etc in this benchmarking suite.
  • slickr - Wednesday, March 17, 2010 - link

    OK.
    In the RTS scene add: StarCraft 2 (beta for now) and Napoleon: Total War. These two titles, along with the current DoW 2, are good enough.
    In the action/FPS scene add: Uncharted 2, Assassin's Creed 2, Aliens vs. Predator and Metro 2033. Keep only Crysis.
    In the RPG scene add: Dragon Age: Origins, Mass Effect 2 and Demon's Souls.
    In the driving/racing scene add: GTA4, NFS: Shift and DiRT 2.
    In the MMO/hybrid scene add: Star Trek, Sims 3, X3.
  • fepple - Tuesday, March 16, 2010 - link

    I'd be interested to see a one-off article on synthetics. Not really the high-level 3DMarks; I'm hoping there are some real low-level synthetic benchmarks around that you could use to test specific parts of each card. You could then extend that to look at how those components come together to draw the finished scenes in games, with the potential to explain the differences between certain games and which bottlenecks are restricting each one. I realise this would be a load of work, but you're pretty much the only people that could do it :)
  • Ryan Smith - Wednesday, March 17, 2010 - link

    For a major architecture article (e.g. Evergreen launch, GF100 launch, etc) we'll sometimes use tools like that to look at the capabilities of the architecture. But this isn't something we would do on a per-card level like we do with our GPU benchmark suite.
  • blasterrr - Tuesday, March 16, 2010 - link

    I would like to see minimum framerates, and I would like to see most of the tests at a resolution and quality setting that is playable.
    So in most of the tests, minimum framerates should not fall below the 25-30 fps range and average framerates should be in the 50-60 fps range.
    If minimum framerates always stay around 30 fps, the average fps does not have to be as high.

    Typical demanding gameplay situations should also be tested. It's not sufficient for a computer to run a game at 60 fps in most situations; it also has to hold certain minimum framerates in critical gameplay situations where there is more load on the CPU and GPU, and where fluid gameplay matters even more than in scenarios where you don't have to react quickly.
    In these scenarios it is often the case that some graphics cards in the charts would switch places.
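
    A minimal sketch of how minimum, average and percentile figures could be pulled from a per-frame time log (the log format and file name here are hypothetical, not any particular tool's output):

        # Sketch: derive average, minimum and 1st-percentile FPS from per-frame times.
        # Assumes a plain-text log with one frame time in milliseconds per line (hypothetical format).

        def summarize(frame_times_ms):
            fps = sorted(1000.0 / t for t in frame_times_ms if t > 0)
            avg = sum(fps) / len(fps)
            minimum = fps[0]
            p1 = fps[int(0.01 * len(fps))]  # 1st-percentile FPS: a less noisy "minimum"
            return avg, minimum, p1

        with open("frametimes.txt") as f:
            times = [float(line) for line in f if line.strip()]

        avg, minimum, p1 = summarize(times)
        print(f"avg {avg:.1f} fps, min {minimum:.1f} fps, 1st percentile {p1:.1f} fps")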
  • TheHolyLancer - Monday, March 15, 2010 - link

    Bring in WoW (no joke) at 800x600 -> 1920x1200, and then Crysis/Warhead/2 at 1680x1050 -> 2560x2048, to cover the low end and top end with varying settings at each res (at least 3 AA/AF combos per res); this will act as the "synthetic" benchmark of low and high graphics demand. If WoW is undesirable as a bench due to the server-connectivity issue or lack of a benchmarking tool (maybe use one of those tools that the machinima videos are made with?), replace it with CS 1.6, Freelancer, or another low-demand game.

    Then bring in some 1920x1200 goodness with max AA/AF: L4D2 for Source, BioShock 2/ME2 for Unreal, MW2, Civ IV or V, C&C 3/4/RA3, SC II, SupCom 2.

    Or, if you want to be less hardcore with your benchies, the above games at 1680x1050 with mid AA/AF instead of 1920x1200 with max AA/AF.
  • Antryg - Monday, March 15, 2010 - link

    Test each game at 2 resolutions & 2 quality-settings: that shows you how each card scales/doesn't-scale with both quality & resolution!
    Anything less than 2x2 and you end up blind to the actual curve of performance for that card.
    Anything more than 4 and you are filling-in the curve, instead of diversifying one's dataset with more games/apps.

    Bonus rule: make 3/4 of the games be the "problematic" ones ( like that GTS or whatever it was called, racing-game someone mentioned, that doesn't work right on ANY cards... ) so that the mindless-enthusiasm we are biologically prone to can be torpedoed directly with some facts of life...

    Cheers!
  • jive - Monday, March 15, 2010 - link

    I find it really odd that in this particular case the testing methods of Anandtech differ from the objective and scientific approach you normally take.

    What is the point of testing against certain game titles when your argument for not including 3DMark is not wanting to include a title that the GPU vendors optimise their drivers for? Last time I checked, the release notes of a GPU driver consist of a huge list of specific titles where the driver brings some improvement. What is that if not optimising for a single title? I'm not saying you should include 3DMark, but the decision to omit synthetic benchmarks rests on an exceptionally poor argument that is almost entirely without grounds.

    What I want to see from a GPU review is the efficiency of the HW/SW implementation of some standard API. If you test against 'real' applications, you end up testing the application's implementation of the API and/or the driver optimisations for the application under test.

    If your site really wants to differentiate itself from the mass of HW sites, do what you have done in server and storage benches: create your own. Use only standard-compliant ways of exercising the API and run with different CPU setups to reveal the effects of the platform.

    If not, you're blending into the gray mass of uninteresting reviewers of batch-run game benchies...
  • Targon - Monday, March 15, 2010 - link

    The big problem (which you touch on) is that the drivers are tuned to specific games/applications for performance. As a result, since not everyone plays the same games, it is sometimes a good test to use BOTH titles that the drivers are tuned to do well with and titles that are not.

    I am talking here about games like The Witcher, which did not see enough advertising here in the USA, but would be good to test performance on. Do you think AMD or NVIDIA tuned their drivers for that game? Other games that require a lot of GPU power but do not have specific driver optimizations from AMD or NVIDIA are good to test with as well.

    I am in the minority, it seems, in that I am not a fan of first-person shooters or World War II combat games. As a result, when I find a game, as often as not it hasn't gotten much attention when it comes to checking GPU performance with the games _I_ play.

    So, synthetic benchmarks have their place in a comparison, but looking for lesser-known titles to use in testing would also say a lot.
  • ellarpc - Monday, March 15, 2010 - link

    I want to start off by saying I've been an Anandtech reader for most of a decade. I disagree with your stand on 3DMark. It really is a lot of fun to run, and I would like to see synthetic benchmarks added. Just because games are different from 3DMark doesn't mean end users don't want to know the potential bragging rights a video card has to offer compared to the others. I love to play games, but I also love to overclock to the max and run synthetics. When buying a video card I generally go for the one that scores higher, because I enjoy running benchmarks almost as much as playing some games; I can spend hours tweaking just to lift another 1000 points. I go to OverclockersClub to get my information on the Futuremark bench results, and I'd rather not have to stray from Anand. Anandtech is really the only place that doesn't run them. Your stand on that really is narrow-minded. I may not play 3DMark, but I use it a lot for fun. I hope AMD and NVIDIA continue to optimize for it just as they do for games.

    on another note:
    I would also like to see a BFBC2 GPU showdown: a list of what will play it and what won't.
    Thanks Anand. Love you guys a lot.
  • Quidam67 - Monday, March 22, 2010 - link

    I'm with you.

    The aversion to synthetic benchmarking seems almost religious. I'm pretty sure most people already get that synthetic benchmarks don't directly correlate to in-game performance. Most sites that include these tests usually say as much when they present the results.

    If the objective is to make synthetic benchmarks disappear off the market, then Anandtech is being naive, or has an inflated view of its punching power in the industry.

    Synthetics exist because they offer detailed performance analysis that isn't possible within a regular application not designed specifically to provide performance metrics.

    Yes, it's a flawed system, which is why nobody relies only on these values, but to banish them entirely from a test suite is a mistake in my opinion. And yes, Anandtech is still a great site - keep up the good work.
  • JonnyDough - Wednesday, March 17, 2010 - link

    The thing about synthetic benchmarks is that other sites use them, and you can compare cards online on the developer's website, so they really aren't needed here. Older taxing games, however, aren't usually used, which is why it would really be great to see Anandtech use them!
  • SlyNine - Sunday, March 14, 2010 - link

    I know this is about GPU testing, but I would like to add that when testing CPUs with games like Supreme Commander, check the simulation time (how long it takes to play through a replay), not just the FPS.

    Take Empire: Total War - it's important to see just how fast the CPU can simulate it. Simulation time in RTSes can be very important.
  • Jediron - Monday, March 15, 2010 - link

    Of course this is about GPU testing. However, I've never seen it written in stone that GPUs must be tested with the latest and greatest CPUs, highly overclocked.

    No, that's a factor "we" made up ourselves. Wrongly so, if you ask me. It's perfectly fair to test GPUs on more moderate, popular types of CPUs; nothing wrong with that. In fact, most people would appreciate those results more than "another i7 Extreme @ 4GHz results page".

    Give us the results with duals, triples, quads. With cheaper CPUs and the best a man can get. That's what we are waiting for! That will be the day we can immediately see how much the CPU affects the performance of the GPU, without having to visit multiple sites and reviews.

    Anandtech, you have an honorable task lying ahead of you; grab it!
  • rocky12345 - Sunday, March 14, 2010 - link

    Just a few notes.
    I think there should be a base set of older games, say two years old - maybe at least two of them that are known to work well on most hardware. This way, when you test mid-range and lower-end cards, they will at least get more than 3 to 4 fps.

    The other thing I noticed is that when testing mid and lower-end cards, the testers always seem to think a $50 card should be able to run at the same resolutions as a $300 to $700 card, and test them at top-of-the-pile resolutions like they would with the top-end cards. Let's face it, a $50 card is not meant to run at 2048x1536 with max details, and it should not be tested at that extreme. So maybe have two sets of benchmarks, one for the high-end cards and one for the budget cards; that way everyone gets to see how their potential card performs. Integrated graphics would fall into the lower-end test suite. Let's face it, the mid and lower-end cards outnumber the high-end cards probably 500 to 1 out in the wilds of people's systems.

    The other thing is to test on different CPUs; not everyone has a hex-core i7 @ 4GHz. I think using different CPUs from both Intel and AMD is a good thing - it will show potential buyers how a graphics card performs on a host of different CPUs, not just the ultra-high-end Intel platform.

    The last point is to have a set base of games for each of the two benchmark setups, the low end and the high end, but always have a selection of current games that you could rotate in to shake things up a bit.

    I won't even get into "The Way It's Meant to be Played" games too much, but having too many games like this that are heavily catered towards one company skews the whole test. I am not sure what AMD calls their vendor-supported titles, but the same goes for them as well. Maybe there could be a whole extra test suite set up just for those types of games, lol.

    My point is, if test sites stopped testing the heavily NVIDIA- and AMD-sponsored games, maybe both companies would focus more on an OpenCL and DX11 platform where everyone is on an even playing field.
  • Jamahl - Sunday, March 14, 2010 - link

    I'd love to see a multi-screen Eyefinity/whatever NVIDIA calls theirs benchmark, although perhaps that is best left for individual titles. 5040x1050 would be good, as a lot of people will be looking at three cheap 22" screens.
  • Ryan Smith - Monday, March 15, 2010 - link

    There will be Eyefinity/NVSurround testing. It'll be limited to the high-end cards that can already drive a single monitor with power to spare, but we'll have it.
  • Roland00 - Sunday, March 14, 2010 - link

    If people are buying new monitors, they will be hard-pressed to find any 16:10 screens nowadays. 5760x1080 or 4800x900 would be more helpful.
  • demomanca - Sunday, March 14, 2010 - link

    I don't know that more games will help, but showing how GPUs scale on different CPUs is, I think, important. No point showing a card will run a game at 50 fps, but only when it's got a 6GHz i7 behind it.
  • PrinceGaz - Sunday, March 14, 2010 - link

    I disagree. If I want to know how fast a game will run on a specific CPU, I'll read a CPU review article.

    In a graphics-card review, I want to know what the graphics-card is capable of when it is not bottlenecked, even if that means pairing it up with a far faster CPU than I currently have.

    If there's time for CPU scaling on a given card, by all means add it to the review, but a graphics-card review should always focus first and foremost on how the card performs with the fastest available system.
  • Targon - Monday, March 15, 2010 - link

    There is a place to see how well a video card/GPU will run with limited CPU power though. At the least, a Phenom 2 X4 955 is considered a "fast enough" CPU to see the performance of a video card in most situations. This is why I also posted that it would be good to see GPU performance on different chipsets from time to time. If every test is done on an Intel i7 with Intel chipset, how do we know that a given GPU doesn't have issues with the Intel chipset?
  • piroroadkill - Monday, March 15, 2010 - link

    I agree with this. It's a GPU review, not a CPU review.
  • dubyadubya - Sunday, March 14, 2010 - link

    +1
  • Crusse - Sunday, March 14, 2010 - link

    I'd be interested in that too, but I guess it'd add quite a lot of time to the benchmarking process. Maybe only add one or two last generation CPU's.

    I wonder if monitoring GPU load in different games is a reliable way to measure the relative size of a bottleneck...
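
    As a rough sketch of that idea (assuming a recent NVIDIA driver's nvidia-smi query interface; AMD would need a different tool), you could sample GPU load during a gameplay run like this:

        # Sketch: poll GPU utilization once per second for a minute while a game runs.
        # Uses nvidia-smi (NVIDIA only); a persistently low average load hints at a CPU bottleneck.
        import subprocess, time

        samples = []
        for _ in range(60):
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]
            )
            samples.append(int(out.decode().strip().splitlines()[0]))
            time.sleep(1)

        avg_load = sum(samples) / len(samples)
        verdict = "likely CPU-bound" if avg_load < 90 else "likely GPU-bound"
        print(f"average GPU load: {avg_load:.0f}% ({verdict})")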
  • jimhsu - Monday, March 15, 2010 - link

    Is there a way to generate a 2 dimensional contour plot of GPU performance vs. CPU performance?

    Let's say you have GPUs ranked from last to first on the x-axis, and CPUs ranked last to first on the y-axis. Color resulting performance by blue (worst) to red (best). The result, assuming perfect scaling, should be blue in the bottom left corner, and red in the top right corner. (example: http://yfrog.com/ja33420199j )

    In contrast, a GPU limited game would look like this: http://yfrog.com/jagputj

    The difference is subtle, but it's there - in the 2nd image, notice that improvements to CPU speed don't affect performance all that much, while improvements to GPU speed matter a lot.

    There's probably a much easier way to explain this, but no site that I know of has ever attempted a multidimensional visualization like this before. I don't know just how much sample data you have to make this possible, but it would be an exceedingly interesting feature.
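
    A minimal sketch of what generating such a plot could look like, assuming you already have an average-FPS figure for every CPU/GPU pairing (the card names, CPU names and FPS values below are made-up placeholders):

        # Sketch: heatmap of FPS across a grid of CPUs (rows, slowest->fastest) and GPUs (columns, slowest->fastest).
        # All names and numbers are placeholder data, purely to show the plotting approach.
        import numpy as np
        import matplotlib.pyplot as plt

        gpus = ["GT 240", "GTS 250", "HD 5770", "HD 5850", "HD 5870"]
        cpus = ["Athlon II X2", "Phenom II X4", "Core i5-750", "Core i7-920"]

        fps = np.array([
            [22, 30, 38, 41, 42],   # slow CPU: GPU upgrades stop helping quickly
            [24, 35, 48, 55, 57],
            [25, 37, 52, 63, 68],
            [25, 38, 54, 67, 74],   # fast CPU: performance keeps scaling with the GPU
        ])

        plt.imshow(fps, origin="lower", cmap="coolwarm", aspect="auto")
        plt.xticks(range(len(gpus)), gpus, rotation=45, ha="right")
        plt.yticks(range(len(cpus)), cpus)
        plt.colorbar(label="average FPS")
        plt.title("FPS vs. CPU and GPU (placeholder data)")
        plt.tight_layout()
        plt.show()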




  • ET - Sunday, March 14, 2010 - link

    I agree that there's no need for synthetic benchmarks. Regarding games, I think that a suite (and a set testing rig) is good for a comparison over several reviews, but I think that adding some other games here and there won't hurt. In particular, I'd be interested in games that particularly favour a specific graphics card architecture. I'd like Anandtech to test a lot of games on a limited number of cards, and if there's a game that's interesting (GPU limited but performs well on specific cards), test it on more cards.

    In general, though, just have a test suite that's made of best selling games which are GPU bound.
  • ComputerGuy2006 - Sunday, March 14, 2010 - link

    I agree with no synthetic benchmarks, keep it with real games.

    The more game benchmarks the better, though I understand time is an issue. Maybe pick 4-5 main games (with 2-3 resolutions each), then have about 5-6 EXTRA games where you only bench at a single resolution (such as 1920x1200).

    This way we not only know how the cards handle the new games at different resolutions, but we also get a sense (with one resolution) of how they perform in other/older games compared to other cards.

    And I highly recommend you keep working on the "Bench (beta)". I think it has potential for interesting features, such as the ability to customize review pages based on the user reading them. For example, right now I have a 4850 512MB. If this site were dynamic enough, it would be possible to have that video card included as part of the benchmarks in the review. That way, on every GPU review release, I'd instantly be able to compare it with my cards (or cards on my wishlist) while reading the review.
  • JonnyDough - Monday, March 15, 2010 - link

    Also in agreement with game benchmarks only. I would also love to add that Fallout 3 or Oblivion needs to be in there - or any game that uses this type of engine, which is heavy on shaders and textures.
  • JonnyDough - Monday, March 15, 2010 - link

    Maybe you could "group" games somehow according to what strain they put on certain components, such as Oblivion/Fallout/Dragon Age - find a way to make a "group" of tests and then label it "texture- and shader-heavy RPGs" or something simpler. That way, when we want to play a type of game we have a list of games that gives us some idea of how a card should perform. Again, as I said before, all we really need to know is: will this card suit my needs or won't it? If all I play is FPS games, then I just want a card that will do well in FPS games, even if it's weaker elsewhere. Recommendations for cards that might perform well in future games would be helpful as well - according to where you see current developers going with PC gaming.
  • jimhsu - Monday, March 15, 2010 - link

    I actually think Fallout/Oblivion aren't all that heavy on the GPU side per se (they depend a lot on CPU also). I would recommend splitting "game" benchmarks into CPU focused (RTSes like Supreme Commander, and I would include Fallout/Oblivion in here), and GPU focused (Crysis, most FPSes, etc) benchmarks. Or to make this easier, you could implement general categories: FPS (Crysis, L4D2, MW2), RTS (Supreme Commander, Starcraft 2 (when it comes out), Battleforge), RPG (Mass Effect, Dragon Age), Simulation (be creative here), etc.
  • NT78stonewobble - Tuesday, March 16, 2010 - link

    While it may not be a standard game per se, Oblivion with some custom texture packs is really hard on any graphics card at higher resolutions.

    And they do add a lot of candy to the game. There are quite a few of these mods that enhance the graphics of older games, to the point of bringing even relatively new graphics cards to their knees.

    E.g.: Oblivion with Qarl's texture packs is quite hard on my:

    E8500 @ 3.8GHz, 4GB RAM and GTX 260 (216-core) when it is run at 1920x1200 with as much eye candy as possible (4x+ AA with transparency AA).
  • moriz - Tuesday, March 16, 2010 - link

    Fallout 3 and Oblivion are not GPU limited at this point, so there's really no point in including them. They are far and away CPU limited, especially with heavy-duty mods thrown in like Deadly Reflexes.

    On my system (E7200 @ 3.2GHz, G33 chipset, HD 5850), Oblivion runs at the same fps no matter what graphics setting I use. But as soon as there are more than four actors on screen, the game absolutely CRAWLS as the CPU struggles to calculate physics interactions for those actors. Ironically enough, Oblivion running with mods would greatly benefit from Intel's new 6-core CPU.
  • JonnyDough - Wednesday, March 17, 2010 - link

    Would be good to see benchmarks of Fallout 3/Oblivion using Gulftown. Anyone know if there are any? My Opteron 185 (think FX-60; my Opty goes in Socket 939 as well) OC'd about 250MHz seems to handle Oblivion fairly well with my 8800GTS 640MB card and 2GB of RAM. But the settings aren't all the way up on everything and I can't run high AA and AF. I'm wondering what a GTS 250 would do for the game, if anything.
  • JonnyDough - Wednesday, March 17, 2010 - link

    Also curious if cards like the GTS 250 would be PCI-E bus limited, as my slot isn't a 2.0 slot.
  • iamezza - Monday, March 15, 2010 - link

    +1 for more games, even if only one resolution is tested.

    +1 for the bench like idea, being able to include your current card into the graphs would be awesome.

    Also I think there should be room for older titles as well, because if you only ever focus on the latest and greatest games the older games tend to get forgotten by the driver teams. For example I play a racing sim called GTR2 that was released Sep 2006 which is still one of the best racing sims ever made. With modern cards this game should run extremely well, but on nvidia cards the game experiences random lockups and on ATI cards the framerates aren't great when the going gets tough. GTR Evolution is a similar game with a similar game engine released in 2008 which has the same problems. I'm sure there are many other older games out there in similar situations.
  • Ryan Smith - Wednesday, March 17, 2010 - link

    I like the idea of doing older titles, however it's not practical for our general benchmark suite. However if there's sufficient interest, we could periodically (e.g. once or twice a year) do roundups of older games.
  • JonnyDough - Wednesday, March 17, 2010 - link

    In a way, older games are just as important, if not more so, because if they're old and have good replay value then people are still playing them. The majority of gamers are NOT on the cutting edge of PC gaming. A new game too often requires new and expensive hardware (unless it's made by Valve!), and not everyone has a "fast" internet connection either. Nor does everyone want to spend $50-60 on a new game; leave that for the people buying consoles. A lot of us PC gamers (check the general age of PC gamers) are an older group of guys (and gals) who are around thirty years old and spend our money a bit more wisely. We have families, jobs, and not a lot of time to game, so we find a good game and stick with it awhile - on the cheap.

    Sidenote: I don't use torrents at all, FYI - it's stealing - although I won't buy a game I haven't seen good footage of or played a demo of, so I agree on not blowing $50 on a game I can't return - although part of the reason they started doing this was BECAUSE of pirating.

    Anyway, my whole point is that tons of us are only just picking up Fallout 3 or Oblivion now that prices are cheap, bugs are fixed, and availability is plentiful. Not to mention we can build a decent little system for under $600 that will play them.
  • RjBass - Monday, March 15, 2010 - link

    While the Bench idea is a good one, it is far from practical. There are so many graphics cards on the market that Anandtech simply wouldn't have the time or resources to cover them all. I can see it already: while they covered your 4650, somebody else's 2900 XT was left out, so they start complaining about it.

    The best thing to do currently in my opinion is to take a look at the latest benchmarks, then see where that card fits on Tom's graphics card chart, then look at where your card fits on the chart and base your decision off of that. http://www.tomshardware.com/reviews/best-graphics-...

    In an ideal world, Anandtech would create their own hierarchy chart so that we could get an even better idea of how our older cards compare to the latest reviews and benchmarks on Anandtech. Because the way that Anandtech tests the cards may not be the same as Toms Hardware and thus Tom's hierarchy chart might not be the best answer to your question.
  • novaton - Sunday, March 14, 2010 - link

    Quoting ComputerGuy2006: "And I highly recommend you keep working on the 'Bench (beta)'. I think it has potential for interesting features, such as the ability to customize review pages based on the user reading them. For example, right now I have a 4850 512MB. If this site were dynamic enough, it would be possible to have that video card included as part of the benchmarks in the review. That way, on every GPU review release, I'd instantly be able to compare it with my cards (or cards on my wishlist) while reading the review."

    I totally agree with you, great idea.
  • ProphetMikey - Tuesday, March 30, 2010 - link

    A quick search shows that despite the talk about older games being included, nobody has yet mentioned one particular older game that can still bring a new system to its knees: Neverwinter Nights 2.

    I played it a few months ago on my new PC (Core i7@3.8GHz + 4870X2 + 4870 all overclocked) and STILL couldn't get playable framerates in the outdoor scenes at 2560x1600. In fact, performance in the outdoor scenes was so bad I ended up playing the game in 1920x1200 with some of the shadow, lighting and detail settings turned down. Even then I hit some truly horrendous locations that brought the FPS down below 5.

    I'd be seriously impressed with any system that allows you to walk around the opening town fair scene with everything maxed out at 2560x1600.
