The Next Step: 3D

Moving on, NVIDIA’s ace in the hole here is clearly 3D Vision Surround, a feature that AMD is still months away from being able to match. By combining their existing 3D Vision technology with NVIDIA Surround, NVIDIA can offer ultra-widescreen 3D, and having seen it in person at CES we’ll be the first to profess that it definitely looks impressive. With respect to 3D Vision there’s nothing new here – it’s just the same glasses now looking at more than 1 monitor – but it’s a natural extension of the technology.

For those of you interested in the nuts & bolts of how 3D Vision Surround will work, NVIDIA also released some additional technical details on the feature. With 3D Vision Surround NVIDIA is faced with a great deal of rendering to do: not only do they need to render a very large frame to cover 3 monitors, but then they need to render it again for the other eye. In doing this, they have taken an interesting approach to dividing up work – this image from their press kit pretty much says it all:

In short, NVIDIA has opted to stick with Alternate Frame Rendering while having a single GPU render both the left-eye and right-eye versions of any individual frame, rather than assigning one GPU to each eye. It’s truly alternate frame rendering rather than alternate eye rendering. Under normal circumstances having the same GPU render two images in a row would increase input lag, but when it comes to 3D Vision there’s no penalty, since the second image represents the same game state as the first image, meaning the pre-rendered frame count isn’t actually higher as it would initially appear.
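
To make the difference concrete, here is a minimal sketch – our own illustration, not NVIDIA’s code – of how frames might be distributed across two GPUs under the scheme described above:

    # Hypothetical illustration of true alternate-frame rendering in stereo:
    # both eye views of a frame share one game state, so they stay on one GPU,
    # and whole frames alternate between GPUs.
    def schedule_frames(num_frames, num_gpus=2):
        assignments = []
        for frame in range(num_frames):
            gpu = frame % num_gpus            # alternate whole frames across GPUs
            for eye in ("left", "right"):     # the same GPU renders both eye views
                assignments.append((frame, eye, gpu))
        return assignments

    for frame, eye, gpu in schedule_frames(4):
        print(f"frame {frame} ({eye} eye) -> GPU {gpu}")

The alternative – alternate eye rendering – would instead pin each eye to a fixed GPU (gpu = 0 for left, gpu = 1 for right), forcing both GPUs to track the same game state for every frame.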

Meanwhile 3D Vision Surround also puts further restrictions on the hardware compared to NVIDIA Surround. A big difference will of course be performance, due to rendering another image for the 3D effect, but there’s also the matter of monitors. For NVIDIA Surround the monitor requirements are analogous to Eyefinity: 3 monitors at the same resolution, refresh rate, and sync polarity. For 3D Vision Surround however, monitors must be more than similar: they must be identical. This is because 3D Vision is heavily reliant on V-sync timing to match up each frame with the blocking of the correct eye, and different monitors can have slightly different refresh timings even though they operate at the same nominal refresh rate. As a result all 3 monitors must be the same to ensure they all refresh at the exact same moment.

 

LCD Monitor Requirements
NVIDIA Surround: Similar monitors – matching resolution, sync polarity, and a 60Hz refresh rate
3D Vision Surround: Identical monitors with a 120Hz refresh rate
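
To illustrate why “the same refresh rate” isn’t good enough, here is a toy simulation – with purely illustrative numbers of our own, not measured values – of two nominally 120Hz panels whose real refresh periods differ by a tenth of a microsecond. The phase error accumulates on every refresh:

    # Two nominally 120Hz panels whose actual periods differ very slightly;
    # the timing drift grows every refresh until the panels are half a frame
    # out of phase, i.e. showing opposite eyes. Illustrative numbers only.
    period_a = 1.0 / 120.0        # seconds per refresh, panel A
    period_b = period_a + 1e-7    # panel B runs imperceptibly slower

    for refresh in range(1, 120 * 60 * 10 + 1):    # ten minutes of refreshes
        drift = refresh * (period_b - period_a)
        if drift >= period_a / 2:                  # half a refresh out of phase
            print(f"half a frame apart after {refresh} refreshes "
                  f"(~{refresh * period_a:.0f} seconds)")
            break

Under this assumed drift rate the two panels would be presenting opposite eyes in under six minutes, which is why NVIDIA requires identical panels rather than merely matching specifications.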

The other interesting quirk when it comes to 3D Vision Surround and monitors is portrait orientation. For NVIDIA Surround, NVIDIA holds parity with AMD straight down to the support of landscape and portrait orientations. But with 3D Vision, horizontal linear polarization comes into play: because both the monitor and the glasses are polarized – for glare reduction and image blocking respectively – they have to be properly aligned. Anyone who has tilted their head when viewing 3D through a linear system has seen what happens if the screen and glasses are not aligned: the polarization blocks the entire image. As a result 3D Vision Surround is not currently usable in portrait mode when used in conjunction with an LCD monitor – only projectors are supported.
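
The underlying optics is the standard Malus’s law relation: the intensity transmitted through two linear polarizers offset by an angle θ falls off as the square of the cosine of that angle. Rotating a monitor 90° into portrait rotates its polarization 90° relative to the glasses, and transmission drops to zero:

    I = I_0 \cos^2\theta, \qquad I(0^\circ) = I_0, \qquad I(90^\circ) = I_0 \cos^2(90^\circ) = 0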

Last but not least, there’s the matter of software. While NVIDIA is 9 months behind AMD overall when it comes to triple-monitor gaming, they’re starting off in a better position than AMD did. It wasn’t until March of this year that AMD delivered on bezel correction for Eyefinity; meanwhile NVIDIA is launching with it today. Even on this timeline NVIDIA is still behind AMD, but with this taken into account they’re not as far back as it would first appear. Grouping groupies may be disappointed however – while we don’t have the software in hand to confirm this, it doesn’t look like NVIDIA has any monitor grouping features at this time.
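
For those unfamiliar with bezel correction, the idea is simple arithmetic: the driver renders a surface wider than the panels can actually show, hiding the extra columns of pixels “behind” the bezels so that objects crossing from one monitor to the next stay geometrically continuous. A quick sketch with hypothetical numbers – the real compensation value is calibrated per setup, not fixed:

    # Back-of-the-envelope bezel correction. All values are illustrative
    # assumptions; the actual hidden-pixel count is tuned for each setup.
    panel_width_px  = 1920          # horizontal resolution of each panel
    panel_height_px = 1080
    num_panels      = 3
    bezel_comp_px   = 120           # hidden pixels per monitor-to-monitor gap

    visible_width   = panel_width_px * num_panels
    corrected_width = visible_width + bezel_comp_px * (num_panels - 1)

    print(f"visible surface:  {visible_width} x {panel_height_px}")
    print(f"rendered surface: {corrected_width} x {panel_height_px}")
    print(f"extra pixels per frame: {(corrected_width - visible_width) * panel_height_px}")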

First Thoughts

Without the software in hand there’s not much more we can say about NVIDIA Surround and 3D Vision Surround at this time. We are of course interested in the performance of NVIDIA’s solution, not only in comparison to AMD’s Eyefinity, but also in comparing the GTX 200 series to the GTX 400 series and in seeing the performance hit of moving from NVIDIA Surround to 3D Vision Surround. Teething issues will also bear watching, as this is NVIDIA’s first beta driver: we already know that GTX 200 series 3-way SLI isn’t supported, and that anti-aliasing modes above 2x are unsupported under 3D Vision Surround – both things we would hope to see NVIDIA fix down the line.

Perhaps the best news for the moment though is that this should help to further legitimize the concept of triple-display gaming with game developers. While it’s not a difficult technology to work with, having only 1 GPU manufacturer support it made it yet another manufacturer-specific feature. With NVIDIA on board there is now further incentive for developers to take the technology into consideration. Since the biggest thorn in the side of triple-display gaming continues to be the lack of proper aspect ratio support, any progress in converting developers will benefit both sides.

In the meantime, stay tuned for our full review of NVIDIA’s 3D Vision Surround later this month.

Comments

  • Wayne321 - Tuesday, June 29, 2010 - link

    Great, more competition = more innovation. I'm still waiting for quality 120Hz LCDs though, for a non-3D upgrade.
  • Etern205 - Tuesday, June 29, 2010 - link

    Nvidia needs 2 video cards to get 3 screens running while ATi can do the same thing with one (minus the 3D).
  • Death666Angel - Tuesday, June 29, 2010 - link

    They can do 3D, in various formats, for example:
    http://www.engadget.com/2010/06/26/sapphire-makes-...
    Stupid nVidia bias.
  • Heatlesssun - Tuesday, June 29, 2010 - link

    While technically true, there are a LOT of caveats to this. First is the resolution: I'm running 5760x1200 and that's really a LOT of pixels to push through only one card. I'm running 3 480s and they murderize a single 5870 at this resolution. So while you need two cards, you WANT two cards at these resolutions unless you are willing to give up a LOT of eye candy.
  • B3an - Sunday, July 4, 2010 - link

    I'd also like to point out that the 1GB 5870 does not have enough RAM for gaming on multiple displays at this kind of resolution, either.
    It can cope with older games of course, but on my single 2560x1600 display I can run out of RAM with my 5870s in some games made within the last 3 years, though it usually requires some level of AA. When you hit the RAM limit you go from perfectly smooth into single-digit FPS.

    If you're playing at a higher res I believe two 480s would be best overall. I know ATI has 2GB 5870s, but they are not as fast.
  • wiak - Monday, July 5, 2010 - link

    AMD can do 6 screens on 2x standard Eyefinity cards, or 12 screens on 2x Eyefinity6 cards :P

    Didn't AMD show a Linux-based desktop running a flight sim that had 24 screens? :D

    That must have been 4x Eyefinity6 cards.

    Yup, here it is:
    http://www.youtube.com/watch?v=N6Vf8R_gOec
  • Earballs - Tuesday, June 29, 2010 - link

    Same boat here. Give me high resolution 120Hz IPS LCD please. I can't upgrade my display in good conscience until then.
  • james.jwb - Tuesday, June 29, 2010 - link

    Exactly the same feeling here. IPS 120Hz, hurry!
  • PPalmgren - Tuesday, June 29, 2010 - link

    Aren't IPS panels already problematic regarding input lag? I want black accuracy as much as you do, but it's not worth the atrocious input lag problems I've experienced on my panels. I have a good 24'' TN on the left now for gaming; on the right I have my expensive 24'' IPS that I'll never play a game on again.

    Considering the main function of this type of system is gaming (what else could you use 3D surround for?), I'd say fixing the input lag issues would take precedence, especially if you're playing a game that requires fast reaction times and accuracy.
  • PPalmgren - Tuesday, June 29, 2010 - link

    Oops... disregard that post. I'm an idiot; it's a PVA panel I have.
