Seeing the Future: DisplayPort 1.2

While Barts doesn’t bring a massive overhaul to AMD’s core architecture, it’s a different story for all of the secondary controllers contained within Barts. Compared to Cypress, practically everything involving displays and video decoding has been refreshed, replaced, or overhauled, making these feature upgrades the defining change for the 6800 series.

We’ll start on the display side with DisplayPort. AMD has been a major backer of DisplayPort since it was created in 2006, and in 2009 they went as far as making DisplayPort part of their standard port configuration for most of the 5000 series cards. Furthermore, for AMD DisplayPort goes hand-in-hand with their Eyefinity initiative: because DisplayPort doesn’t require an independent clock generator for each monitor, AMD can efficiently drive 6 monitors from a single card.

So with AMD’s investment in DisplayPort it should come as no surprise that they’re already ready with support for the next version of DisplayPort, less than a year after the specification was finalized. The Radeon HD 6800 series will be the first products anywhere shipping with DP1.2 support – in fact AMD can’t even call it DP1.2 Compliant because the other devices needed for compliance testing aren’t available yet. Instead they’re calling it DP1.2 Ready for the time being.

So what does DP1.2 bring to the table? On a technical level, the only major change is that DP1.2 doubles DP1.1’s bandwidth, from 10.8Gbps (8.64Gbps video) to 21.6Gbps (17.28Gbps video); or to put this in DVI terms, DP1.2 will have roughly twice as much video bandwidth as a dual-link DVI port. It’s this doubling of DisplayPort’s bandwidth, along with some newly defined standards, that enables DP1.2’s new features.
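The arithmetic behind these figures can be sketched quickly. The per-lane rates (HBR at 2.7Gbps for DP1.1, HBR2 at 5.4Gbps for DP1.2) and the 8b/10b line coding that leaves 80% of the raw rate for video come from the DisplayPort specifications rather than this article, so treat this as an illustrative back-of-the-envelope calculation:

```python
# Sketch of where the DisplayPort bandwidth figures come from.
# DP1.1 uses HBR signaling (2.7 Gbps/lane); DP1.2 adds HBR2 (5.4 Gbps/lane).
# Both use 4 lanes, and 8b/10b line coding leaves 80% of the raw rate for video.

def dp_bandwidth(gbps_per_lane, lanes=4, coding_efficiency=0.8):
    """Return (total, video-usable) link bandwidth in Gbps."""
    raw = gbps_per_lane * lanes
    return raw, raw * coding_efficiency

dp11_total, dp11_video = dp_bandwidth(2.7)  # ~10.8 and ~8.64 Gbps
dp12_total, dp12_video = dp_bandwidth(5.4)  # ~21.6 and ~17.28 Gbps
```

The doubling falls directly out of the per-lane rate: everything else (lane count, coding overhead) is unchanged between the two versions.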

At the moment the feature AMD is touting the most with DP1.2 is its ability to drive multiple monitors from a single port, which relates directly to AMD’s Eyefinity technology. DP1.2’s bandwidth upgrade means that it has more than enough bandwidth to drive even the largest consumer monitor; more specifically, a single DP1.2 link has enough bandwidth to drive 2 2560 monitors or 4 1920 monitors at 60Hz. Furthermore, because DisplayPort is a packet-based transmission medium, it’s easy to expand its feature set, since devices only need to know how to handle packets addressed to them. For these reasons multiple display support was canonized into the DP1.2 standard under the name Multi-Stream Transport (MST).
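These capacity claims are easy to sanity-check with a rough pixel-rate calculation. Note the assumptions: 2560x1600 and 1920x1200 are taken as the “2560” and “1920” panel resolutions, and blanking intervals and packet overhead are ignored, so real streams need somewhat more bandwidth than this:

```python
# Back-of-the-envelope check of the MST capacity claims. Blanking intervals
# and packet overhead are ignored; panel resolutions are assumed.

DP12_VIDEO_GBPS = 17.28  # DP1.2's video-usable bandwidth

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate for one display stream, in Gbps."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

two_30in = 2 * stream_gbps(2560, 1600, 60)   # ~11.8 Gbps
four_24in = 4 * stream_gbps(1920, 1200, 60)  # ~13.3 Gbps

# Both configurations fit within a single DP1.2 link:
assert two_30in < DP12_VIDEO_GBPS
assert four_24in < DP12_VIDEO_GBPS
```

Either configuration leaves headroom even before accounting for overhead, which is why AMD can commit to 3 monitors per port.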

MST, as the name implies, takes advantage of DP1.2’s bandwidth and packetized nature by interleaving several display streams into a single DP1.2 stream, with a completely unique display stream for each monitor. Meanwhile on the receiving end there are two ways to handle MST: daisy-chaining and hubs. Daisy-chaining is rather self-explanatory, with one DP1.2 monitor plugged into the next one to pass along the signal to each successive monitor. In practice we don’t expect to see daisy-chaining used much except on prefabricated multi-monitor setups, as daisy-chaining requires DP1.2 monitors and can be clumsy to set up.

The alternative method is to use a DP1.2 MST hub. An MST hub splits up the signal between client devices, and in spite of what the name “hub” may imply, an MST hub is actually a smart device – it’s closer to a USB hub, which actively processes signals, than to an Ethernet hub, which blindly passes everything along. The importance of this distinction is that the MST hub does away with the need to have a DP1.2 compliant monitor, as the hub takes care of separating the display streams and communicating with the host via DP1.2. Furthermore MST hubs are compatible with adaptors, meaning DVI/VGA/HDMI ports can be created off of an MST hub by using the appropriate active adaptor. At the end of the day the MST hub is how AMD and other manufacturers are going to drive multiple displays from devices that don’t have the space for multiple outputs.
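The interleave-and-demux idea behind MST can be modeled in a few lines. To be clear, this is illustrative only: the stream IDs and round-robin interleaving below are invented for the sketch and are not the actual DP1.2 wire format, but they capture why a hub only needs to forward packets addressed to each output:

```python
# Illustrative-only model of MST's packetized transport: the source
# interleaves per-monitor packets into one link, and a hub (or each
# daisy-chained monitor) forwards only the packets addressed to it.
# Packet format (stream_id, payload) is invented for this sketch.

from itertools import zip_longest

def interleave(streams):
    """Merge per-monitor packet lists into one transport stream."""
    return [pkt for group in zip_longest(*streams.values())
            for pkt in group if pkt is not None]

def hub_demux(link, stream_id):
    """An MST hub forwards each packet to the output it is addressed to."""
    return [payload for sid, payload in link if sid == stream_id]

streams = {
    1: [(1, "frame0"), (1, "frame1")],  # packets for monitor 1
    2: [(2, "frame0"), (2, "frame1")],  # packets for monitor 2
}
link = interleave(streams)               # one physical DP1.2 stream
monitor2 = hub_demux(link, 2)            # what monitor 2's output sees
```

Because routing is purely a matter of reading addresses, the monitors downstream of the hub never need to understand DP1.2 themselves – exactly the property the article describes.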

For Barts AMD is keeping parity with Cypress’s display controller, giving Barts the ability to drive up to 6 monitors. Unlike Cypress however, the existence of MST hubs means that AMD doesn’t need to dedicate all the space on a card’s bracket to mini-DP outputs; instead AMD is using 2 mini-DP ports to drive 6 monitors in a 3+3 configuration. This in turn means the Eyefinity6 line as we know it is rendered redundant, as AMD & partners no longer need to produce separate E6 cards now that every Barts card can drive 6 DP monitors. Thus as far as AMD’s Eyefinity initiative is concerned it just became a lot more practical to do a 6 monitor Eyefinity setup on a single card, performance notwithstanding.

For the moment the catch is that AMD is the first company to market with a product supporting DP1.2, putting the company in a chicken & egg position with AMD serving as the chicken. MST hubs and DP1.2 displays aren’t expected to be available until early 2011 (hint: look for them at CES) which means it’s going to be a bit longer before the rest of the hardware ecosystem catches up to what AMD can do with Barts.

Besides MST, DP1.2’s bandwidth has three other uses for AMD: higher resolutions/bitdepths, bitstreaming audio, and 3D stereoscopy. As DP1.1’s video bandwidth was only comparable to DL-DVI, the monitor limits were similar: 2560x2048@60Hz with 24bit color. With double the bandwidth for DP1.2, AMD can now drive larger and/or higher bitdepth monitors over DP; 4096x2160@50Hz for the largest monitors, and a number of lower resolutions with 30bit color. In our conversation with AMD Senior Fellow and company DisplayPort guru David Glen, higher color depths in particular came up a number of times. Although David isn’t necessarily speaking for AMD here, it’s his belief that we’re going to see color depths become important in the consumer space over the next several years as companies look to add new features and functionality to their monitors. And it’s DisplayPort that he wants to use to deliver that functionality.
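The quoted limits line up with the same rough pixel-rate arithmetic as before – again ignoring blanking overhead, so the real figures sit somewhat higher than these:

```python
# Checking the quoted resolution/bitdepth limits against DP1.1's ~8.64 Gbps
# and DP1.2's ~17.28 Gbps of video bandwidth. Blanking overhead is ignored.

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel data rate for one display stream, in Gbps."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

uhd_50hz = stream_gbps(4096, 2160, 50, 24)   # ~10.6 Gbps
deep_color = stream_gbps(2560, 1600, 60, 30) # ~7.4 Gbps (30bit at 2560x1600,
                                             # an assumed example resolution)

assert uhd_50hz < 17.28   # 4096x2160@50Hz fits within DP1.2...
assert uhd_50hz > 8.64    # ...but was out of reach for DP1.1
assert deep_color < 17.28 # 30bit color at lower resolutions fits easily
```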

Along with higher color depths at higher resolutions, DP1.2 also improves on the quality of the audio passed along by DP. DP1.1 was capable of passing along multichannel LPCM audio, but it only had 6.144Mbps available for audio, which ruled out multichannel audio at high bitrates (e.g. 8 channel LPCM 192kHz/24bit) or even compressed lossless audio. With DP1.2 the audio channel has been increased to 48Mbps, giving DP enough bandwidth for unrestricted LPCM along with support for Dolby and DTS lossless audio formats. This brings it up to par with HDMI, which has been able to support these features since 1.3.
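The LPCM example makes the gap concrete: uncompressed audio bitrate is just channels times sample rate times bit depth, and the article's 8-channel 192kHz/24bit example alone is several times DP1.1's audio budget:

```python
# Bitrate of the article's example stream: 8 channels of 192kHz/24bit LPCM.
channels, sample_rate_hz, bits_per_sample = 8, 192_000, 24
lpcm_mbps = channels * sample_rate_hz * bits_per_sample / 1e6  # 36.864 Mbps

assert lpcm_mbps > 6.144  # far too much for DP1.1's audio channel
assert lpcm_mbps < 48     # comfortably within DP1.2's
```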

Finally, much like how DP1.2 goes hand-in-hand with AMD’s Eyefinity initiative, it also goes hand-in-hand with the company’s new 3D stereoscopy initiative, HD3D. We’ll cover HD3D in depth later, but for now we’ll touch on how it relates to DP1.2. With its additional bandwidth, DP1.2 now has more bandwidth than either HDMI 1.4a or DL-DVI, which AMD believes is crucial to enabling better 3D experiences. Case in point: for 3D, HDMI 1.4a maxes out at 1080p24 (48Hz total), which is enough for a full resolution movie in 3D but isn’t enough for live action video or 3D gaming, both of which require 120Hz in order to achieve 60Hz in each eye. DP1.2 on the other hand could drive 2560x1600 @ 120Hz, giving 60Hz to each eye at resolutions above full HD.
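The 120Hz claim checks out with the same rough arithmetic: frame-sequential 3D doubles the refresh rate, and even at 2560x1600 the resulting pixel rate fits DP1.2 but not DP1.1 (blanking overhead ignored as before):

```python
# Frame-sequential 3D needs double the refresh rate: 120Hz total for 60Hz
# per eye. Blanking overhead is ignored in this rough check.

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * bits_per_pixel * refresh_hz / 1e9

gbps_3d = stream_gbps(2560, 1600, 120)  # 2560x1600 at 60Hz per eye

assert gbps_3d < 17.28  # DP1.2 can carry it...
assert gbps_3d > 8.64   # ...DP1.1 (and roughly DL-DVI) could not
```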

Ultimately this blurs the line between HDMI and DisplayPort and whether they’re complementary or competitive interfaces, but you can see where this is going. The most immediate benefit would be that this would make it possible to play Blu-ray 3D in a window, as it currently has to be played in full screen mode when using HDMI 1.4a in order to make use of 1080p24.

In the meantime however the biggest holdup is still going to be adoption. Support for DisplayPort is steadily improving, with most Dell and HP monitors now supporting DisplayPort, but a number of other parties still do not support it, particularly among the cheap TN monitors that crowd the market these days. AMD’s DisplayPort ambitions are still reliant on more display manufacturers including DP support on all of their monitors, and retailers like Newegg and Best Buy making it easier to find and identify monitors with DP support. CES 2011 should give us a good indication of how much support there is for DP on the display side of things, as display manufacturers will be showing off their latest wares.

197 Comments

  • StriderGT - Friday, October 22, 2010 - link

    I agree with you that the inclusion of the FTW card was a complete caving and casts a shadow on anandtech's so-far excellent reputation. I believe the whole motivation was PR related, retaining a workable relationship with nvidia, but was it worth it?!

    Look how ugly this sort of thing can get, they do not even include the test setup... Quote from techradar.com:

    We expected the 6870 to perform better than it did – especially as this is essentially being pitched as a GTX 460 killer.
    The problem is, Nvidia's price cuts have made this an impossible task, with the FTW edition of the GTX 460 rolling in at just over £170, yet competently outperforming the 6870 in every benchmark we threw at it.
    In essence, therefore, all the 6870 manages is to unseat the 5850 which given its end of life status isn't too difficult a feat. We'd still recommend buying a GTX 460 for this sort of cash. All tests ran at 1,920 x 1,080 at the highest settings, apart from AvP, which was ran at 1,680 x 1,050.

    http://www.techradar.com/reviews/pc-mac/pc-compone...
  • oldscotch - Friday, October 22, 2010 - link

    ...where a Civilization game would be used for a GPU benchmark.
  • AnnihilatorX - Friday, October 22, 2010 - link

    It's actually quite taxing on the maps. It lags on my HD4850.

    The reason is, it uses DX 11 DirectCompute features on texture decompression. The performance is noticeably better on DX11 cards.
  • JonnyDough - Friday, October 22, 2010 - link

    "Ultimately this means we’re looking at staggered pricing. NVIDIA and AMD do not have any products that are directly competing at the same price points: at every $20 you’re looking at switching between AMD and NVIDIA."

    Not when you figure in NVidia's superior drivers, or power consumption...depending on which one matters most to you.
  • Fleeb - Friday, October 22, 2010 - link

    I looked at the load power consumption charts and saw the Radeon cards are better in this department and I don't clearly understand your statement. Did you mean that the nVidia cards in these tests should be better because of superior power consumption or that their power consumption is superior in a sense that nVidia cards consume more power?
  • jonup - Friday, October 22, 2010 - link

    I think he meant the nVidia has better drivers but worse power consumption. So it all depends on what you value most. At least that's how I took it.
  • zubzer0 - Friday, October 22, 2010 - link

    Great review!

    If you have the time I would be very happy if you test how well these boards do in Age of Conan DX10?

    Some time ago you included (feb. 2009) Age of Conan in your reviews, but since then DX10 support was added to the game. I have yet to see an official review of the current graphics cards performance in AoC DX10.

    Btw. With the addon "Rise of the godslayer" the graphics in the new Khitai zone are gorgeous!
  • konpyuuta_san - Friday, October 22, 2010 - link

    In my case (pun intended), the limiting factor is the physical size of the card. I've abandoned the ATX formats completely, going all out for mini-ITX (this one is Silverstone's sugo sg06). The king of ITX cases might still be the 460, but this is making me feel a bit sore about the 460 I'm just about to buy. Especially since the 6870 is actually only $20 more than the 6850 where I live and the 6850 is identically priced to the 460. There's just no way I can fit a 10.5 inch card into a 9 inch space. The 9 inch 6850 would fit, but there's a large radiator mounted on the front of the case, connected to a cpu water cooling block, that will interfere with the card. I've considered some crazy mods to the case, but those options just don't feel all that attractive. The GTX460 is a good quarter inch shorter and I'm getting a model with top-mounted power connectors so there's ample room for everything in this extremely packed little gaming box. I'm still kind of trying to find a way to put a 6850 in there (bangs and bucks and all that), which leads to my actual question, namely:

    The issue of rated power consumption; recommended minimum for the 460 is 450W (which I can support), but for the 6850 it's 500W (too much). How critical are those requirements? Does the 6850 really require a 500W supply? Despite having lower power consumption than the 460?! Or is that just to ensure the PSU can supply enough amps on whatever rail the card runs off? If my 450W SFF PSU can't supply the 6850, it really doesn't matter how much better or cheaper it is ....
  • joshua4000 - Friday, October 22, 2010 - link

    let me get this straight, fermi was once too expensive to manufacture due to its huge die and stuff, but its stripped down versions sell for less and outpace newly released amd cards (by a wide margin when looking at the 470)

    amds cheaper to manufacture cards (5xxx) on the other hand came in overpriced once the 460 had been released (if they haven't been overpriced all along...), still, the price did not drop to levels nvidia could not sell products without making a loss.

    amd has optimised an already cheap product price-wise, that does not outperform the 470 or an oced 460 while at the same time selling for the same amount $.

    considering manufacturing and pricing of the 4870 in its last days, i guess amd will still be making money out of those 6xxx when dropping the price by 75% msrp.
  • NA1NSXR - Friday, October 22, 2010 - link

    Granted there have been a lot of advancements in the common feature set of today's cards and improvement in power/heat/noise, but the absolute 3D performance has been stagnant. I am surprised the competition was called alive and well in the final words section. I built my PC back in 7/2009 using a 4890 which cost $180 then. Priced according to the cards in question today, it would slot in roughly the same spot, meaning pretty much no performance improvement at all since then. Yes, I will repeat myself to ward off what is certainly coming - I know the 4890 is a pig (loud, noisy, power hungry) compared to the cards here. However, ignoring those factors 3D performance has barely budged in more than a year. Price drops on 5xxx were a massive disappointment for me. They never came in the way I thought was reasonable to expect after 4xxx. I am somewhat indifferent because in my own PC cycle I haven't been in the market for a card, but like I said before, it's a disappointment in the general market and I wouldn't really agree with the statement that competition is alive and well, at least in any sense that benefits people who weigh performance more heavily in their criteria.
