57 Comments

  • Lord of the Bored - Friday, September 19, 2014 - link

    Is this still using an FPGA dev board, or does nVidia have an ACTUAL PART to offer monitor manufacturers yet? Because really, I can't take GSync seriously until nVidia is willing or able to make silicon for it.
  • MadMan007 - Friday, September 19, 2014 - link

    I don't recall hearing anything about custom silicon, and given the pricing, even as a value-add feature, I would guess there is not any custom silicon yet.

    There might not be any either if AdaptiveSync becomes the open standard for this type of technology. Then it is more likely it will just become a standard feature of monitors, or at least monitors with the appropriate scaler chips, along with DP 1.3. I am not certain whether AdaptiveSync is a requirement for DP 1.3 certification or optional, though.
  • nathanddrews - Friday, September 19, 2014 - link

    Optional for 1.2 and 1.3.
  • Lord of the Bored - Friday, September 19, 2014 - link

    Honestly speaking, the strangeness of such a highly-touted feature being implemented in such a haphazard way suggests to me that nVidia was nowhere near ready to unleash GSync to the world and rushed it out in whatever state they could get it into when something forced their hand. Something like an effort to bring VESA adaptive sync to the desktop, just as a random hypothetical example.
  • nathanddrews - Friday, September 19, 2014 - link

    The historical record implies that it was the other way around. G-Sync was demonstrated live and had a limited product release before Free-Sync, Adaptive Refresh, or A-Sync were ever mentioned by VESA or AMD.
  • Lord of the Bored - Saturday, September 20, 2014 - link

    Which was kind of my point. If they got wind of an effort that was not yet unveiled to the public, they could rush-job a release and steal the glory or ignore it and become known as the guys doing an incompatible re-implementation of someone else's product.

    Apparently, there are still no actual GSync chips available and they are STILL shipping FPGA prototypes on development boards, a year after the initial "release". That is not the mark of a product that was released last year by a real company. The only people who ship FPGAs are hobbyists making devices that MIGHT sell a thousand units.
    nVidia is not a hobbyist working out of a garage. They are a major company with real silicon designers, access to real fabs, enough business volume to make a run of custom silicon cost-effective... and they're shipping FPGAs on a product they launched a year ago. Why?
    I am open to alternative explanations for why a major company would conduct business in the manner of a guy in his garage. But I can't explain it without assuming an external factor forced their hand.
  • nathanddrews - Monday, September 22, 2014 - link

    You're making an awful lot of assumptions based upon nothing, really. The reality is that if you want, you can buy and use G-Sync today. You can't use FreeSync.
  • flatrock - Monday, September 22, 2014 - link

    "The only people who ship FPGAs are hobbyists making devices that MIGHT sell a thousand units."

    I've seen a lot of products shipping in higher volumes than a few thousand units using FPGAs.

    nVidia might be more interested in licensing the technology to monitor manufacturers that could integrate it into their own chipsets rather than selling ASICs that provide the feature.

    Of course they may also be worried that another similar tech will get adopted as the standard, and they would get stuck with a large quantity of ASICs that no one wants to buy. FPGAs are much lower risk.
  • chizow - Friday, September 19, 2014 - link

    Huh? Nvidia developed an ASIC from scratch to do what was not previously available, how does that in any way spell haphazard or unready to launch G-Sync to the world? Do you think FPGAs like this just pop out of holes in the ground overnight?

    This is in comparison to the competition, which announces a competing solution that doesn't do what they say it does, needs to haphazardly force a standard through a notoriously slow standards board, isn't ready to demonstrate their tech for nearly a year after the announcement, isn't going to be compatible with the overwhelming majority of their recent products, and won't be available for purchase in actual products until next year.
  • Gigaplex - Saturday, September 20, 2014 - link

    They didn't make an ASIC, they programmed an FPGA. And yes, they can pop up very quickly. That's the whole point of an FPGA.
  • Lord of the Bored - Saturday, September 20, 2014 - link

    FPGAs don't pop out of holes in the ground, no. They are designed and manufactured by companies like Atmel and Xilinx, then sold to other entities for prototyping and small-scale hardware production.
    FPGAs are not custom parts in any way, shape, or form. You can likely buy the EXACT part nVidia uses for GSync from any electronic component store in the world (I've been unable to find an identifier for WHICH FPGA they're using, just that they use one).
    The ONLY difference is you don't have access to the code nVidia is configuring theirs with.

    There is NO nVidia-developed silicon in a GSync module.

    Which is kinda my point. How many product launches have you seen shipping on a generic FPGA dev board instead of a custom chip? I'mma bet it's close to zero.
    Most of the cost of GSync IS that FPGA. An actual GSync chip would be far cheaper, because it has fewer wasted transistors and no need to be reprogrammable.
    And why would they ship an expensive reprogrammable FPGA on a development board(and ask the end user to install it in their own monitor initially!) instead of shipping dedicated silicon to monitor manufacturers for integration? It makes no sense whatsoever if there isn't some need to bring a product to market before there's an actual product to ship.
  • chizow - Saturday, September 20, 2014 - link

    The point is that Nvidia programmed the ASIC to do what was not previously possible with regard to variable refresh. This takes time and effort; calling it haphazard is careless, plain and simple, especially when the competition's "response" is to make a lot of claims about their own solution that have since been systematically proven to be untrue.

    Once you put in the work to program the FPGA, it is now Nvidia-developed silicon. Because as you have both stated, you can buy an FPGA, but you can't buy one that mimics a G-Sync module without the code Nvidia configured it with.
  • Lord of the Bored - Sunday, September 21, 2014 - link

    Programming an FPGA is not converting it into nVidia-developed silicon. It is still an FPGA. Just one that functions as a prototype of a device that might one day be dedicated silicon. There are differences, most notably that dedicated silicon is a lot cheaper and can't be reprogrammed.

    I find it hard to call a product launch that resembles a guy in his garage anything other than haphazard when it comes from a multi-billion-dollar company.

    I haven't seen any evaluation of VESA Adaptive Sync, systematic or otherwise, and I would really love to, if you'd care to share the link. Because I like the idea that nVidia, as a VESA member, saw that their plans to adapt Adaptive Sync to the desktop were flawed, and rushed prototypes out so they could get a BETTER solution to market before VESA's plan became entrenched. It's certainly a nicer explanation than any I've been able to come up with.
    So... link please?
  • chizow - Sunday, September 21, 2014 - link

    Of course it's Nvidia developed silicon at that point, just as any other FPGA you purchase for an intended purpose is now that maker's silicon. Similarly, something as simple as an EEPROM, once programmed, is now the silicon of that particular BIOS/board maker. Same for an SSD controller, especially in the case of Intel, where they just purchase commodity controllers from SandForce and program them with their own firmware to get the desired results. That's Intel silicon at that point.

    Again, how is it haphazard how they've designed and brought their G-Sync module to market? They invented a solution from scratch, using existing panel interfaces and replaced the entire logic board on an existing panel on the market. Bear in mind, they don't make monitors or monitor logic boards, they make GPUs. They've since worked with at least half a dozen monitor makers to come out with dozens more designs (4-5 of which have hit the market already) that are going to implement G-Sync not even a year after it was announced.

    You seem fixated on them using an FPGA instead of a dedicated ASIC in your haphazard characterization, but if the FPGA is the best tool for the job and gets the job done, why bother changing? Also, if you read some of the more technical background of G-Sync, you might know that it needs to be tuned to each specific panel, so an FPGA being programmable may very well be the reason Nvidia does not go to a more rigid ASIC design.

    You haven't seen an evaluation of VESA Adaptive Sync because it doesn't exist. AMD has made hints they will debut it sometime soon, there may be something at their GPU event next week (9/25), but what we have seen so far has been underwhelming, and I dare say, even haphazard in the way they have presented their "solution" as their latest efforts don't even accomplish what they say it does (dynamic refresh rates synchronized to the GPU).

    http://www.anandtech.com/show/8129/computex-2014-a...
    Latest demo from Computex, showing a fixed, sub-60Hz refresh rate on some unbranded panel.
  • Samus - Monday, September 22, 2014 - link

    To call it not nVidia silicone is like calling a Geforce TSMC silicone. Technically it is manufactured by somebody else, but it's unique to nVidia, their IP, and their programming and quality standards.

    ASIC, FPGA, BGA, whatever you want to call it. If it is programmed for or by nVidia, it's technically their silicone, just not physically.
  • Kuad - Monday, September 22, 2014 - link

    Silicon, as in the element, not silicone, as in breast implants.
    http://www.livescience.com/37598-silicon-or-silico...
  • Lord of the Bored - Wednesday, September 24, 2014 - link

    A program running on someone else's silicon does not make it YOUR custom-developed silicon.
    It is your custom-developed PROGRAM, but just because I run Windows does not mean my CPU was designed by Microsoft.

    And your link doesn't say anything you claim it does. It says the anonymous monitor is an off-the-shelf product, not an unbranded panel, that it was made FreeSync-compatible with nothing more than a firmware change, and that it varies in refresh between 40 and 60 Hz, but YouTube is fixed 30 Hz (which for obvious reasons makes the video demonstration of dubious utility). It also says that is not THE range for FreeSync, but the monitor is allowed to select a compatible range of refresh rates from the much broader range presented by the standard.
    That's what was said. Not that the demo was running at a fixed sub-60Hz refresh.

    I would still genuinely love to see a link to a source where AMD's claims about FreeSync are "systematically proven to be untrue", but... that wasn't it.
  • flatrock - Monday, September 22, 2014 - link

    ASICs are cheaper per unit, but there is a larger up-front cost, and a large set-up cost for making a batch. Because of the up-front costs you need to make an awful lot of them before they become cost effective, and you need to be sure that the design won't need to be tweaked, because they can't be changed.
    With an FPGA the up-front costs are much smaller, and if you need to tweak the design, you can usually do it in circuit. In many cases you can do it as a firmware update while the product remains with the customer.
    As for how many products I have seen shipping with an FPGA? More than I can count. I worked in military/aerospace for quite a while. Smaller volumes than commodity consumer products, and lots of updates and even customization of existing products. We are still talking about more than a few thousand units, but not into the millions.

    G-Sync hasn't hit real mass market volumes yet, so an FPGA makes sense, especially if they managed good volume discounts on the FPGA.
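The trade-off described above comes down to simple break-even arithmetic: the ASIC's one-time engineering (NRE) cost has to be recovered through its lower per-unit cost. A minimal sketch, where every dollar figure is a hypothetical placeholder rather than an actual G-Sync or foundry number:

```python
# FPGA-vs-ASIC break-even sketch; all costs are hypothetical placeholders.

def break_even_units(asic_nre, asic_unit_cost, fpga_unit_cost):
    """Volume at which the ASIC's up-front NRE is paid back by its
    lower per-unit cost versus an off-the-shelf FPGA."""
    return asic_nre / (fpga_unit_cost - asic_unit_cost)

# e.g. $2M of NRE, $5 per ASIC vs. $50 per FPGA (assumed values)
print(break_even_units(asic_nre=2_000_000, asic_unit_cost=5,
                       fpga_unit_cost=50))  # ~44,444 units
```

Below that volume, or while the design is still expected to change, the FPGA is the cheaper and lower-risk option.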
  • chizow - Friday, September 19, 2014 - link

    DP 1.3 spec was just ratified a few days ago, nothing about Adaptive Sync or Variable refresh. The dream that all DP 1.3 monitors will be FreeSync capable is dead (for the few that ever thought this would be the case).
  • Gigaplex - Saturday, September 20, 2014 - link

    It's in DP 1.3, it's just optional.
  • Flunk - Friday, September 19, 2014 - link

    I can't take G-Sync seriously until it's part of the VESA standards... Which it won't be, because they already included Adaptive-Sync (AMD trademark Freesync) in DisplayPort 1.2a. The second displays that support DisplayPort 1.2a come out, G-Sync is done, which is good because, since it's part of the standard, everyone can use it and we'll all get the benefits of this technology.
  • chizow - Friday, September 19, 2014 - link

    But you can take a standard that no one has bothered to bring to market seriously?

    G-Sync is available TODAY with more solutions being introduced daily and the results are impressive, what more do you need to see in order to take it seriously? The competition talks a big game, but they have still yet to perform their big reveal and are only now starting to talk about partnerships to bring their tech to market sometime next year.

    And no, DP 1.2a displays won't spell the end of G-Sync, because the number of G-Sync capable GPUs dwarf the number of DP 1.2a capable GPUs by a greater than 2:1 ratio. G-Sync is supported by all Kepler family and later GPUs, meanwhile DP 1.2a FreeSync is only going to be supported by the handful of GCN 1.1+ GPUs and APUs on the market.
  • Samus - Monday, September 22, 2014 - link

    I have a Philips 144Hz monitor and the TN panel is terrible. All of the monitors I've looked at with 144Hz or "G-Sync" have TN panels, and they all look pretty crappy. Is IPS not capable of refreshing at 144Hz?
  • wolrah - Friday, September 19, 2014 - link

    "The XB280HK will support 3840x2160 at 60 Hz via DisplayPort 1.2"

    So is this saying its 144Hz support is sort of like those cheap "120Hz" 4K TVs where you get either the resolution or the refresh rate, but not both?

    If so what's the maximum resolution for 144Hz mode? Both of my housemates have the 24" 1080p version (though without the G-Sync module installed because for the price it makes more sense to just get a better GPU) and seem to love the higher refresh rate.
  • Flunk - Friday, September 19, 2014 - link

    DisplayPort 1.2 maxes out at 60Hz @ 4K resolutions, so there is no way it could possibly support 144Hz @ max res, which makes the 144Hz mode utterly worthless. If you want that, you'd be better off with a 1080p display with 144Hz support, which would cost less than half what this does.
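The arithmetic behind that ceiling is straightforward. A back-of-the-envelope sketch (raw pixel data only; blanking overhead is ignored, which only makes the 144Hz case worse):

```python
# DisplayPort 1.2 bandwidth sanity check (raw pixel data only; ignores
# blanking overhead, which only makes the high-refresh case worse).

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP12_USABLE_GBPS = 4 * 5.4 * 8 / 10   # 4 lanes x 5.4 Gbps HBR2, 8b/10b coding = 17.28

for hz in (60, 144):
    need = pixel_rate_gbps(3840, 2160, hz)
    verdict = "fits" if need < DP12_USABLE_GBPS else "does not fit"
    print(f"4K@{hz}Hz: ~{need:.1f} Gbps vs {DP12_USABLE_GBPS:.2f} Gbps usable -> {verdict}")
```

4K@60Hz needs roughly 12 Gbps and squeezes under the ~17.3 Gbps DP 1.2 can carry; 4K@144Hz needs close to 29 Gbps and simply does not fit.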
  • know of fence - Friday, September 19, 2014 - link

    This monitor is actually an almost ideal combination of 1080p gaming and 4K for general use. High resolution for still pictures, high framerates and low motion blur for gaming. We can't have both because there simply isn't enough bandwidth, but that's a good thing, because the pursuit of high-resolution gaming is a futile quest that spends whatever GPU advances Moore's Law has left on rendering invisibly small pixels.

    The good thing about the current situation with display interfaces is that they don't create the wrong expectation of 4K@144Hz gaming. Leave 4K to pictures and text sharpening; use it to perfectly scale 1080p and 720p games and video.
  • nathanddrews - Friday, September 19, 2014 - link

    According to the manuals on Acer's website:

    XB280HK is max refresh 60Hz at all resolutions 4K and below.
    XB270H is max refresh of 144Hz at all resolutions 1080p and below.

    Acer's marketing makes it sound like the XB280HK does 4K@60Hz and other resolutions up to 144Hz, but it is not a jack of all trades. You can't change from 4K@60Hz to 1080p@144Hz. Until I see a hands-on review testing out all possible resolutions and refresh rates, I'm going to stick to the Acer literature.
  • ArtForz - Saturday, September 20, 2014 - link

    If the XB280HK is using the same CMI M280DGJ-L30 panel as all the other 28" UHDs (it's the same 1ms, 1000:1 contrast, 300cd/m^2, 8bpc+FRC TN specs, so imo pretty likely), then I strongly doubt it'll do anything > 60Hz.
    According to my info, the only modes the t-con on that panel supports are 3840x2160@30Hz@10bpc over 4-lane v-by-one and 3840x2160@60Hz@10bpc over 8-lane v-by-one.
  • wolrah - Friday, September 19, 2014 - link

    Personally I want the 4K mostly for desktop use, where the higher refresh rate just makes everything look like butter. Gaming-wise I expect to mostly run at 1080p and take advantage of the perfect 2x scaling. I'll be getting a GTX970 before I upgrade my monitor, so older games will probably run fine at native, but I don't expect to run the latest and greatest at 4K with any sort of detail.

    I would just go with a 1440p monitor, but that makes my 2x scaling resolution 720p and that's just too low for a screen that size. I end up dealing with blind idiots running the wrong resolution on nice LCDs all day at work and hate it ("waah I don't want to wear my glasses and it's blurry to me anyways so why not make it blurry for everyone?"), so I refuse to do anything other than native or perfect multiple scaling on my own hardware.

    1080p on the desktop gains me pretty much nothing over my current 1680x1050 20 inchers, and since most 1080p monitors are 22+ inches I'd generally be losing DPI.

    You are correct about the interface bandwidth, which gets me in to the mood for a bit of a rant...

    Is it just me or have the digital display interfaces been really terrible about future-proofing? We had VGA basically unchanged for decades and it ran from 640x480 up through 1200p reliably, and high-quality implementations went beyond that. DVI single link officially could only match VGA at 1200p60, though for some reason dual link had no restrictions on clock rate and thus could be pushed to more than double that if the hardware allowed.

    DisplayPort and the more recent evolutions of DVI via HDMI have always seemed to be just barely ahead of the hardware. We're finally getting higher resolutions after stagnating for a decade, but it seems that every time we end up needing a new generation of interface. For those who primarily care about gaming that's not a big deal, their new monitor will basically require a new GPU as well, but for those of us who want the resolution on the desktop, my current GTX550 would handle Aero just fine even at 5K if it only had a way to actually deliver that signal.

    We need display interfaces to be more like Ethernet, where updates come more on the decade time span but bandwidth goes up 10x each time leaving plenty of headroom for years of increasing use.
  • extide - Friday, September 19, 2014 - link

    The thing is, these are already very b/w intensive interfaces. DP 1.3 is like 30Gbit, and remember most people aren't even using 10Gbit LAN yet! It's just technically difficult or expensive to make interfaces that go much faster. You start running into issues where you need to go optical, or use active powered cables (expensive; see Thunderbolt). Getting ~30Gbit over a fully passive cable is quite a feat, honestly.
  • Gigaplex - Saturday, September 20, 2014 - link

    It's funny you reference ethernet as an example. Gigabit ethernet has been a bottleneck for my home network for years, and there's still no sign of 10Gbit coming to consumers any time soon.
  • Asmodian - Sunday, September 21, 2014 - link

    Yes, 1GbE has been limiting my home networks for several years now too. The same bandwidth block happened with Ethernet. We had pretty ubiquitous 10Mbps, then ~5 years later 100Mbps was cheap, then after another 5 years 1GbE, then... 10 years later 10GbE is just starting to become affordable.

    Recently I finally got 10GbE at home, but only between the server and one client, as switches are still priced and designed for business use. 10GbE can run over passive cables, either over twinax cables, which cost quite a bit (a 5m cable is ~$75), or over 10/100/1000-compatible Cat-6a, but over Cat-6a it uses coded packets of data to reach 10 Gbps. This would not be an issue for a display interface, since the increase in latency is very small, but the coding and decoding increases power use and cost.

    That is one of the funny things about modern computer interfaces; think how many years went by with the first black-and-white and then color TV formats.

    Going higher and higher is harder and harder. I wonder if the market would accept a fiber optic monitor cable at some point in the future? For now 2560x1440@144Hz or 4K@60Hz (very similar bandwidth) seems to be the "cheap cable" limit; next we start adding lossless compression or even "visually lossless" compression.

    I would love DP 1.4 to have 300 Gbps but I would not like needing to pay the prices 100GbE cards/switches go for.
  • Daniel Egger - Sunday, September 21, 2014 - link

    10GBase is actually quite affordable if you plan it wisely and only connect systems that *really* need that much bandwidth. The problem is more finding devices that can handle 10GBase physically and then also on the OS level. We recently decided not to go 10GBase yet on some of our already equipped servers because we couldn't saturate the link enough to make it worth the effort.

    I don't quite get what you're saying about Cat.6a. There's always going to be some form of "encoding" going on, and especially when going over copper wires it absolutely makes sense to have some form of error detection and correction, which is what adds the overhead here.

    Question: If you consider Twinax expensive (and no, it's not considering that you'll have two SFP+ modules included) and don't mind having a fixed non-twisted-pair setup, why don't you use Multi-Mode fibre instead?
  • chrnochime - Sunday, September 21, 2014 - link

    If you really have a need to have 10GbE at home, then obviously you're doing something that should be generating income, at which point all equipment purchased should be tax deductible anyway. If you're still balking at the cost, then obviously you are not making enough money to justify the equipment cost.
  • fade2blac - Friday, September 19, 2014 - link

    I don't get the choice between 4K@60Hz and 1080p@144Hz for a 27-28" size display. I want to upgrade from a 5-year-old 23.5" to a 27" display, but the sweet spot is in between the extremes of these two products. And yet, it is as if display manufacturers are actively avoiding it. I imagine people would jump on a quality monitor with variable refresh that doesn't cost as much or more than the GPU(s) required to drive it.

    Compared to a 23.5" 1080p@60Hz monitor (a very common and affordable display with good PPI at arms length):

    A 27"@1080p is essentially like stretching the image so the PPI drops. I guess the thought is to sacrifice spacial detail (pixel density) for temporal detail (refresh rate). It feels a bit like paying a premium for a compromise.

    A 28"@4k makes the PPI is so much higher (unnecessarily so) that it breaks typical DPI scaling and signal bandwidth places hard limits refresh rates. Not to mention that you'll want to keep high image quality settings to take advantage of that overblown PPI, but then no single GPU can deliver solid frame rates at this resolution. The mainstream enthusiast doesn't have $2k to spend on a monitor and the GPU's to drive it to it's full potential.

    As for variable refresh, if I can regularly push 120+ FPS then I have to think that variable refresh has a greatly diminished effect. And if graphical settings result in frame rates swinging between 40-140 FPS, then I would expect that minimum frame rates would still dominate the "smoothness" of the experience even when using variable refresh (i.e. 144Hz is overkill).

    Finally, there is the disproportionate pricing of display size vs. resolution. Aggressive 4k pricing cuts have helped make those displays less ridiculously expensive starting at about $500-600 while good 1440p displays have been stubbornly stuck around $400-500 for a couple of years. Meanwhile, a solid 23" 1080p IPS display can regularly be found for around $130 or so.

    Is it asking too much for a display that can do the following?
    1) use a good 27" 1440p non-TN panel
    2) support a variable refresh range of at least 24-96 Hz
    3) not lock buyers into a single GPU vendor
    4) cost less than $300 or so

    If you build it...we will buy!
  • SunLord - Friday, September 19, 2014 - link

    1080p and 4K are TV standards, so everyone will focus on them for marketing and for multiple markets, namely monitors and TVs. 1440p is only really used in monitors, so it will likely fade away as 4K panels ramp up. Eventually we will see 1080p on all monitors less than 23" and 4K on everything 23" and bigger. Supporting only 1080p and 4K also means you only need to develop/carry 2 controllers.

    So don't hold your breath on $300 name brand 1440p monitors
  • ArtForz - Friday, September 19, 2014 - link

    If you think a FHD 23.5" at arms length is good PPI, you probably have really long arms or really bad eyesight. ;)
    I'm looking at a 28" UHD at ~70cm distance right now, and for displaying high contrast line drawings I still have to choose between annoying jaggies or annoying edge blur.
    If there were "affordable" (< $600 for TN, < $1k for IPS) 20-24" single stream UHD@60Hz monitors I'd likely get 2 or 3, but sadly no such thing exists yet.
  • fade2blac - Saturday, September 20, 2014 - link

    I do sit just a bit further than 70 cm, more like 80 cm or 32". So maybe just beyond arms reach would be more accurate. As for "good" PPI, "good enough" would be referring to 20/20 equivalent or 60 pixels per degree. This is about exactly the case for a 27" 1440p @ 32" distance.
    http://phrogz.net/tmp/ScreenDens2In.html

    Your use case of high contrast line drawings is likely the sort of thing the new 5K displays are intended for. As another AnandTech article pointed out, "...human vision systems are able to determine whether two lines are aligned extremely well, with a resolution around two arcseconds. This translates into an effective 1800 PPD."
    http://www.anandtech.com/show/7743/the-pixel-densi...
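The 60-pixels-per-degree figure above is easy to sanity-check. A quick sketch under the small-angle assumption that the screen is viewed head-on, using the 27" 1440p @ 32" case from the comment:

```python
import math

# Check the "60 pixels per degree ~= 20/20 acuity" claim for a
# 27" 2560x1440 panel viewed from 32 inches.

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    # pixels covered by one degree of visual angle at the given distance
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi_value

density = ppi(2560, 1440, 27)  # ~109 PPI
print(f"{density:.0f} PPI, {pixels_per_degree(density, 32):.0f} px/deg")  # ~61 px/deg
```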
  • eddman - Friday, September 19, 2014 - link

    I might upgrade from my current 21.5" 1920 x 1080 display, but don't want to go bigger than 23".

    Why does no one seem to be making smaller 4K monitors?

    Technical reasons? Financial?
  • fade2blac - Friday, September 19, 2014 - link

    Do you have a specific reason for wanting 4k at such a small size? Unless you sit about 12-18 inches away from your monitor, I would say 4k is a waste of pixels on such a small screen. Your 1080p 21.5" monitor has ~102 pixels per inch (some AV guides for human visual acuity estimate 108 PPI is enough for a 32" viewing distance, 123 PPI gets you to about 28" away). 4k @ 23.5" jumps the pixel density all the way to ~187 PPI which seems hard to justify. This would put the ideal viewing distance at ~18" which is rather close for desktop use. A better compromise might be maybe 1440p @ 21.5" which would give you ~125 PPI or ~28" ideal viewing distance.
  • fade2blac - Friday, September 19, 2014 - link

    *Correction: 1440p @ 23.5" is ~125 PPI, 1440p @ 21.5" is ~137 PPI
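For what it's worth, those density figures check out. A small sketch that recomputes the PPI values quoted above and the distance at which one pixel subtends one arcminute (taking that as the 20/20 threshold is my assumption):

```python
import math

# Recompute the PPI figures above and the "ideal" viewing distance at
# which one pixel subtends one arcminute (assumed 20/20 acuity threshold).

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

def one_arcmin_distance_in(ppi_value):
    # distance (inches) at which a single pixel spans one arcminute
    return 1 / (ppi_value * math.tan(math.radians(1 / 60)))

for name, w, h, diag in [('1080p @ 21.5"', 1920, 1080, 21.5),
                         ('4K @ 23.5"',    3840, 2160, 23.5),
                         ('1440p @ 21.5"', 2560, 1440, 21.5)]:
    d = ppi(w, h, diag)
    print(f'{name}: ~{d:.0f} PPI, ~{one_arcmin_distance_in(d):.0f}" ideal distance')
```

That gives ~102 PPI / ~34" for the 1080p panel, ~187 PPI / ~18" for the 4K one, and ~137 PPI / ~25" for 1440p at 21.5", in line with the corrected numbers.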
  • eddman - Friday, September 19, 2014 - link

    For gaming. I could get a sharp picture without AA.

    Also, since 4k is 4 times 1920 x 1080, I could set the resolution to 1920 x 1080 in heavy games and it'd still scale well. It look be as nice with 2560 x 1440 display.

    Another reason for wanting a smaller display: gaming on a PC with a big display isn't comfortable, because you sit close to the display.

    The whole picture is in my face and I get disoriented.

    The solution would be to sit further away, but then I can't reach the mouse and keyboard comfortably.

    I sometimes resort to switching to a lower resolution w/o scaling, like 1600 x 900, so that I get a smaller picture.
  • eddman - Friday, September 19, 2014 - link

    @anandtech
    This is 2014. Improve this damn comment system. Is it too much to ask for an edit function?!
    -------------------------------------------

    *It won't be as nice with 2560 x 1440 display.
  • Impulses - Friday, September 19, 2014 - link

    If your mouse/keyboard distance is a problem too it sounds like you might need a new desk as much as a new display... :p
  • eddman - Friday, September 19, 2014 - link

    You raised a good point, actually.

    That's the problem. I can't get a deeper desk (is that how you say it?). I have a very small room.
  • otherwise - Friday, September 19, 2014 - link

    Pretty much no market for it. It's too big for a laptop and most high end desktops these days won't settle for less than 24" or 27".
  • thewhat - Friday, September 19, 2014 - link

    Partly technical, because a lot of software still doesn't have proper scaling. And partly market, because a lot of consumers don't know scaling exists and just assume everything would be tiny.
  • SunLord - Friday, September 19, 2014 - link

    Will see 23" 4k monitors in a few years as they shrink down just like it took time for 1080p monitors to filter down to 23" and smaller when they first came out.
  • BD2003 - Friday, September 19, 2014 - link

    If the ROG Swift can do 1440p/144Hz over DP 1.2, surely this can as well?

    I think I could live with having the choice between 1440p/144Hz and 2160p/60Hz G-Sync, but if its only G-Sync mode is 2160p/60Hz, then that's a real shame.
  • ant6n - Friday, September 19, 2014 - link

    I'd prefer 21:9 as well.
  • jb510 - Friday, September 19, 2014 - link

    4K for $800 with portrait orientation. Awesome! Wait, it's TN. How's that going to work?
  • Centauri0 - Friday, September 19, 2014 - link

    Ugh... all we want is a 27" 1440p or higher IPS with G-Sync! Why oh why can't it be? I can't stand TN, period. Once you go IPS you just can't go back.
  • chizow - Friday, September 19, 2014 - link

    To each his own, but that's not true for everyone. I went from IPS to 120+Hz TN and just can't go back. There are trade-offs, but I prefer crisp, blur-free images in motion over great still images on the desktop and a muddy, blurry mess in motion.
  • Asmodian - Sunday, September 21, 2014 - link

    I went from 2560x1440@110Hz IPS (overclocked) to 2560x1440@144Hz TN with G-Sync and I would not go back either. The low motion blur is amazing even without ULMB on. Of course there is a trade-off going to TN; we need faster refresh rate IPS-type panels.

    However, if you are looking at 4K we are bandwidth limited to 60Hz so IPS would have similar apparent motion blur. Something like a PA328Q with G-sync would be my pick. Of course that would be $2000+.
  • chizow - Sunday, September 21, 2014 - link

    Haha Swift owner? Same here /high five.

    Great monitor, I've had a few SLI G-Sync issues but should be resolved once my 980 arrives this week. :)
  • gwolfman - Wednesday, September 24, 2014 - link

    "...8 bit+HiFRC TN display...and 72% NTSC." Horrible, horrible panel! I can't imagine how ugly this will look for for games/photos/movies. Let's just throw away 30% of your color space... NO thanks!
  • Tempest1232 - Sunday, September 28, 2014 - link

    Is that a mistake in here or is the 4K screen supposed to be 144Hz compatible?
