Philips has started sales of its Momentum 436M6VBPAB ultra-high-definition gaming LCD, one of the world’s first shipping monitors to obtain the DisplayHDR 1000 certification. Pricing of the product varies from country to country and from store to store, but in general its retail price is in line with the rather moderate sub-$1,000 MSRP announced a couple of months ago.

The Philips Momentum 43-Inch at a Glance

The Philips Momentum 436M6VBPAB is based on a 43-inch 8-bit + FRC MVA panel featuring a 3840×2160 resolution, 720 – 1000 nits brightness (typical and peak), a 4000:1 contrast ratio, a 4 ms response time, a 60 – 80 Hz refresh rate (normal and overclocked), 178°/178° viewing angles, and so on (check out the full specs in the table below). A major selling point of the display is its Quantum Dot-enhanced backlighting, which enables it to cover an above-average 97.6% of the DCI-P3 color gamut as well as 100% of the sRGB color range. The monitor is AMD FreeSync certified, but Philips has not published the minimum FreeSync refresh rate, so it is unclear whether the supported range is wide enough for Low Framerate Compensation (LFC). Even if the LCD’s FreeSync range falls short of what hardcore gamers might want, having a dynamic refresh rate technology on a 43-inch gaming monitor is still better than not having it at all.
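
For reference, AMD's Low Framerate Compensation is generally described as requiring a display's maximum refresh rate to be at least roughly twice its minimum FreeSync rate. Below is a minimal sketch of that back-of-the-envelope check; the 2x ratio and the candidate minimum refresh rates are illustrative assumptions, since Philips has not published the actual floor.

```python
# Rough LFC eligibility check. LFC is commonly described as needing the panel's
# maximum refresh rate to be at least ~2x its minimum FreeSync rate (assumed ratio).
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.0) -> bool:
    return max_hz >= ratio * min_hz

MAX_HZ = 80  # overclocked maximum from the spec sheet

# Philips has not published the minimum, so these candidates are hypothetical.
for candidate_min in (30, 40, 48, 56):
    print(f"{candidate_min}-{MAX_HZ} Hz -> LFC possible: {supports_lfc(candidate_min, MAX_HZ)}")
```

With a hypothetical 40 Hz floor, the 80 Hz ceiling would only just qualify, which is why the unpublished minimum matters.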

The 43-inch Philips Momentum was the industry’s first display to get the DisplayHDR 1000 logo from VESA (the second monitor to get the badge was the ASUS ROG Swift PG27UQ, which is now also shipping). This means that it complies with VESA’s rather strict requirements for brightness (a 600-nit full-screen long-duration minimum and a 1000-nit full-screen flash minimum) and black levels (a corner maximum of 0.05 nits and a tunnel maximum of 0.1 nits). The black-level limits are particularly hard to hit even on a VA panel and all but require local dimming. Philips has not published any details here, but reports elsewhere suggest the company is using a 32-zone edge-lit dimming system.
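
To illustrate why those black-level limits effectively force some form of zone dimming, here is a minimal sketch that compares a hypothetical set of measurements against the DisplayHDR 1000 figures quoted above; the sample numbers are invented for the example and are not review data for this monitor.

```python
# DisplayHDR 1000 limits quoted in the article (all values in nits / cd/m^2).
DISPLAYHDR_1000 = {
    "min_sustained_full_screen": 600,   # long-duration full-screen brightness
    "min_flash_full_screen": 1000,      # short "flash" full-screen brightness
    "max_corner_black": 0.05,           # corner black-level ceiling
    "max_tunnel_black": 0.10,           # tunnel-test black-level ceiling
}

def meets_displayhdr_1000(m: dict) -> bool:
    """Check a set of measurements against the quoted limits."""
    return (m["sustained_full_screen"] >= DISPLAYHDR_1000["min_sustained_full_screen"]
            and m["flash_full_screen"] >= DISPLAYHDR_1000["min_flash_full_screen"]
            and m["corner_black"] <= DISPLAYHDR_1000["max_corner_black"]
            and m["tunnel_black"] <= DISPLAYHDR_1000["max_tunnel_black"])

# Hypothetical measurements, not actual data for the 436M6VBPAB.
sample = {"sustained_full_screen": 720, "flash_full_screen": 1050,
          "corner_black": 0.04, "tunnel_black": 0.08}
print(meets_displayhdr_1000(sample))  # True for this made-up sample

# Without dimming, a 4000:1 native panel driven to 1000 nits would sit at
# roughly 1000 / 4000 = 0.25 nits of black, far above the 0.05 nit corner limit.
print(1000 / 4000)  # 0.25
```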

The Philips 436M6VBPAB has four display inputs: 1x DisplayPort 1.2, 1x Mini DisplayPort 1.2, 1x HDMI 2.0, and 1x USB Type-C that can be used both for display connectivity and as an upstream port for the built-in USB 3.0 hub. As expected from an ultra-large LCD, the unit supports Picture-in-Picture and Picture-by-Picture from two sources. As for audio, the display has a 3.5-mm audio input and a 3.5-mm audio output, as well as two built-in 7 W speakers carrying the DTS Sound badge. Finally, the 43-incher comes with a remote control that can operate the monitor as well as other devices connected over HDMI (e.g., media players, game consoles, etc.), which is particularly handy given that a display this large will clearly be used for watching content.

For more details on the Philips Momentum 43-inch monitor, check out our original coverage of the product; here we will focus on the subject of this news story: availability and pricing.

Pricing and Availability

The Philips Momentum 436M6VBPAB is currently available from Amazon in the U.S., Germany, France, Spain, and Japan. Since the product is rather unique and probably in high demand, its prices at Amazon in Europe seem somewhat inflated. The good news is that a number of stores in Austria, Germany, Poland, and the Nordic countries are selling the 43-inch gaming LCD (or at least taking pre-orders for it) at its €799 MSRP or even below that.

Pricing and Availability of the Philips Momentum 436M6VBPAB
Retailer      Country    Local Price    Equivalent in USD
Amazon        U.S.       $1,000         $1,000
Amazon        Germany    €990           $1,159
Amazon        France     €1,081         $1,265
Amazon        Spain      €1,081         $1,265
Amazon        Japan      ¥106,205       $959
MediaMarkt    Germany    €869           $1,017
Otto          Germany    €790           $925
Saturn        Germany    €869           $1,017
MediaMarkt    Austria    €799           $935
ProShop       Austria    €805           $942
Saturn        Austria    €799           $935
Ale.pl        Poland     €727           $850
Zizako        Poland     €738           $864
Komplett      Denmark    6,490 kr.      $1,018
Komplett      Finland    €754           $882
Komplett      Sweden     7,790 kr.      $870
Arvutitark    Estonia    €720           $843
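
The USD figures in the table are straightforward spot conversions. As a minimal sketch of the arithmetic, the snippet below uses approximate mid-2018 exchange rates; these rates are assumptions for illustration rather than the exact ones used above.

```python
# Approximate mid-2018 exchange rates, assumed for illustration only.
USD_PER = {"EUR": 1.17, "DKK": 0.157, "SEK": 0.112, "JPY": 0.00903, "USD": 1.0}

def to_usd(amount: float, currency: str) -> int:
    """Convert a local price to U.S. dollars at the assumed rate."""
    return round(amount * USD_PER[currency])

print(to_usd(799, "EUR"))     # ~935, matching the MediaMarkt Austria listing
print(to_usd(6490, "DKK"))    # ~1019, close to the Komplett Denmark listing
print(to_usd(106205, "JPY"))  # ~959, matching the Amazon Japan listing
```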

Since the Philips 436M6VBPAB is sold not only by Amazon in the U.S. and large retail chains like MediaMarkt and Saturn in Europe, but also by smaller retailers, it is evident that the product is available worldwide at price points that do not really bite. Apparently, Philips (just like ASUS, MSI, Samsung, and NVIDIA) believes that demand for large gaming-grade displays is about to skyrocket, and it has a product that offers premium features at a moderate price.

Philips Momentum 43" 4K HDR display with Ambiglow
  436M6VBPAB
Panel 43" MVA
Native Resolution 3840 × 2160
Maximum Refresh Rate 60 Hz (normal)
80 Hz (overclocked)
Response Time 4 ms GtG
Brightness 720 cd/m² (typical)
1000 cd/m² (peak)
Contrast 4000:1
Backlighting LED with quantum dots
Viewing Angles 178°/178° horizontal/vertical
Aspect Ratio 16:9
Color Gamut 100% sRGB/BT.709
97.6% DCI-P3
HDR HDR10
DisplayHDR Tier 1000
Dynamic Refresh Rate Tech AMD FreeSync
? - 80 Hz
Pixel Pitch 0.2479 mm
Pixel Density 102 PPI
Inputs 1 × DisplayPort 1.2
1 × Mini DisplayPort 1.2
1 × HDMI 2.0
1 × USB Type-C
Audio 3.5 mm input/output
2 × 7 W DTS Sound speakers
USB Hub 2 × USB 3.0 Type-A connectors
1 × USB 3.0 Type-C input
VESA Mount 200 × 200 mm
MSRP Europe: €799
UK: £699
US: $799 without VAT (unconfirmed)
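
The pixel pitch and pixel density in the table follow directly from the panel’s diagonal and resolution; the short sanity check below reproduces them, assuming the nominal 43-inch diagonal (the exact panel dimensions may differ slightly).

```python
import math

# Panel geometry taken from the spec table above.
H_PIXELS, V_PIXELS = 3840, 2160
DIAGONAL_IN = 43.0  # marketed diagonal, assumed exact for this check

diagonal_pixels = math.hypot(H_PIXELS, V_PIXELS)   # ~4406 px across the diagonal
ppi = diagonal_pixels / DIAGONAL_IN                # ~102.5 pixels per inch
pitch_mm = 25.4 / ppi                              # ~0.2479 mm per pixel

print(f"{ppi:.1f} PPI, {pitch_mm:.4f} mm pixel pitch")  # 102.5 PPI, 0.2479 mm
```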

54 Comments

  • xype - Tuesday, July 3, 2018 - link

    He can’t play the games he wants at the settings he wants on 4K, and you’re telling him he’s "wrong" and stupid?

    You’re a retard.
  • JoeyJoJo123 - Wednesday, July 4, 2018 - link

    >He can’t play the games he wants at the settings he wants on 4K

    Here's the thing, idiot, there's three people in this situation.
    1) The graphics card manufacturer.
    2) The software developer.
    3) You, the one playing games.

    For the graphics card manufacturers, they're working closely with fab companies like TSMC for products on future process nodes before it's even out in the public. They're inherently limited by what TSMC and current technology can manufacture. They're inherently limited by the pricing and availability of memory they can interface on the GPU. They're inherently limited by thermodynamics and how much power their chip can consume at the process node they're stuck at and pushing the envelope too much can burn them. AMD Fury GPUs weren't particularly well received since their forward-thinking HBM got too expensive over time and the power consumption was high compared to competing products, outside the silicon's ideal operating efficiency. You can “push the silicon” harder, but it doesn’t always make a better product.

    Yes, let’s blame the damn GPU manufacturers for not getting their shit together. Don’t they know GPUs operate on magic? They’re the reason why “4k 60hz gaming™” isn’t possible.

    For software developers, they’re under time and budget constraints to make a product and ship it out the door to meet investor and consumer demands. Whether it’s the 27th Call of Duty or a completely original game, a publishing company sunk money to develop it and they’re looking to make it back with game sales or support down the line from expansions/DLC/microtransactions, or whatever cost model(s) they employ. Ultimately early design decisions with the game inherently limit what future patches can rectify down the line (in terms of optimization). It’s not always sensible to go back and optimize it so well that the game performs at 4k 60hz smoothly if all that work won’t result in sales that justify that extra development time; the game was already produced and sold and the bulk of its lifetime sales have already passed in most cases. 4k monitor users are in the utter minority anyways.

    Ultimately, some blame falls on the software developer, and they should’ve employed better design decisions to make more optimal use of computer resources to make a game that’s playable at 4k 60hz from the beginning, but development costs doesn’t make this feasible, so the best they can really do is push those lessons learned into future titles.

    And then there’s you, the “entitled gamer™” that likes to point the finger and blame other people, when it’s a matter of PEBCAK. You aren’t a hardware manufacturer, no amount of whining makes process node shrinks happen faster. You aren’t a software developer, no amount of whining makes games look prettier and operate in a more optimized manner. You only have control of two things here:

    1) Buy better/faster hardware. This costs $$$.
    2) Tune your game settings according to your system’s specs to get the best balance between visuals and performance.

    You don’t get to whine on Anandtech article comments without being called out for being stupid for not realizing these facts. If you’re too poor to afford a better GPU, hit up the spam bot in the comments section with the totally legitimate pyramid scheme they have going on to afford a 1080Ti. If you already have a 1080Ti + 8700k + exotic cooling and overclocks, then you’ll have to make compromises on settings, because the only thing you have in YOUR CONTROL are the amount of money you spend on entertainment and the performance you can eke out of your system when enjoying your games.

    And again, I don’t give a shit if I come across as rude. Facts are facts. Deal with it.
  • xype - Tuesday, July 10, 2018 - link

    Your "attacking the poor GPU manufacturers" argument is a straw man. So you’re still a retard, because you’re completely missing the point. Go eat some boogers or play with your poop a bit to calm down.
  • close - Wednesday, July 4, 2018 - link

    @JoeyJoJo123, JoeyBoy, your comments don't make sense. People don't care how YOU like to play games and what you're willing to sacrifice. They want to play them like THEY like it.

    Pointing out that they can drop some settings is helpful. Insulting them for not wanting what you want is... not smart
  • edzieba - Wednesday, July 4, 2018 - link

    "Uhm. That’s like him saying a car isn’t fast enough to drive on a highway at reasonable speeds and you going "Idiot, you just have to drive it at 30mph!""

    To use the car analogy, it would be like loading every car up to its maximum rated operating load, then complaining it doesn't hit the maximum possible acceleration.

    If you want to run at 4k, maybe just ease back a bit from the visually-indistinguishable-but-I-need-everything-at-11 "Ultra" settings to a more reasonable 'high' (or even 'medium' if it doesn't hurt your 1337 reputation too badly) and performance will be perfectly fine.

    Game developers will always scale target performance to what GPUs can provide. As cards get faster, newly released games will get more graphically complex in lockstep.
  • Bizwacky - Monday, July 9, 2018 - link

    >Game developers will always scale target performance to what GPUs can provide. As cards get faster, newly released games will get more graphically complex in lockstep.

    Is that really true? Modern high-end graphics cards can easily push basically any game at 1080P ultra at 150+ FPS, and 95%+ of gamers are at that resolution or below. I don't think developers really optimize for the highest end hardware, outside of certain edge cases.

    Obviously new games get more and more graphically complex, but I'd bet that has more to do with the average gamer's hardware getting faster. I.e. what are laptops from the last couple years capable of? Or for many games, what are consoles capable of? As TVs and consoles move to 4k, I think we'll see more and more games optimized to run at 4k, with not too much thought put in to what an 1180TI can run.
  • xype - Tuesday, July 10, 2018 - link

    "If you want to run at 4k, maybe just easy back a bit from the visually-indistinguishable-but-I-need-everything-at-11"

    It’s not "run at 4K", it’s "run at 4K with the settings I like". Both perfectly fine as a personal preference, the former being achievable today, the latter less so.

    "I’d like ham and eggs." — "We have no ham, you should get the eggs only." — "Well I won’t have the eggs then." — "You complete imbecile, it’s both with eggs, is it not?!"
  • cocochanel - Tuesday, July 3, 2018 - link

    You're absolutely right. But equally guilty are game developers who throw around so many graphics options that one needs a degree in CS to understand them.
    Performance can be affected by so many things, like whether it's the CPU doing the physics or the GPU.
    Throwing the physics or AI at the GPU may overload it.
    Again, game developers keep this info to themselves.
    So, how is the average user supposed to "optimize" their game?
  • GreenReaper - Tuesday, July 3, 2018 - link

    Honestly, that's the developer's job. A good development team will usually spend time on figuring out the ideal settings, but many don't and punt to the end-user.
  • Holliday75 - Tuesday, July 3, 2018 - link

    I click on the option that says "Low", "Medium" or "high". Usually works.
