34 Comments


  • Sivar - Monday, March 19, 2018 - link

    Raytracing is not fundamentally non-realtime. Hardware today is fast enough to do some of the raytracing work that was done with batch rendering in the 90's. There were even 4K demos (4K file size, not resolution) that did real-time raytracing on Pentium 1-class computers. The scenes weren't as complicated as, say, a battle in Warframe, but one I remember from the 90's had about a dozen liquid blobs floating around and merging with each other, as liquid in zero gravity tends to do. Granted, it was hand-tuned assembly language written by a graphics/coding genius, but it shows that realtime RT is possible.
  • HStewart - Monday, March 19, 2018 - link

    Yes, back then graphics, especially in games like Doom, were written directly to the hardware. I was going through my closets and found this old book on the technology.

    https://www.amazon.com/Zen-Graphics-Programming-Ul...
  • bji - Monday, March 19, 2018 - link

    Yes, that's the book commonly referred to as "Abrash". Seminal work.
  • Yojimbo - Tuesday, March 20, 2018 - link

    In the past, it was preferable for games to apply greater computation ability towards more complicated raster techniques rather than to raytracing. Now the industry seems to have decided that it makes sense to start to apply future improvements in computation ability to raytracing instead. This could mean that innovation in raster techniques is slowing down, or that demands from VR are causing a shift in thinking, or both.
  • AndrewJacksonZA - Tuesday, March 20, 2018 - link

    4K demos, huh?

    "Following people who write stuff for old machines and folk who write stuff for new machines means occasional confusion over who is doing what with 4k." - @RetroRemakes
    https://twitter.com/retroremakes/status/9685809015...
  • Kevin G - Monday, March 19, 2018 - link

    I wonder if Imagination's Caustic RT hardware is supported. They have had ray tracing accelerators for awhile now.
  • Alexvrb - Monday, March 19, 2018 - link

    They were ahead of their time... again. Pity. I wish they had fought it out some more in discrete PC graphics. I owned a Kyro I and a Kyro II, and if they had released a Kyro III I might have ended up with one of those too.
  • StevoLincolnite - Monday, March 19, 2018 - link

    They were efficient as well. Shame they lacked support for things like T&L natively in hardware though.

    I had a lot of hope for the Matrox Parhelia at one point as well... And I thought S3 Chrome was going to finally bring a third competitor into the limelight.

    I guess competing against AMD and NVIDIA in the graphics space is a tall order; they have been refining their hardware and software stacks for decades now and have a cadence nailed down.
  • Dragonstongue - Monday, March 19, 2018 - link

    I wonder if MSFT is putting some clever BS "behind the scenes" to make sure that Intel and Nvidia "perform better" instead of being truly hardware/software agnostic. That is, the better a CPU or GPU can handle it, the better it should perform, rather than crappy backhanded ways of making sure someone will always do it "better" while the "others" have to play by different rules. Like tessellation, where Nv pulled a hissy fit and made MSFT give them the "advantage" at the cost of AMD suffering a performance hit, even after AMD/Radeon did all the leg work and $$$$$$$ in supporting it for many years PRIOR.

    Anyways, it will be interesting to see, but how many will actually use it, especially at its "best"? I doubt many will. Devs are "simple" folks, or at least the names attached to the devs want the product out the door ASAP even if incomplete (like EA and all their studios, who want to make the best they can but are short-changed on the time or funding to make it as awesome as it should be).
  • ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Monday, March 19, 2018 - link


    No need to wonder....

    The past will tell you the future!

    It will be a locked down, DRM'd, Proprietary existence where only a rigged game determines who gets to win every single time

    The rest of you will get to lose forever and ever, Amen

    You may not be able to handle the truth......but you can at least TRY!
  • eddman - Monday, March 19, 2018 - link

    @ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್
    Good job making sock puppet accounts, Bullwinkle J Moose.
  • HStewart - Monday, March 19, 2018 - link

    I think NVIDIA decided that the change was worth putting in their hardware. If AMD desires to do so, they could always do it too. More likely NVIDIA's hardware is ready to handle it and AMD's is not.

    I believe the more they put in hardware the better it will be long term.
  • Yojimbo - Tuesday, March 20, 2018 - link

    Neither AMD nor NVIDIA has consumer hardware capable of good acceleration of DXR yet. I am guessing that both will actively support it, the difference being that NVIDIA's compatible hardware will come out soon and AMD's will come out a year or more later. However, if the performance depends a lot on tensor core-like operations, I am guessing NVIDIA will have an advantage even after Navi comes out, because I seriously doubt AMD will have such functionality in Navi.
  • blppt - Monday, March 19, 2018 - link

    I don't think it's that MSFT gave Nvidia some kind of advantage in tessellation; it's just that the GameWorks libraries are tessellation-saturated and Nvidia has superior hardware dedicated to tessellation.

    Now, we can claim that Nvidia saw the one area where their own cards had a clear advantage and designed the GW libraries specifically to exploit that advantage---that's something I think we can all agree on.

    AMD would probably do the same thing if they were the clear market leader---but it's hard to get devs to use a theoretical AMD-advantage library that cripples the market leader's (Nvidia's) cards, so AMD is forced to be as open and architecture-agnostic as they possibly can with their own dev libraries.
  • tamalero - Monday, March 19, 2018 - link

    Pretty sure they are talking about how games use dirty tricks or deliberately unoptimized designs to bog down AMD's hardware (see Crysis 2 and 3, and the maps where the entire sea below the map is tessellated even when it shouldn't be).
  • FreckledTrout - Monday, March 19, 2018 - link

    Microsoft has a vested interest in AMD graphics running DirectX 12 well, since the Xbox One and Xbox One X use AMD GPUs.
  • Yojimbo - Tuesday, March 20, 2018 - link

    Shh, unbelievers are such a buzz kill. Get out of here with your facts and logic.
  • Manch - Tuesday, March 20, 2018 - link

    And Intel, in regards to their "APUs with Radeon".
  • mooninite - Monday, March 19, 2018 - link

    Why is this getting front-page news? Vulkan + raytracing has been a "thing" since 2016. This and the NVIDIA article smell like ads.
  • nevcairiel - Monday, March 19, 2018 - link

    It's not "front page news", it's just a pipeline story, where all such announcements end up.
  • Mr Perfect - Monday, March 19, 2018 - link

    Because DirectX is the industry leader and this might actually get used now. As cool as Vulkan is, the list of games using it is currently 27, moving to 34 if you include games that haven't launched yet. https://en.wikipedia.org/wiki/List_of_games_with_V...
  • inighthawki - Monday, March 19, 2018 - link

    In fairness, this is coming to DX12, which has an equally small number of released titles (though I believe more of them are AAA titles, whereas that Vulkan list includes a lot of indie/mobile titles or re-releases of older games). Nonetheless, this is definitely more newsworthy than a PowerVR-specific proprietary extension to Vulkan which was never used beyond a tech demo in Unreal 4. This actually has the opportunity to get mass adoption of ray tracing as an industry standard in games. I'm quite excited to even try it out myself :)
  • Friendly0Fire - Monday, March 19, 2018 - link

    Because this is the first time raytracing becomes API level. We've been able to do raytracing for years, far before Vulkan was ever imagined, thanks to GPGPU hardware and CUDA/OpenCL/DirectCompute. That's not the novelty.

    The news is that this might herald the start of some kind of hardware accelerated ray tracing in mainstream GPUs. Even today's incredibly powerful and general-purpose GPUs would benefit greatly from that sort of acceleration.
  • Daffy_ch - Monday, March 19, 2018 - link

    Coincidence? OTOY released the very same day their real-time ray tracing engine Octane 4 which will work in Unity and Unreal Engine: https://www.reddit.com/r/nvidia/comments/85k88i/ot...
  • Yojimbo - Tuesday, March 20, 2018 - link

    Any idea if there is a future plan to put this in DirectX 11 or some DirectX 11-type API?
  • Ryan Smith - Tuesday, March 20, 2018 - link

    Nothing has been announced. Nor would I expect anything since the entire programming model for DXR is very much focused on DX12 generic programming.

    "The primary reason for this is that, fundamentally, DXR is a compute-like workload. It does not require complex state such as output merger blend modes or input assembler vertex layouts. A secondary reason, however, is that representing DXR as a compute-like workload is aligned to what we see as the future of graphics, namely that hardware will be increasingly general-purpose, and eventually most fixed-function units will be replaced by HLSL code. The design of the raytracing pipeline state exemplifies this shift through its name and design in the API. With DX12, the traditional approach would have been to create a new CreateRaytracingPipelineState method. Instead, we decided to go with a much more generic and flexible CreateStateObject method. It is designed to be adaptable so that in addition to Raytracing, it can eventually be used to create Graphics and Compute pipeline states, as well as any future pipeline designs."
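[Editor's note: the generic path that quote describes can be sketched in a few lines against the public D3D12 headers. This is a hypothetical, heavily trimmed illustration, not a complete pipeline: the DXIL library blob, payload sizes, and recursion depth are placeholder values, and real code would also supply root signatures and hit groups as further subobjects.]

```cpp
// Sketch of the generic CreateStateObject path described in the quote.
// Windows-only (d3d12.h from a DXR-capable Windows 10 SDK); placeholder values.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12StateObject> CreateRaytracingPSO(
    ID3D12Device5* device, D3D12_SHADER_BYTECODE dxilLib)
{
    // Subobject 1: the compiled HLSL library holding the raygen/miss/hit shaders.
    D3D12_DXIL_LIBRARY_DESC libDesc = {};
    libDesc.DXILLibrary = dxilLib;

    // Subobject 2: payload/attribute sizes shared by all raytracing shaders.
    D3D12_RAYTRACING_SHADER_CONFIG shaderCfg = {};
    shaderCfg.MaxPayloadSizeInBytes   = 4 * sizeof(float); // e.g. an RGBA color
    shaderCfg.MaxAttributeSizeInBytes = 2 * sizeof(float); // triangle barycentrics

    // Subobject 3: maximum TraceRay() recursion depth.
    D3D12_RAYTRACING_PIPELINE_CONFIG pipeCfg = {};
    pipeCfg.MaxTraceRecursionDepth = 1;

    D3D12_STATE_SUBOBJECT subobjects[3] = {
        { D3D12_STATE_SUBOBJECT_TYPE_DXIL_LIBRARY,               &libDesc   },
        { D3D12_STATE_SUBOBJECT_TYPE_RAYTRACING_SHADER_CONFIG,   &shaderCfg },
        { D3D12_STATE_SUBOBJECT_TYPE_RAYTRACING_PIPELINE_CONFIG, &pipeCfg   },
    };

    // As the quote notes, there is no dedicated CreateRaytracingPipelineState():
    // raytracing is just one Type accepted by the generic state-object factory.
    D3D12_STATE_OBJECT_DESC desc = {};
    desc.Type          = D3D12_STATE_OBJECT_TYPE_RAYTRACING_PIPELINE;
    desc.NumSubobjects = _countof(subobjects);
    desc.pSubobjects   = subobjects;

    ComPtr<ID3D12StateObject> pso;
    device->CreateStateObject(&desc, IID_PPV_ARGS(&pso));
    return pso;
}
```

The subobject array is what makes the method "adaptable": new pipeline kinds can be expressed as new subobject types without adding new Create* entry points.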
  • evilpaul666 - Tuesday, March 20, 2018 - link

    So is the software mode going to be like turning on software DX8 shaders on a DX7 card back in the day?
  • stephenbrooks - Tuesday, March 20, 2018 - link

    I think they're saying the "fallback" is doing raytracing with traditional compute shaders (still on the GPU), rather than special hardware.
  • wr3zzz - Tuesday, March 20, 2018 - link

    I am not getting my hopes up. Just look at the pathetic advancements in gaming physics and AI by developers over the last 15 years. Things will only get worse now that the market has validated that investing in addiction and extraction is far, far more profitable than putting capital behind innovation and creativity.
  • Yojimbo - Tuesday, March 20, 2018 - link

    I think advancement in physics has been slow because the increased computation ability achieved generation over generation was instead applied to improved graphics. The advancement in AI has been slow because AI is a very hard thing, and hand-coded AI especially has severe limitations. Machine learning will trickle into games and improve the AI. Physics might not become a priority to spend compute on unless VR starts to take off.
  • Yojimbo - Tuesday, March 20, 2018 - link

    Oh, one more point about physics... Unlike superior visual features, physics fundamentally changes a game, so it's not possible to develop a game for console hardware and then easily add in improved physics to be run on more powerful PC hardware.
  • Yojimbo - Tuesday, March 20, 2018 - link

    Edit: When I say that I guess I am making assumptions about the sort of advancements in gaming physics you are looking for.
  • stephenbrooks - Tuesday, March 20, 2018 - link

    The big thing here is that the 3D scene itself is being loaded onto the GPU in some form, rather than a sequence of triangles one at a time. Once the GPU has a scene representation, a lot of things are possible.
  • WatcherCK - Wednesday, March 21, 2018 - link

    Hmmmmm, get a Qualcomm 999 or two into a smart(er) TV where they can render at raytraced 4K-plus resolution, and then your latest Netflix series is just a script running in a datacenter being streamed to your portal :)

    Also, a question about raytracing and HDR: are the two complementary?
