Nvidia Announces Real Time Ray Tracing

outrider42 Posts: 3,679
edited March 2018 in The Commons

"On Monday NVIDIA will announce RTX technology. This is a real-time cinematic rendering for game developers. Together with RTX, NVIDIA is also announcing Ray Tracing for Gameworks, which enables real-time Area Shadows, Glossy Reflections and Ambient Occlusion in game development. This technology is already available in early access.

NVIDIA has partnered up with Microsoft to create a new API for RTX, which will be supported by major game engines, such as Unreal Engine, Unity, and Frostbite.

Three developers are already participating in the development of RTX: EA Games, Remedy, and 4A Games."

https://videocardz.com/newz/nvidia-to-announce-rtx-technology

So we'll see what happens Monday. The bad news: it may be exclusive to Volta, or at the very least far better optimized there than on older generations.

Real-time rendering is coming, people. Slightly related: you may have missed that Unreal was used for some special-effects shots in Rogue One and Finding Dory. That big robot in Rogue One? That's real time, and Unreal. Unreal is working with Pixar.

https://www.cinemablend.com/games/1631230/why-two-disney-films-rendered-scenes-in-unreal-engine-4

It's only a matter of time before we start seeing humans rendered this way in big movies. Or even small ones. A real-time renderer would greatly level the playing field, IMO.

And it raises some big questions. If any of this tech is not made part of Iray, what happens to Daz and Iray? 

UPDATE

True to their word, Nvidia revealed RTX, showed a demo reel, and announced several partners, along with GameWorks SDK and DirectX 12 integration.

https://wccftech.com/nvidia-rtx-technology-gameworks-sdk/

Video reel.


Comments

  • agent unawares Posts: 3,513
    That big robot in Rogue One? That's real time, and Unreal.

    Holy crap. I wonder how much of that was out-of-the-box and how much was post-processing.

  • outrider42 Posts: 3,679
    That big robot in Rogue One? That's real time, and Unreal.

    Holy crap. I wonder how much of that was out-of-the-box and how much was post-processing.

    I would think that would defeat the purpose if it required much post work. More articles below; one has a clip of the conference. They gave a GDC talk to teach people some tips.

    https://www.polygon.com/2017/3/1/14777806/gdc-epic-rogue-one-star-wars-k2so

    http://www.gdconf.com/news/get-real-time-rendering-tips-rogue-ones-vfx-team-gdc-2017/

    They modified Unreal to help achieve this. But the companies are working together and sharing tech. They now have a plugin so Unreal can natively import Pixar's file formats. So this sounds like a pretty big partnership. I bet Disney has plans to use this more and more. And I can see why. Time is money, and if they render in real time, BOOM. It also gives them a way to have real actors interact with what is on screen with them, which could help performances a little.

  • Padone Posts: 3,707

    For the time being there's already iClone 7 with real-time PBR, and Blender's EEVEE is expected this year, both with importers for DAZ assets. I guess DAZ Studio will go with real-time Iray at some point, or will embrace whatever Nvidia brings out.

  • outrider42 Posts: 3,679
    edited March 2018

    The current iClone uses Nvidia VXGI, which is not the same process. It is similar to what modern video games use, significantly cutting down the number of ray-tracing paths compared to Iray; that is why VXGI is able to render in real time. RTX is a new form of real-time PBR, and it promises to be far more accurate for lighting.

    https://wccftech.com/nvidia-rtx-technology-gameworks-sdk/

    "Epic, 4A Games, Remedy Entertainment and Unity will demonstrate it.

    NVIDIA also announced their plan to add a “ray-tracing denoiser module” to the existing GameWorks SDK so that game developers may properly take advantage of NVIDIA RTX. Furthermore, as per the previous report, NVIDIA “partnered closely” with Microsoft to enable full RTX support in Microsoft’s new DirectX Raytracing (DXR) API."

    Now why can't they fix the denoise feature in Iray? This kind of irritates me a bit, as it shows where Nvidia's focus is (not Iray). But I digress. Here's another quote; pay attention to the wording.

    Kim Libreri, chief technology officer at Epic Games, said:

    The availability of NVIDIA RTX opens the door to make real-time ray tracing a reality. By making such powerful technology available to the game development community with the support of the new DirectX Raytracing API, NVIDIA is the driving force behind the next generation of game and movie graphics.

    So Nvidia is anticipating this being used by movies for special effects, not just as a video game tech. That could trickle down to Iray. Maybe.

     

    Remedy Entertainment’s technology team manager Mikko Orrenmaa stated:

    Integrating NVIDIA RTX into our Northlight engine was a relatively straightforward exercise. We were surprised just how quickly we were able to prototype new lighting, reflection and ambient occlusion techniques, with significantly better visual fidelity than traditional rasterization techniques. We’re really excited about what we can achieve in the future with the NVIDIA RTX technology; gamers are in for a something special.

    Demo reel:

    If this lives up to the hype, it could be big.

  • agent unawares Posts: 3,513
    Now why can't they fix the denoise feature in Iray? This kind of irritates me a bit, as it shows where Nvidia's focus is (not Iray).

    Prooobably not the same development team.

  • outrider42 Posts: 3,679
    Prooobably not the same development team.

    I'm sure that's the case, but it doesn't matter; the end result does. Iray does not have this feature working as intended, and it has been in that state for a very long time.

  • agent unawares Posts: 3,513
    I'm sure that's the case, but it doesn't matter; the end result does. Iray does not have this feature working as intended, and it has been in that state for a very long time.

    A development team that has nothing to do with Iray working on something that has nothing to do with Iray doesn't show anything at all about NVIDIA's focus.

  • Ghosty12 Posts: 2,060

    I'll say this much: the hardware for doing that sort of thing will not be cheap, that is for sure. When you look at the video info, it talks about Volta architecture based cards.

  • outrider42 Posts: 3,679
    edited March 2018
    A development team that has nothing to do with Iray working on something that has nothing to do with Iray doesn't show anything at all about NVIDIA's focus.

    They could have the Iray development team actually fix the problem, now couldn't they? It's been out for how many years now? RTX is brand new and has never been released to the public, but it already has a properly working denoiser. It does not matter how unrelated the teams are. And besides, Nvidia has very openly stated they are dedicated to gamers...it is not some big secret that they would focus more on gaming technology than on Iray. In the time that Iray has been out, Nvidia has debuted numerous brand-new technologies for gaming, such as new physics, Ansel, dedicated VR software, and more, while offering modest updates to Iray.

    I didn't think this was even up for dispute. Let us not forget that it took several MONTHS for Pascal cards to even get Iray support. I rest my case.

  • agent unawares Posts: 3,513
    They could have the Iray development team actually fix the problem, now couldn't they? It's been out for how many years now? RTX is brand new and has never been released to the public, but it already has a properly working denoiser.

    Was Iray not the first NVIDIA product to get their AI denoiser?

  • outrider42 Posts: 3,679
    ghosty12 said:

    I'll say this much: the hardware for doing that sort of thing will not be cheap, that is for sure. When you look at the video info, it talks about Volta architecture based cards.

    Yes, it seems to target Volta, and there is no mention of whether other generations will get it. But if it required a very expensive GPU, it would be illogical for game studios to include a feature that very few people could actually use, because that would severely limit the sales potential of such games while increasing development time and cost...a lose-lose situation. Of course, Nvidia also created Hairworks, which is extremely taxing on GPUs and is usually disabled by gamers who don't have good hardware. But even while it is taxing, it can run on older GPUs. In fact, Hairworks can run on ANY DirectX 11 GPU, even AMD ones. So given that this announcement speaks of working with DirectX 12, I have hope that this tech will be available on GPUs that can run DirectX 12, or at least most of them. As with DirectX 12, some features may only work on newer GPUs. That much I do expect.

    Secondly, even if RTX is hard on hardware, Iray itself is very taxing. My expectation is that this will still render much faster than Iray ever will, as it is designed from the ground up for gaming, and gamers demand 60+ frames every single second. So no matter how taxing it is, it will render faster than Iray does, LOL. Most of us are only looking to get a single still image, which would make it highly desirable to export scenes to a program that can use RTX in order to get that near-instant snapshot render. It also has major implications for animators.

  • bluejaunte Posts: 1,902

    This is cutting-edge stuff. It'll be a while before the average gamer can run this on their game rig, I think. There's nothing wrong with that; you have to start somewhere. Real-time raytracing is insane. This is the ultimate dream of any game dev and gamer, and ultimately of anyone into rendering, and it's starting to come true in 2018. Hardware is starting to become powerful enough for this to be a thing. Imagine rendering at full-blown Iray quality in real time sometime in the future. It's just the logical next step; in 20-30 years or so we will look back and laugh that we had to wait minutes or even hours for each frame to render.

    Here's some more info: https://www.forbes.com/sites/davealtavilla/2018/03/19/nvidia-and-microsoft-lay-foundation-for-photorealistic-gaming-with-real-time-ray-tracing/#26a19a346e31

  • outrider42 Posts: 3,679
    Was Iray not the first NVIDIA product to get their AI denoiser?

    The new SDK was released in December of 2017. Iray has been around since at least 2014, and since 2015 in Daz. That's 4 years to fix a completely borked feature. They couldn't even get the denoiser to work at the correct time: it kicks in at the start of the render rather than at the end. It makes no sense at all how a feature could stay this broken for this long.

    If this had been an Nvidia gaming technology with this kind of problem, it would have been patched within the first year. Gamers would not put up with that and would quickly revolt if their games had noisy pictures from Nvidia GPUs.
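
    Just to spell out why the timing matters: in a progressive renderer the denoise pass is supposed to run on the accumulated image, after (or while) samples pile up, not once on the first noisy pass. Here is a toy sketch of that ordering, with a simple box filter standing in for a real denoiser; this is only an illustration of the concept, not how Iray works internally.

    // Toy illustration of where a denoise pass belongs in a progressive renderer:
    // accumulate noisy samples first, then filter the accumulated buffer at the
    // end (or periodically on the current accumulation), never just once at the
    // start. The "denoiser" here is only a 3x3 box filter standing in for a real one.
    #include <cstdio>
    #include <random>
    #include <vector>

    int main()
    {
        const int w = 64, h = 64, samplesPerPixel = 64;
        std::vector<float> accum(w * h, 0.0f);
        std::mt19937 rng(42);
        std::normal_distribution<float> noise(0.0f, 0.25f);

        // 1) Progressive accumulation: each pass adds one noisy sample per pixel.
        for (int s = 0; s < samplesPerPixel; ++s)
            for (int i = 0; i < w * h; ++i)
                accum[i] += 0.5f + noise(rng);   // 0.5 is the true value being estimated

        for (float& v : accum) v /= samplesPerPixel;

        // 2) Denoise pass on the *accumulated* image (simple 3x3 box filter).
        std::vector<float> denoised(accum);
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                float sum = 0.0f;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        sum += accum[(y + dy) * w + (x + dx)];
                denoised[y * w + x] = sum / 9.0f;
            }

        std::printf("center pixel: accumulated %.3f, denoised %.3f (true 0.5)\n",
                    accum[(h / 2) * w + w / 2], denoised[(h / 2) * w + w / 2]);
        return 0;
    }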

  • outrider42 Posts: 3,679

    Microsoft pretty much did theirs with Nvidia. It seems there was a bit of confusion as to how it all works out. DXR (DirectX Raytracing) is what MS is calling theirs, and it will be a part of DirectX 12, hence the moniker. That means everybody with DirectX 12 can use DXR. Windows treats DXR like any other workload, and it can be combined with any graphics task.

    Then comes Nvidia RTX. If you run MS DXR on a Volta GPU, then it seems like RTX will kick in, adding whatever features Nvidia adds on top of DXR.
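
    To make that concrete: from an application's point of view, DXR is just another D3D12 capability you query, and the driver (RTX on Volta, a fallback elsewhere) decides how it gets serviced. Here is a rough sketch of what that check looks like, based on how the DXR additions ended up shipping in the Windows SDK headers; treat the specifics as illustrative rather than anything taken from Monday's announcement.

    // Minimal sketch: ask the D3D12 driver whether it exposes raytracing (DXR).
    // Assumes a Windows SDK new enough to carry the DXR additions to d3d12.h.
    // Adapter selection and error handling are omitted for brevity.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<ID3D12Device> device;
        // Plain D3D12 device on the default adapter.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12 device available.");
            return 1;
        }

        // OPTIONS5 carries the RaytracingTier capability.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts, sizeof(opts))) &&
            opts.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
            std::puts("DXR is exposed here (hardware path or driver fallback).");
        } else {
            std::puts("DXR is not supported on this device/driver.");
        }
        return 0;
    }

    (Link against d3d12.lib. On hardware without native support, the tier simply reports as not supported, or the work goes through Microsoft's compute-based fallback layer.)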

    Not to be left in the ray-traced dark, AMD joins the party. The article makes it seem like their tech is geared towards developers more than gamers. But it also uses DirectX 12, and it kicks in for DXR on AMD hardware. So it looks like MS wins big time here, playing both fields. AMD also has Vulkan support.

    AnandTech has a somewhat deeper article on them.

     

    https://www.anandtech.com/show/12546/nvidia-unveils-rtx-technology-real-time-ray-tracing-acceleration-for-volta-gpus-and-later
  • xyer0 Posts: 5,952

    Watch the presentation for yourself, and see what you think. Either they're all lying/overselling or the breakthrough is almost here.

  • mikek Posts: 195
    edited March 2018

    Yes, it seems to target Volta, and there is no mention of whether other generations will get it. But if it required a very expensive GPU, it would be illogical for game studios to include a feature that very few people could actually use, because that would severely limit the sales potential of such games while increasing development time and cost...a lose-lose situation. Of course, Nvidia also created Hairworks, which is extremely taxing on GPUs and is usually disabled by gamers who don't have good hardware. But even while it is taxing, it can run on older GPUs. In fact, Hairworks can run on ANY DirectX 11 GPU, even AMD ones. So given that this announcement speaks of working with DirectX 12, I have hope that this tech will be available on GPUs that can run DirectX 12, or at least most of them. As with DirectX 12, some features may only work on newer GPUs. That much I do expect.

    Their AI de-noiser uses tensor cores. Possibly their current Volta-only support is because those are the only chips with tensor cores. It wouldn't surprise me if it runs on GPUs without tensor cores, only with a big performance/quality loss.
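
    For what it's worth, tensor cores first show up with compute capability 7.0 (Volta), so checking whether a machine has a tensor-core-capable GPU is just a device-property query against the CUDA runtime. A rough sketch, assuming the CUDA toolkit is installed; nothing Nvidia-official, just an illustration.

    // Rough sketch: report whether any installed GPU is Volta-class, i.e.
    // compute capability >= 7.0, the first generation carrying tensor cores.
    // Build with nvcc, or any C++ compiler linked against the CUDA runtime.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::puts("No CUDA-capable GPU found.");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop{};
            cudaGetDeviceProperties(&prop, i);
            const bool voltaOrLater = prop.major >= 7;   // sm_70 and up
            std::printf("GPU %d: %s (sm_%d%d) -> tensor cores %s\n",
                        i, prop.name, prop.major, prop.minor,
                        voltaOrLater ? "present" : "absent");
        }
        return 0;
    }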

  • Padone Posts: 3,707
    They couldn't even get the denoiser to work at the correct time: it kicks in at the start of the render rather than at the end. It makes no sense at all how a feature could stay this broken for this long.

    That's the main reason why I don't use Iray for production. A PBR engine without denoising is simply unusable in my opinion, while Cycles has a wonderful and fast integrated denoiser, plus many more options to tune the integrator to the scene's needs.

  • Why do I feel like I'm in an alternate universe... this was posted in the new Octane thread about the Iray denoiser coming in 4.11 ... http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log#4_11_0_65

  • Ghosty12 Posts: 2,060
    edited March 2018
    mikek said:

    Their AI de-noiser uses tensor cores. Possibly their current Volta-only support is because those are the only chips with tensor cores. It wouldn't surprise me if it runs on GPUs without tensor cores, only with a big performance/quality loss.

    According to the Otoy forum thread announcing Octane 4, there is a post where someone enabled Octane 4's denoiser and saw a massive drop in performance. Though bear in mind that Octane 4 is, I think, still in beta at the moment.

    Well, I ran into this interesting video. It seems that video cards will get more expensive due to RTX and DXR, since the way for RTX/DXR to work properly will apparently require a video card with Tensor Cores on it. So it looks like Tensor Cores (the Volta Titan V, and whatever Nvidia releases next) will be usable for more than just deep learning/AI. And it seems the major game developers are hopping on the RTX/DXR bandwagon as well.

  • outrider42 Posts: 3,679
    GPU prices have been expected to rise, but not because of RTX. Remember, RTX is exclusive to Nvidia. So if RTX really pushed the price up that much, AMD would be able to massively undercut them. In the super-competitive world of GPUs, that could literally swing the market back around in AMD's favor, especially if their version of this is in the ballpark quality-wise. And again, DirectX 12 is getting its own version.

    The Tensor core thing is, IMO, largely marketing hype. The Titan has been around $3,000 before.

    The video is just an opinion. The real reasons for the price increase are the shortage of VRAM supplies, the demand created by crypto mining, and good old greed. I talked about AMD Freesync and Nvidia Gsync in another thread, which could be a similar situation.

    Consoles use AMD chips, and as such they present a powerful equalizer to Nvidia. While Nvidia enjoys a large near 80% stake in the GPU market, only AMD powers the PS4 and Xbox families. Microsoft just enabled AMD Freesync in its consoles. If Sony joins them that will put a big hurting on demand for monitors with Gsync. Gsync monitors already cost far more than Freesync counterparts. TV manufacturers can easily add Freesync for little cost, which would give both console and PC gamers a new choice.

    With Xbox being a DirectX 12 box (that's where the name Xbox comes from to begin with; it was a DirectX box), it is possible MS could add DXR to Xbox as well. They did not announce this, but I bet they are trying to make this work. The Xbox One X in particular has a dedicated DirectX chip built into it at the hardware level. (That's a first.)

    Back to Nvidia: they have unveiled and hyped new technology in their GPUs frequently, and it hasn't resulted in large markups. Pascal introduced numerous such technologies, including new physics, 3D physics-based sound, VR-optimized graphics, and Ansel, all optimized for Pascal. So real-time ray tracing is just the next big thing for them, just like VR was back in 2016. Nvidia hyped their VR tech hard on Pascal. VR has kind of fallen flat, so they have to hype their new GPUs in another way...ray-traced graphics fits that bill. People have been talking about real-time ray-traced graphics for a long time, and perhaps it is finally here. But it's not why the cost of GPUs will go up.
  • mikek Posts: 195
    edited March 2018
    ghosty12 said:

    Well, I ran into this interesting video. It seems that video cards will get more expensive due to RTX and DXR, since the way for RTX/DXR to work properly will apparently require a video card with Tensor Cores on it. So it looks like Tensor Cores (the Volta Titan V, and whatever Nvidia releases next) will be usable for more than just deep learning/AI. And it seems the major game developers are hopping on the RTX/DXR bandwagon as well.

    Thanks for the video. It will be interesting to see whether Nvidia really starts to add tensor cores to GPUs targeting the gamer market.

     

    The Tensor core thing is, IMO, largely marketing hype. The Titan has been around $3,000 before.

    For certain AI tasks the tensor cores double the performance (or more). Doubled performance is certainly useful for those who work in that area.

    With Xbox being a DirectX 12 box (that's where the name Xbox comes from to begin with; it was a DirectX box), it is possible MS could add DXR to Xbox as well. They did not announce this, but I bet they are trying to make this work. The Xbox One X in particular has a dedicated DirectX chip built into it at the hardware level. (That's a first.)

    Possibly they will implement it, but it doesn't look like they're trying to push it much with the current gen. Otherwise, wouldn't they have used it in the marketing of the Xbox One X already? It's only been 4 months since its release.

  • outrider42 Posts: 3,679

    The reason I think this will come to consoles is that they said it will cover all of DirectX 12. They could be saving this for a more gaming-oriented conference. E3 is the biggest conference in gaming, and it is coming up in June. MS has already claimed this will be their biggest E3 event ever, even going as far as renting out an entire venue. It sounds to me like the sort of thing they would save for an E3.

    MS has added features post launch before, such as the previously mentioned AMD Freesync.

  • mikek Posts: 195
    edited March 2018

    The basis for my argument was the assumption that such features don't get implemented in a couple of months. If Microsoft had it in the pipeline, why not use it for marketing when there was just an improved console release? But I agree it's also possible they are waiting for more games or an event.

    New video from Futuremark (DirectX version):


    They say it runs in real time on current GPU hardware. They also have some comparison GIFs here:
    https://www.futuremark.com/pressreleases/watch-our-new-directx-raytracing-tech-demo

  • Richard Haseltine Posts: 101,076

    A noticeable lack of skin, though - I wonder how well it would have done without the mirrored faceplate.

  • agent unawares Posts: 3,513

    Realtime raytraced reflections at 1080p!?! *drools*

  • outrider42 Posts: 3,679

    I was just about to post this. I was thinking of making a thread just for this, as it pertains to the future of rendering in general. Especially the other videos showing off the digital human, Siren. I have come across users who scoff at my posts about gaming engines being the future of rendering. It is my opinion that they are, and videos like this prove that they can make movie-quality graphics. The video you linked, as well as this one, were running on one machine. Not a farm, or any sort of linked system. I believe they used the crazy DGX-1, that expensive super box Nvidia makes, but still, this is far more reasonable than the old method of CPU farms.

    More https://www.polygon.com/2018/3/21/17147502/unreal-engine-graphics-future-gdc-2018

  • agent unawares Posts: 3,513

    Wow, that's not bad at all.

  • nonesuch00 Posts: 18,142

    I will like this when the cheap Android ARM GPUs (e.g. a typical current Android TV box has octa-core ARM GPUs) are beefed up enough to handle it.
