Nvidia Announces Real Time Ray Tracing
"On Monday NVIDIA will announce RTX technology. This is a real-time cinematic rendering for game developers. Together with RTX, NVIDIA is also announcing Ray Tracing for Gameworks, which enables real-time Area Shadows, Glossy Reflections and Ambient Occlusion in game development. This technology is already available in early access.
NVIDIA has partnered up with Microsoft to create a new API for RTX, which will be supported by major game engines, such as Unreal Engine, Unity, and Frostbite.
Three developers are already participating in the development of RTX: EA Games, Remedy, and 4A Games."
https://videocardz.com/newz/nvidia-to-announce-rtx-technology
So we'll see what happens Monday. The bad news: it may be exclusive to Volta, or at the very least far more optimized for it than for older generations.
Real-time rendering is coming, people. Slightly related: you may have missed that Unreal was used for some special effects shots in Rogue One and Finding Dory. That big robot in Rogue One? That's real time, and Unreal. Unreal is working with Pixar.
https://www.cinemablend.com/games/1631230/why-two-disney-films-rendered-scenes-in-unreal-engine-4
It's only a matter of time before we start seeing humans rendered this way in big movies. Or even small ones. A real-time renderer would greatly level the playing field, IMO.
And it raises some big questions. If any of this tech is not made part of Iray, what happens to Daz and Iray?
UPDATE
True to their word, Nvidia revealed RTX, showed a demo reel, and announced several partners along with GameWorks SDK and DirectX 12 integration.
https://wccftech.com/nvidia-rtx-technology-gameworks-sdk/
Video reel.
Comments
Holy crap. I wonder how much of that was out-of-the-box and how much was post-processing.
I would think that would defeat the purpose if it required much post. More articles below; one has a clip of the conference, where they held a session to teach people some of their tips.
https://www.polygon.com/2017/3/1/14777806/gdc-epic-rogue-one-star-wars-k2so
http://www.gdconf.com/news/get-real-time-rendering-tips-rogue-ones-vfx-team-gdc-2017/
They modified Unreal to help achieve this. But the companies are working together and sharing tech. They now have a plugin so Unreal can natively import Pixar's file formats. So this sounds like a pretty big partnership. I bet Disney has plans to use this more and more. And I can see why. Time is money, and if they render in real time, BOOM. It also gives them a way to have real actors interact with what is on screen with them, which could help performances a little.
For the time being there's already iClone 7 with real-time PBR, and Blender's EEVEE is expected this year, both with importers for DAZ assets. I guess DAZ Studio will go with real-time Iray at some point, or will embrace whatever Nvidia brings out.
The current iClone uses Nvidia VXGI, which is not the same process. It is similar to what modern video games use: it significantly cuts down the number of ray tracing paths compared to Iray, which is why VXGI is able to render in real time. RTX is a new form of real-time PBR, and it promises to be far more accurate for lighting.
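To make the "cut down the number of paths" point concrete, here's a back-of-the-envelope sketch. The sample counts are my own illustrative assumptions, not vendor numbers: an offline path tracer like Iray fires hundreds to thousands of samples per pixel to converge, while a real-time approach traces only a couple of rays per pixel and reconstructs the rest.

```python
# Rough ray-budget comparison at 1080p (illustrative numbers, not specs).
WIDTH, HEIGHT = 1920, 1080
pixels = WIDTH * HEIGHT

offline_spp = 1000   # assumed "converged" offline sample count per pixel
realtime_spp = 2     # assumed rays per pixel before denoising

offline_rays = pixels * offline_spp
realtime_rays = pixels * realtime_spp

print(f"offline rays/frame:   {offline_rays:,}")    # ~2 billion
print(f"real-time rays/frame: {realtime_rays:,}")   # ~4 million
print(f"reduction factor:     {offline_rays // realtime_rays}x")
```

Even with generous assumptions, the real-time approach is doing orders of magnitude less raw tracing work per frame, which is the whole trick.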
https://wccftech.com/nvidia-rtx-technology-gameworks-sdk/
"Epic, 4A Games, Remedy Entertainment and Unity will demonstrate it.
NVIDIA also announced their plan to add a “ray-tracing denoiser module” to the existing GameWorks SDK so that game developers may properly take advantage of NVIDIA RTX. Furthermore, as per the previous report, NVIDIA “partnered closely” with Microsoft to enable full RTX support in Microsoft’s new DirectX Raytracing (DXR) API."
Now why can't they fix the denoise feature in Iray? This irritates me a bit, as it shows where Nvidia's focus is (not Iray). But I digress. Here's another quote; pay attention to the wording.
Kim Libreri, chief technology officer at Epic Games, said:
So Nvidia is anticipating this being used by movies for special effects, not just as a video game tech. That could trickle down to Iray. Maybe.
Remedy Entertainment’s technology team manager Mikko Orrenmaa stated:
Demo reel:
If this lives up to the hype, it could be big.
Prooobably not the same development team.
I'm sure that's the case, but it doesn't matter; the end result does. Iray does not have this feature working as intended, and it has been in that state for a very long time.
A development team that has nothing to do with Iray working on something that has nothing to do with Iray doesn't show anything at all about NVIDIA's focus.
I'll say this much: the hardware for doing that sort of thing will not be cheap, that is for sure. When you look at the video info, it talks about Volta architecture-based cards.
They could have the Iray development team actually fix the problem, now couldn't they? It's been out for how many years now? RTX is brand new and has never been released to the public, but it already has a properly working denoiser. It does not matter how unrelated the teams are. Besides, Nvidia has very openly stated they are dedicated to gamers; it is not some big secret that they would focus more on gaming technology than on Iray. In the time that Iray has been out, Nvidia has debuted numerous brand new technologies for gaming, such as new physics, Ansel, dedicated VR software, and more, while offering only modest updates to Iray.
I didn't think this was even something up for dispute. Let us not forget that it took several MONTHS for Pascal cards to even get Iray support. I rest my case.
Was Iray not the first NVIDIA product to get their AI denoiser?
Yes, it seems to target Volta, and there is no mention of whether other generations will get it. But if it required a very expensive GPU, it would be illogical for game studios to include a feature that very few people could actually use, because that would severely limit the sales potential of such games while increasing development time and cost: a lose-lose situation. Of course, Nvidia also created Hairworks, which is extremely taxing on GPUs and is usually disabled by gamers who don't have good hardware. But even while it is taxing, it can run on older GPUs; in fact, Hairworks can run on ANY DirectX 11 GPU, even AMD ones. So considering that this announcement speaks of working with DirectX 12, that gives me hope that this tech will be available on GPUs that can run DirectX 12, or at least most of them. As with DirectX 12 itself, some features may only work on newer GPUs. That much I do expect.
Secondly, even if RTX is hard on hardware, Iray itself is very taxing on hardware. My expectation is that this will still render much faster than Iray ever will, as it is designed from the ground up for gaming, and gamers demand 60+ frames every single second. So no matter how taxing it is, it will render faster than Iray does, LOL. Most of us are only looking to get a single still image. This would make it highly desirable to export scenes to a program that can use RTX, in order to get that near-instant snapshot render. It also has major implications for animators.
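To put rough numbers on what a 60 fps budget means for an animator, here's a quick sketch. The 10-minute Iray still is my own assumed figure for a typical scene, purely for illustration:

```python
# Per-frame time budgets: 60 fps real time vs. an assumed 10-minute Iray still.
fps_target = 60
realtime_frame_ms = 1000 / fps_target        # ~16.7 ms per frame

iray_frame_minutes = 10                      # assumed typical still render time
animation_seconds = 30
frames = animation_seconds * fps_target      # 1800 frames for a 30 s clip

iray_total_hours = frames * iray_frame_minutes / 60
realtime_total_seconds = frames * realtime_frame_ms / 1000

print(f"real-time budget per frame: {realtime_frame_ms:.1f} ms")
print(f"1800-frame animation in Iray: {iray_total_hours:.0f} hours")
print(f"same animation in real time:  {realtime_total_seconds:.0f} seconds")
```

Under those assumptions, a 30-second clip drops from hundreds of hours of render time to about half a minute, which is why the gap compounds so brutally for animation even if a single still is tolerable.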
This is cutting-edge stuff. It'll be a while before the average gamer can run this on their game rig, I think. There's nothing wrong with that; you have to start somewhere. Real-time ray tracing is insane. This is the ultimate dream of any game dev and gamer, and ultimately of anyone into rendering, starting to come true in 2018. Hardware is starting to become powerful enough for this to be a thing. Imagine rendering at full-blown Iray quality in real time sometime in the future. It's just the logical next step; in 20-30 years or so we will look back and laugh that we had to wait minutes or even hours for each frame to render.
Here's some more info: https://www.forbes.com/sites/davealtavilla/2018/03/19/nvidia-and-microsoft-lay-foundation-for-photorealistic-gaming-with-real-time-ray-tracing/#26a19a346e31
The new SDK released in December of 2017. Iray has been around since at least 2014, and since 2015 in Daz. That's 4 years to fix a completely borked feature. They couldn't even get the denoiser to run at the correct time: it kicks in at the start of the render rather than the end. It makes no sense at all how a feature could stay this broken for this long.
If this had been a Nvidia gaming technology with this kind of problem, it would have been patched within the first year. Gamers would not put up with that and quickly revolt if their games had noisy pictures from Nvidia GPUs.
AMD and Microsoft also announced real time raytracing today...
https://www.anandtech.com/show/12552/amd-announces-real-time-ray-tracing-for-prorender-and-radeon-gpu-profiler-12
https://arstechnica.com/gadgets/2018/03/microsoft-announces-the-next-step-in-gaming-graphics-directx-raytracing/
Microsoft pretty much did theirs with Nvidia. It seems there was a bit of confusion as to how it all works out. DXR is what MS is calling theirs, and it will be part of DirectX 12, hence the DXR moniker. That means everybody with DirectX 12 can use DXR. Windows treats DXR like any other graphics workload, and it can be combined with any graphics task.
Then comes Nvidia RTX. If you run MS DXR on a Volta GPU, it seems RTX will kick in, adding whatever features Nvidia layers on top of DXR.
Not to be left in the ray-traced dark, AMD joins the party. The article makes it seem like their tech is geared more towards developers than gamers. But it also uses DirectX 12, and it kicks in for DXR on AMD hardware. So it looks like MS wins big time here, playing both fields. AMD also has Vulkan support.
AnandTech has a somewhat deeper article on them.
Watch the presentation for yourself and see what you think. Either they're all lying/overselling, or the breakthrough is almost here.
Their AI denoiser uses Tensor Cores. Possibly their current Volta-only support is because those are the only chips with Tensor Cores. It wouldn't surprise me if it runs on GPUs without Tensor Cores, only with a big performance/quality loss.
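For anyone curious what "denoising" buys you here: the actual GameWorks denoiser is a trained neural network running on Tensor Cores and is not public, but the general "render a few noisy samples, then reconstruct" idea can be sketched with a naive box filter standing in for the network. Everything below is illustrative, pure-Python toy code:

```python
import random

random.seed(0)
SIZE = 64

# "Ground truth" image: a smooth horizontal gradient.
truth = [[x / (SIZE - 1) for x in range(SIZE)] for _ in range(SIZE)]

# Monte Carlo estimate at 2 samples per pixel: truth plus heavy noise.
spp = 2
noisy = [
    [truth[y][x] + sum(random.gauss(0.0, 0.3) for _ in range(spp)) / spp
     for x in range(SIZE)]
    for y in range(SIZE)
]

def box_filter(img, radius=2):
    """Naive spatial smoothing: a crude stand-in for the learned denoiser."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - radius), min(h, y + radius + 1))
            xs = range(max(0, x - radius), min(w, x + radius + 1))
            vals = [img[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) / len(vals)
    return out

def mse(a, b):
    """Mean squared error between two images."""
    h, w = len(a), len(a[0])
    return sum((a[y][x] - b[y][x]) ** 2
               for y in range(h) for x in range(w)) / (h * w)

denoised = box_filter(noisy)
print(f"MSE before denoise: {mse(noisy, truth):.4f}")
print(f"MSE after denoise:  {mse(denoised, truth):.4f}")
```

Even this dumb filter slashes the error on a smooth region; the whole point of the AI version is to do the same reconstruction without smearing edges and detail, which is where the trained network (and the Tensor Core throughput to run it per frame) comes in.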
That's the main reason why I don't use Iray for production. A PBR engine without denoising is simply unusable, in my opinion, while Cycles has a wonderful and fast integrated denoiser, and also many more options to optimize the integrator to the scene's needs.
Why do I feel like I'm in an alternate universe... this was posted in the new Octane thread, about the Iray denoiser coming in 4.11 ... http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log#4_11_0_65
According to the Otoy forum thread announcing Octane 4, there is a post where someone enabled Octane 4's denoiser and saw a massive drop in performance. Though bear in mind that Octane 4, I think, is still in beta at the moment.
Well, I ran into this interesting video. It seems the pricing of video cards will get more expensive due to RTX and DXR, since it appears that for RTX/DXR to work properly, a video card with Tensor Cores will be required. So it looks like Tensor Cores (the Volta Titan V and whatever Nvidia releases next) will be usable for more than just deep learning/AI. And on the gaming side, it looks like the major game developers are hopping on the RTX/DXR bandwagon as well.
Thanks for the video. It will be interesting to see if Nvidia really starts adding Tensor Cores to GPUs targeting the gamer market.
For certain AI tasks, Tensor Cores double the performance (or more). Double the performance is certainly useful for those who work in that area.
Possibly they will implement it, but it doesn't look like they're pushing it much with the current gen. Otherwise, wouldn't they have used it in the marketing of the Xbox One X already? It's only been 4 months since its release.
The reason why I think this will come to consoles is that they said it will cover all of DirectX 12. They could be saving it for a more gaming-oriented conference. E3 is the biggest conference in gaming, and it is coming up in June. MS has already claimed this will be their biggest E3 event ever, even going as far as renting out an entire venue for E3. It sounds to me like the sort of thing they would announce at an E3.
MS has added features post launch before, such as the previously mentioned AMD Freesync.
The basis for my argument was the assumption that such features wouldn't be implemented within a couple of months. If Microsoft had it in the pipeline, why not use it for marketing when an improved console had just been released? But I agree it's also possible they are waiting for more games or an event.
New video from Futuremark (DirectX version):
They say it runs in real time on current GPU hardware. They also have some comparison GIFs here:
https://www.futuremark.com/pressreleases/watch-our-new-directx-raytracing-tech-demo
A noticeable lack of skin, though - I wonder how well it would have done without the mirrored faceplate.
Realtime raytraced reflections at 1080p!?! *drools*
https://arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/
I was just about to post this. I was thinking of making a thread just for this, as it pertains to the future of rendering in general, especially the other videos showing off the digital human, Siren. I have come across users who scoff at my posts about gaming engines being the future of rendering. It is my opinion that they are, and videos like this prove that they can produce movie-quality graphics. The video you linked, as well as this one, were running on one machine, not a farm or any sort of linked system. I believe they used the crazy DGX-1, that expensive super box Nvidia makes, but still, this is far more reasonable than the old method of CPU farms.
More https://www.polygon.com/2018/3/21/17147502/unreal-engine-graphics-future-gdc-2018
Wow, that's not bad at all.
I will like this when cheap Android ARM GPUs (e.g. a typical current Android TV box has an octa-core ARM GPU) are beefed up enough to handle it.