Nvidia Delivers Ray Tracing Driver for Previous-Gen Pascal Cards
Taoz
Posts: 10,262
in The Commons
"Nvidia Delivers Ray Tracing Driver for Previous-Gen Pascal Cards"
https://www.tomshardware.com/news/nvidia-ray_tracing-pascal-10_series-driver,39043.html
Does anyone know if this will have any effect on rendering?

Comments
Well... I suppose three more demos to accompany the three titles that support ray tracing is good; OK, better than nothing, but not by much.
I don't get it. AFAIK, Iray already does ray tracing. And the older-generation GPUs don't have the fancy new RTX hardware architecture that's supposed to blow away all the older-architecture GPUs at ray tracing. So how can a new "ray tracing" driver make any difference on GPUs that don't have dedicated ray tracing hardware? I don't get it.
And now the Iray benchmark enthusiasts will have to start all over again to allow for this new driver set along with 236 other parameters. I have a headache. Like I said, wait six months.
In this case they're talking about real-time ray tracing, useful for game engines and stuff like that. I don't know if it will have much effect on Iray rendering. Real-time ray tracing is a lot faster, but it takes a lot of shortcuts and sacrifices accuracy in the process. Plus, all the demos they show are just empty rooms. Once you start adding volumetric fog, transmapped hair, subsurface skin, refraction/caustics, and multiple clothed figures in a scene, I don't think it will be very fast.
It's a completely different type of rendering ... altogether.
How is it different? What do the algorithms do differently? Ray tracing, by definition, shoots zillions of rays from the camera to determine the color of each pixel in the image, and depending on the desired sampling it averages many rays to get an individual pixel color. That takes a lot of time. A 1920x1080 image has over 2 million pixels to calculate, each pixel needs many rays, and every one of those rays has to be tested against the scene to find what it hits, what color the hit point is, how the ray bounces off the object, what the next hit point's color is, and so on. RTX cards have dedicated hardware to do exactly those operations very fast. GTX cards don't.
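Roughly, the per-pixel loop looks like this. A bare-bones C# sketch of my understanding, where TraceRay() is just a flat stand-in for the real shading and bouncing:

```csharp
using System;
using System.Numerics;

static class PixelLoopSketch
{
    // Stand-in for real shading: a real tracer would intersect the scene,
    // shade the hit point, and recurse for bounces.
    static Vector3 TraceRay(Vector3 origin, Vector3 direction)
        => new Vector3(0.5f, 0.6f, 0.9f); // flat "sky" color for the sketch

    static void Main()
    {
        const int width = 1920, height = 1080, samplesPerPixel = 16;
        var image = new Vector3[width * height];
        var rng = new Random(1);
        long raysShot = 0;

        for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
        {
            var sum = Vector3.Zero;
            // Average several jittered rays per pixel to reduce noise.
            for (int s = 0; s < samplesPerPixel; s++)
            {
                float u = (x + (float)rng.NextDouble()) / width - 0.5f;
                float v = (y + (float)rng.NextDouble()) / height - 0.5f;
                sum += TraceRay(Vector3.Zero, Vector3.Normalize(new Vector3(u, v, 1f)));
                raysShot++;
            }
            image[y * width + x] = sum / samplesPerPixel;
        }

        // Over 33 million camera rays at just 16 samples, before counting
        // any bounce or shadow rays. That's where all the time goes.
        Console.WriteLine($"Primary rays shot: {raysShot:N0}");
    }
}
```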
Ray tracing has been around for decades. So why do they suddenly have a new driver now that does faster ray tracing on legacy hardware? Makes no sense. Although I haven't had my coffee yet...
BTW, here's a screenshot of a ray tracing application I wrote recently in C#. It loads a text file that describes what's in the scene (in this case just three spheres and a camera), prints the file contents in a text box on the right, does the ray tracing, and generates the image. Kind of like the very basics of what Iray/Studio does, but FAR simpler. So if anyone knows what's new in ray tracing that allows them to make a faster ray tracing driver, please let me know so I can modify my algorithm.
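The core of mine is just the classic ray/sphere quadratic. Here's a stripped-down sketch of that part, minus the file parsing and the UI, with a hard-coded stand-in for what the scene file would describe:

```csharp
using System;
using System.Numerics;

static class SphereTracerSketch
{
    record struct Sphere(Vector3 Center, float Radius, Vector3 Color);

    // Classic quadratic ray/sphere test (assumes dir is normalized, so a = 1).
    // Returns the distance along the ray to the nearest hit, or null on a miss.
    static float? Intersect(Vector3 origin, Vector3 dir, Sphere s)
    {
        Vector3 oc = origin - s.Center;
        float b = Vector3.Dot(oc, dir);
        float c = Vector3.Dot(oc, oc) - s.Radius * s.Radius;
        float disc = b * b - c;
        if (disc < 0) return null;         // ray misses the sphere entirely
        float t = -b - MathF.Sqrt(disc);   // nearest of the two roots
        return t > 0.001f ? t : (float?)null; // ignore hits behind the camera
    }

    static void Main()
    {
        // Stand-in for whatever the scene text file describes.
        var scene = new[]
        {
            new Sphere(new Vector3( 0, 0, 5), 1f, new Vector3(1, 0, 0)),
            new Sphere(new Vector3( 2, 0, 6), 1f, new Vector3(0, 1, 0)),
            new Sphere(new Vector3(-2, 0, 6), 1f, new Vector3(0, 0, 1)),
        };

        // Shoot one ray straight down the camera axis as a smoke test.
        var dir = new Vector3(0, 0, 1);
        foreach (var s in scene)
        {
            float? t = Intersect(Vector3.Zero, dir, s);
            Console.WriteLine(t is float hit
                ? $"Hit sphere {s.Color} at t = {hit:F3}"
                : $"Missed sphere {s.Color}");
        }
    }
}
```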
Okay, apparently it's version 425.31, which came out on Thursday (see attached). And the "game ready" driver version says it includes "support for DirectX raytracing" under "Gaming Technology", whatever the heck that means.
I dunno. I have a driver from October 2018, and I have a rule to never ever download the latest drivers. But I'm still thinking I might download this one just to see how it does with Iray. Maybe not. Not sure.
Cool, thanks. You prevented me from pressing the "Download" button. That was a close call.
BTW, I still don't understand what it does under the hood to make any difference to games, but anyway...
Okay, I found a brief NVIDIA overview video, and it looks like games didn't utilize any (or much) ray tracing in the past. And since I have no interest whatsoever in games, I'm pretty clueless about that stuff. So apparently this new "raytracing" introduces some (what I call) "shortcuts" to ray tracing so it can run in real time: ray-traced ambient occlusion, denoising, and some other stuff. And since the older generation doesn't have the architecture that RTX/Turing has, it will probably have limited effect. They also mention "one ray per pixel" global illumination, which sounds like a shorthand version of global illumination that skips averaging many ray samples per pixel in order to save time.
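To show what I mean by "shortcut", here's a toy C# illustration of the one-ray-per-pixel-plus-denoise idea. NVIDIA's actual denoiser is AI-based and much smarter; the 3x3 box blur below is purely a stand-in to show the principle:

```csharp
using System;

static class OneSppDenoiseSketch
{
    static void Main()
    {
        const int w = 8, h = 8;
        var rng = new Random(1);
        var noisy = new float[w, h];

        // "1 spp render": one random sample per pixel, so lots of variance.
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                noisy[x, y] = 0.5f + 0.5f * (float)rng.NextDouble();

        // "Denoise": average each pixel with its neighbors, trading a bit
        // of blur for a lot less noise. That is the basic bargain that
        // real-time ray tracing makes instead of shooting more rays.
        var denoised = new float[w, h];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                float sum = 0; int n = 0;
                for (int dy = -1; dy <= 1; dy++)
                    for (int dx = -1; dx <= 1; dx++)
                    {
                        int nx = x + dx, ny = y + dy;
                        if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                        sum += noisy[nx, ny]; n++;
                    }
                denoised[x, y] = sum / n;
            }

        Console.WriteLine($"center pixel before: {noisy[4, 4]:F3}, after: {denoised[4, 4]:F3}");
    }
}
```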
But it sounds like this driver is one more step forward in introducing the RTX-related software for the RTX hardware. Although, as with anything, it probably depends mostly on how the individual software/games implement it all.
It's a completely different type of rendering.
(couldn't resist participating in an Airplane reference :))
OK, no effect on DS rendering then. Thanks!
Yeah, I downloaded the new 425.31 version to one of my backup computers and ran the SickleYield benchmark scene with my 1080 Ti, and it makes zero difference. Both runs were 2 minutes 15 seconds.
However, it did help the ray tracer I wrote in C# (using a Direct3D 11 backend): it cranked the gigarays/sec up to just under 2.8 with my 1080 Ti, which is supposed to be around 1.2. And for reference, the RTX 2070 is supposed to be around 6 gigarays/sec.
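For anyone wondering where a number like that comes from: it's just a ray counter divided by wall-clock time. Something like this, with made-up numbers to land in the ballpark above rather than real measurements:

```csharp
using System;
using System.Diagnostics;

static class GigaraysSketch
{
    static void Main()
    {
        // In a real tracer you'd wrap the render loop in a Stopwatch and
        // increment raysShot for every primary, bounce, and shadow ray.
        var sw = Stopwatch.StartNew();
        long raysShot = 14_000_000_000;   // hypothetical total ray count
        sw.Stop();
        double seconds = 5.0;             // hypothetical; really sw.Elapsed.TotalSeconds

        double gigaraysPerSec = raysShot / seconds / 1e9;
        Console.WriteLine($"{gigaraysPerSec:F1} gigarays/sec"); // prints 2.8
    }
}
```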
Yeah, games have not traditionally used this form of ray tracing before because of the massive cost to performance. That is why real-time ray tracing has been called the Holy Grail of the industry. Up until now, video games have cheated with every trick in the book. Shadows may be baked in, for example, and engines use screen space reflections instead of real reflections. This technique has progressed to the point where it works reasonably well. A great example is the PS4 game "Spiderman". As you swing through the city, you will see screen space reflections everywhere: on shiny buildings, on cars. If you look closely you can see these reflections are faked, but they are not so noticeable during frantic gameplay.
This video does a great job of breaking down how Spiderman's game engine works. It is extremely in-depth, but a great watch. The game is truly beautiful in motion.
Some talk about reflections begins at around the 11 minute mark with water. At about 13 minutes they start showing off the reflections on the skyscrapers and describe how they work. There is no real-time ray tracing in Spiderman; this is how video games have traditionally handled reflections, and Spiderman does it better than just about any other game.
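If it helps to see the screen-space trick in code: instead of tracing into the 3D scene, you march the reflected ray across the frame you already rendered and reuse whatever color you land on. This is a toy C# version with made-up buffers and step sizes, not how any particular engine actually implements it:

```csharp
using System;
using System.Numerics;

static class SsrSketch
{
    // March a reflected ray across the screen, sampling the depth buffer.
    // When the ray dips behind stored geometry, reuse that pixel's color.
    static Vector3? ScreenSpaceReflect(
        float[,] depth, Vector3[,] color,
        Vector2 pixel, Vector2 reflectDirScreen)
    {
        int w = depth.GetLength(0), h = depth.GetLength(1);
        Vector2 p = pixel;
        float rayDepth = depth[(int)pixel.X, (int)pixel.Y];

        for (int step = 0; step < 32; step++)
        {
            p += reflectDirScreen;
            rayDepth += 0.05f; // hypothetical depth gained per step
            if (p.X < 0 || p.Y < 0 || p.X >= w || p.Y >= h)
                return null; // ray left the screen: SSR has no answer here,
                             // which is why off-screen objects never reflect
            if (depth[(int)p.X, (int)p.Y] < rayDepth)
                return color[(int)p.X, (int)p.Y]; // "hit": reuse that pixel
        }
        return null;
    }

    static void Main()
    {
        var depth = new float[16, 16];
        var color = new Vector3[16, 16];
        for (int x = 0; x < 16; x++)
            for (int y = 0; y < 16; y++)
            { depth[x, y] = 5.0f; color[x, y] = new Vector3(x / 16f, y / 16f, 0); }
        depth[4, 8] = 1.0f;  // the reflective surface we start from
        depth[10, 8] = 1.2f; // a nearby object the ray can "hit"

        var hit = ScreenSpaceReflect(depth, color, new Vector2(4, 8), new Vector2(1, 0));
        Console.WriteLine(hit is Vector3 c ? $"Reflected color: {c}" : "No SSR hit");
    }
}
```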
The next video shows a real time ray tracing demo vs "faking" it. It is highly educational.
When you watch it, you can see how the real-time mode can get noise in it... just like Iray does! You will also note how the RTX cards shoot about 4 rays per pixel, which is still a far cry from what would be needed. Video game engines still use some tricks to get their final result, because no card is quite powerful enough to run everything ray traced at the gold standard of 60 frames per second at 4K resolution. (That is why some gamers are not so keen on RTX cards.)
However, we are not gaming! We don't need the image to draw that fast, just fast enough. Going back to the video with RTX on, if he stands still the image resolves in just a few seconds. This obviously does not work for video games, but for us, getting a render in a few seconds like that would be a slice of heaven.
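That "resolving" is just frame accumulation: when the camera holds still, every new frame contributes one more noisy sample per pixel and the running average converges. A toy C# illustration with synthetic numbers:

```csharp
using System;

static class AccumulationSketch
{
    static void Main()
    {
        var rng = new Random(1);
        const float trueValue = 0.5f; // the "converged" pixel color
        float accum = 0f;

        for (int frame = 1; frame <= 64; frame++)
        {
            // One noisy 1-spp sample of this pixel per frame.
            float sample = trueValue + ((float)rng.NextDouble() - 0.5f);
            accum += sample;
            float average = accum / frame;

            if ((frame & (frame - 1)) == 0) // print at powers of two
                Console.WriteLine(
                    $"frame {frame,2}: avg {average:F3} (error {Math.Abs(average - trueValue):F3})");
        }
        // Any camera move invalidates the accumulated history and the
        // noise comes back, which is exactly what the video shows.
    }
}
```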
Thus I would urge you to try exporting to Unreal or Unity with real-time ray tracing and see how it goes. I plan on doing that myself now that Unreal can use RTX features on Pascal. Exciting times are upon us. Both engines are free to download and play with.
Yeah, that was interesting. Especially the noise part, which really did look basically like the Iray viewport. A bit faster of course, but also with far fewer bounces and without fancy stuff like ray traced refraction and SSS and all that jazz. I really think in the future there will not be a notable difference between offline and real-time renderers; both will just be called "renderers" and be pretty much real time. They will be doing the exact same thing, if you think about it.
This is really cool. I downloaded the Reflections demo made in Unreal Engine 4, the Star Wars one. I ran it on my GTX 1060 6GB. It only ran at 5 fps at 1080p, which is not great, yet it still ran. I look forward to testing this with scenes in UE4.