Am I correct in understanding that OptiX cannot be turned off in 4.12?
areg5
Posts: 617
I have multiple installations of Daz that I purpose differently: 3.0.1.120, 4.5.0.114, 4.10.1.123, and Beta 4.12.1.16. I won't go into all of the reasons that I have each one, but each serves its purpose. Anyway, I tend to make scenes with many characters and complex environments that can overwhelm my VRAM. OptiX can use just enough VRAM to crash. Happened to me today. I was rendering an outdoor scene, which seems to render faster with 4.12. Indoor scenes seem to render faster with 4.10. Anyway, after a couple of attempts it was apparent that the scene was too large for 4.12. Since this was the first of similar scenes in a 12-scene batch, I didn't want to rework each scene to make it "fit." So I ran it through 4.10 and the whole batch ran.
For those of us who tend to use VRAM to its limits, it would be very helpful if OptiX could be toggled off in 4.12. My opinion, for what it is worth.

Comments
This is down to the way NVIDIA is handling the latest cards. In order to have feature parity between cards with RTX and the GTX cards, as I understand it, they use OptiX Prime for the latter to implement the features that RTX offers, but it does, as you've found, have a memory overhead.
That's for sure. I thought that might be the case. So when I upgrade my cards, will I then be obligated to use 4.12 or higher, and just make smaller scenes?
If you get a 20x0 or 16x0 card you will need to use DS 4.11 or later, since older versions of Iray don't support the Turing architecture. If it's an RTX card then OptiX Prime will not be used, and memory requirements should be similar to those with the older versions with OptiX Prime off.
Ok. Well, I guess that means I should keep the same setup I have now, and when I replace my cards I can go to the most recent version. Thanks!