Daz Studio 4.12 Pro, General Release! (*UPDATED*)


Comments

  • RayDAnt Posts: 956
    edited October 2019
    marble said:
    RayDAnt said:
    Padone said:
    marble said:

    Update on my GPU fallback to CPU problem. This is happening when I try to render an animation to an image series. The first render goes through fine on the GPU but the second drops to CPU. I have checked the log and it tells me about an OptiX Prime error, but I don't have OptiX Prime Acceleration enabled in the Render Settings (Advanced tab).

    You may try with a simple scene just to see if it's a driver issue or a VRAM issue. I assume you use Studio drivers and not game drivers. You may also try enabling OptiX, though I suspect that 4.12 always enables it. My guess is it's an out-of-VRAM issue.

    Yeah, as of 4.11 the OptiX Prime toggle in Daz Studio has been deprecated by Iray in favor of ALWAYS using OptiX Prime acceleration for non-RTX GPUs. I actually already reported this to Daz, and I'm kind of surprised the toggle is still there in the Render Settings tab since it no longer does anything.

    Looking at my logs, it seems to me that OptiX Prime is, as you say, always used in 4.12, but going back to 4.11 I can see when OptiX Prime was used (checkbox on) and not used. When not used, the log reports "Using built-in ray tracing (1.0.0)", but when OptiX Prime is enabled the log reports "Using OptiX Prime ray tracing (5.0.1)".

    My fall-back problem definitely seems to be an Optix Prime issue.

    I concur with that analysis. The OptiX Prime library is a platform-independent, software-side implementation of raytracing acceleration. That means it requires additional memory and processing power from a render device's general pool of resources (e.g. a GPU's VRAM and CUDA cores) in order to function. Previous versions of Iray (2018.1.3 and earlier) were cognizant of this and actively gave the GUI developer (i.e. Daz 3D) the ability to let the end user selectively turn it on or off, and even acknowledged the pros and cons of its use in the official Iray documentation available at the time (p. 53):

    bool iray_optix_prime = true
    Switches from the classical built-in Ray Tracing traversal and hierarchy construction code
    to the external NVIDIA OptiX Prime API. Setting this option to false will lead to slower
    scene-geometry updates when running on the GPU and (usually) increased overall
    rendering time, but features less temporary memory overhead during scene-geometry
    updates and also decreased overall memory usage during rendering

     As part of Iray's re-development for RTX compatibility, this memory usage concern with OptiX Prime seems to have disappeared, for whatever reason(s), from the Iray development team's list of priorities: they have both adopted OptiX Prime usage in all cases (except in the presence of RTX hardware) and removed any mention of OptiX Prime or its downsides from the current version of the official Iray documentation. If OptiX Prime being always on is enough to break your ability to render a significant number of scenes, I'm sorry to say that, at this point, your only real recourse is to either upgrade to a higher-VRAM GPU or stick with an outdated version of Daz Studio. Both of which are very sucky solutions imo, but that is how it currently stands.

    And to be clear, this has nothing to do with updates to Daz Studio or Daz 3D in general. To this day, OptiX Prime control functionality remains in Daz Studio despite Iray no longer implementing it. This is ENTIRELY the doing of Nvidia's Iray development team. So if you want someone to blame...
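    To make the VRAM cost concrete, here is a toy sketch of how a software-built acceleration structure like OptiX Prime's adds overhead that grows with scene geometry. The bytes-per-triangle constant is purely an illustrative assumption, not a figure from Iray or NVIDIA:

```python
# Hypothetical ballpark constant (an assumption for illustration only;
# Iray does not publish its per-triangle acceleration-structure cost).
BVH_BYTES_PER_TRI = 48  # assumed cost of BVH nodes per triangle

def bvh_overhead_gb(triangle_count):
    """Extra VRAM the acceleration structure needs, beyond the geometry."""
    return triangle_count * BVH_BYTES_PER_TRI / 1024**3

# A heavy multi-figure scene might weigh in around 20 million triangles:
print(f"{bvh_overhead_gb(20_000_000):.2f} GB extra just for ray acceleration")
```

    Under those assumed numbers, the overhead lands in the same rough neighborhood (around 1 GB) as the measurements reported later in this thread.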

    Post edited by RayDAnt on
  • nonesuch00 Posts: 16,163

    I am getting faster render times, but then I CPU render. 4.12 does seem to increase the memory used (probably explaining how they were able to get the faster render times), which causes some scenes I have made to crash Daz Studio with an out-of-memory message. (Actually it's Windows 10 stopping Daz Studio, not Daz Studio crashing; that is a Windows 10 policy to stop the OS from being crashed, and also a security measure against trojans.)

  • marble Posts: 6,159

    Or I could figure out how to send my scenes to Blender and render with Cycles or Eevee. Or - expensive but perhaps cheaper than a new (high capacity VRAM) GPU - I could invest in Octane.

  • Anyone know,... if the scripts that worked up to 4.10, then were killed by 4.11, will now work in 4.12 ?

  • Anyone know,... if the scripts that worked up to 4.10, then were killed by 4.11, will now work in 4.12 ?

    Which scripts?

  • swordkensia Posts: 348
    edited October 2019

    This OptiX Prime being on by default issue is NOT GOOD.

    I just tested a scene of mine in Studio 4.12; total memory used (monitored via GPU-Z) was 10.5 GB.

    Uninstalled 4.12, installed 4.11, re-rendered the same scene: total memory used 9.4 GB.

    Therefore OptiX Prime is adding a 1.1 GB overhead. So I will be uninstalling 4.12 from all of my machines and re-installing 4.11.

    DAZ, this is NOT GOOD, and you should have made users aware of this potential memory overhead.

    I have read that there has been a new release of Iray which seems to have better texture and memory optimisation, so hopefully that will help to reduce the memory overhead of OptiX Prime in a future release.

    Thank god I found out about this before I bought that £1100.00 2080 Ti I had my eye on. I'm currently using 1080 Tis on Windows 7 Pro.

    I mean, what's the use of having a lot of GPU memory, particularly if you are using it to render big scenes, when you are essentially losing 10%+ of that memory to accelerated rendering? It's crazy.

    And I feel really sorry for you guys on Windows 10, 'cause now, with OptiX Prime, you are losing around 28% of your card's total memory to poor hardware/software implementation.

    S.K.
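    For what it's worth, the arithmetic in the post above checks out. A quick sketch, using the 11 GB capacity of a GTX 1080 Ti and the GPU-Z readings quoted above, confirms the roughly 10% figure:

```python
def overhead(used_with_prime_gb, used_without_prime_gb, card_vram_gb):
    """Return the absolute overhead in GB and its share of total VRAM."""
    delta = used_with_prime_gb - used_without_prime_gb
    return delta, delta / card_vram_gb

# GPU-Z readings quoted above: 10.5 GB on 4.12 vs 9.4 GB on 4.11,
# measured on an 11 GB GTX 1080 Ti.
delta_gb, share = overhead(10.5, 9.4, 11.0)
print(f"OptiX Prime overhead: {delta_gb:.1f} GB ({share:.0%} of the card)")
# -> OptiX Prime overhead: 1.1 GB (10% of the card)
```

    The 28% figure for Windows 10 would additionally count the VRAM that WDDM reserves for the display driver, which is a separate cost from OptiX Prime itself.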

     

    Post edited by swordkensia on
  • swordkensia said:

    This OptiX Prime being on by default issue is NOT GOOD. [...]

    By my understanding of the preceding comments, an RTX card like the 2080 Ti is not affected by the OptiX issue, if that is indeed the issue at hand.

  • swordkensia said:

    This OptiX Prime being on by default issue is NOT GOOD. [...]

    By my understanding of the preceding comments, an RTX card like the 2080 Ti is not affected by the OptiX issue, if that is indeed the issue at hand.

    Interesting. IF that is indeed the case, that changes things a touch, though tough luck for those on older HW.

    Richard, would it be possible to have an official statement that OptiX does not add any significant memory overhead on RTX-based cards?

    SK

  • RayDAnt Posts: 956
    edited October 2019

    swordkensia said:

    This OptiX Prime being on by default issue is NOT GOOD. [...]

    Richard, would it be possible to have an official statement that OptiX does not add any significant memory overhead on RTX-based cards?

    SK

    RTX cards do not use OptiX Prime because they have dedicated hardware-level computing structures just for raytracing. Therefore there will be no additional general use memory hit (like there is with OptiX Prime) when 20xx cards are used for raytraced rendering in RTX-aware applications.

    Post edited by RayDAnt on
  • marble Posts: 6,159

    swordkensia said:

    This OptiX Prime being on by default issue is NOT GOOD. [...]

    Well, there's the other problem that has been mentioned here, if you render animations. The OptiX Prime VRAM issue only appeared for me when I started rendering an image series for an animation. The scene was actually relatively small (well within the VRAM limits) but it fails on iterations after the first frame. Something is amiss with the VRAM management, but I have no idea whether NVIDIA or DAZ is at fault. Clearly there is a cumulative increase in VRAM occupation when rendering a series.
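    The cumulative behavior described here can be modeled with a toy sketch. All the numbers below are made up; it only illustrates why frame 1 could fit on the GPU while every later frame falls back to CPU if some per-frame allocation is never freed between frames:

```python
def render_series(vram_gb, scene_gb, unfreed_per_frame_gb, frames):
    """Which device each frame lands on if per-frame allocations leak."""
    used = scene_gb
    devices = []
    for _ in range(frames):
        devices.append("GPU" if used <= vram_gb else "CPU")
        used += unfreed_per_frame_gb  # hypothetical data not released
    return devices

# 8 GB card, 5 GB scene, 4 GB retained per frame: only frame 1 stays on GPU.
print(render_series(8.0, 5.0, 4.0, 3))  # -> ['GPU', 'CPU', 'CPU']
```

    If the retained amount were zero (memory properly released between frames), every frame would render on the GPU, which matches the behavior reported with OptiX Prime disabled in 4.11.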

  • RayDAnt Posts: 956
    edited October 2019
    marble said:

    Well, there's the other problem that has been mentioned here, if you render animations. The OptiX Prime VRAM issue only appeared for me when I started rendering an image series for an animation. The scene was actually relatively small (well within the VRAM limits) but it fails on iterations after the first frame. Something is amiss with the VRAM management, but I have no idea whether NVIDIA or DAZ is at fault. Clearly there is a cumulative increase in VRAM occupation when rendering a series.

    Fwiw, if you can render individual frames of your animation successfully, that is conclusive proof that OptiX Prime's increased memory usage is NOT to blame for your issue. Iray context-to-context scheduling (either on the Iray or Daz Studio side of things) most likely is.

    I'd highly recommend submitting a bug report to Daz for this (making sure to mention that single frames work but whole sequences don't) since, regardless of whether Iray or Daz Studio is at fault, DS's developers should be able to tune for it. I have noticed issues in recent releases of Daz Studio with active Iray contexts not always following start/stop directives coming directly from the Daz Studio GUI. E.g. pressing "Cancel" on the pop-up dialog for an active render unlocks the main DS window's GUI, but iteration updates continue to visibly accumulate in the separate render window.

    I haven't bothered reporting this last issue to Daz since, for my purposes, it's actually sort of a useful extra feature. But it and your issue with failing sequences are most likely related.

    Post edited by RayDAnt on
  • marble Posts: 6,159
    edited October 2019
    RayDAnt said:

    Fwiw if you can render individual frames of your animation successfully that is conclusive proof that OptiX Prime's increased memory usage is NOT to blame for your issue. Iray Context-to-Context scheduling (either on the Iray or Daz Studio side of things) most likely is.

    I'd highly recommend submitting a bug report to Daz for this (making sure to mention the single frames working versus whole sequences not) since - regardless of whether it's Iray or Daz Studio at fault - DS's developer's should be able to tune for it. I have noticed issues in recent releases of Daz Studio with active Iray Contexts not always following start/stop directives coming directly from the Daz Studio GUI. Eg. pressing "cancel" on the pop-up dialog for an active render unlocks the main DS window's GUI, but iteration updates continue to visibly accumulate in the separate render window.

    I haven't bothered reporting this last issue to Daz since, for my purposes, it's actually sort of a useful extra feature. But it and your issue with failing sequences are most likely related.

    I see your point about single frame vs. series. I'm doing a test at the moment with a single render in both 4.11 and 4.12 Beta. The scene is more complex than the one that was failing in the image series; this one has 3 G8 figures, a room, and furniture, and the first try is in 4.12 Beta. According to GPU-Z I'm up to 7 GB of VRAM. I'll see what is reported with 4.11 and update this post.

    Here are the screenshots. Note that Optix Prime was unchecked (disabled) for 4.11.

    GPU_Z_4_11.jpg
    537 x 673 - 85K
    GPU_Z_4_12.jpg
    537 x 673 - 168K
    Post edited by marble on
  • marble Posts: 6,159
    edited October 2019

    One further point about OptiX Prime and the image series. I tried a series render using 4.11 with OptiX Prime switched on and then off (this was the animation scene that I was having problems with initially, not the one used for the test in my immediately previous post).

    4.11 with Optix Prime on: the first frame rendered using GPU, the second and subsequent frames fell back to CPU.

    4.11 with Optix Prime unchecked (off): all frames rendered with GPU.

    So I'm still not convinced that Optix Prime is not the culprit, at least in part.

    Post edited by marble on
  • mavante Posts: 734
    inquire said:
    mavante said:
    itinerant said:
    inquire said:
    itinerant said:
    inquire said:

    Again, for the Mac version of DAZ Studio 4.12: Catalina, the new macOS, is, it seems, just days away from release. Will 4.12 run in Catalina? I ask because Catalina has supposedly dropped OpenGL and OpenCL in favor of Metal. If DAZ Studio 4.12 will not work in Catalina, will there be a version of DAZ Studio that will work with Catalina, that is, Metal?

     

    I have the latest Catalina beta installed.  The install through DIM downloaded successfully, but failed to install.  I will try the install again using the Manual download later tonight.  (Daz servers are running a bit slow right now.)

    Oh, well please let us know what happens if you try to run DAZ Studio in Catalina. Did you try running a previous version of DAZ Studio in Catalina?

     

    I was able to install 4.12 manually rather than through DIM (in fact, it still shows up as a Product Update in Install Manager).  I have loaded some previously saved scenes and rendered them, both in 3DL and Iray.

    Can you say how you are able to use Iray at all on a Mac running Catalina? I've read many (I've lost count of how many) ostensibly authoritative statements that Apple crushed any and all support for NVIDIA GPUs after High Sierra 10.13.6.

    Lordy, all this is confusing for mere techno-speak-challenged mortals who wanted to do art, not become computer tech specialists ...

    I think itinerant means that the render is being done by the CPU, not the GPU, in Catalina. I'm a Mac user, too, and I've read that Apple was dropping both OpenGL and OpenCL in favor of Metal in Catalina. But perhaps this has not happened, at least not yet. If that's the case, then YEA!!!

    But, yes, I've also read that there are no drivers in Catalina that will support NVIDIA graphics cards.

    Perhaps someone who works at DAZ3D will explain these things to us. I hope so, and that's what I'm asking for. I'm not tech savvy, and I don't want to spread anything that is untrue.

    Thanks, Inquire. It seems that either nobody knows, or nobody cares.

  • Bobvan Posts: 2,609
    edited October 2019

    I just had the same mouse button and Iray preview glitches happen on my desktop, so it's not a PC issue. The simple workaround is to avoid using the mouse wheel.

    Post edited by Bobvan on
  • Padone Posts: 2,195
    edited October 2019
    marble said:

    4.11 with Optix Prime on: the first frame rendered using GPU, the second and subsequent frames fell back to CPU.

    4.11 with Optix Prime unchecked (off): all frames rendered with GPU.

    Looks like either an out-of-VRAM issue or a bad driver. Assuming that CUDA takes about 1 GB for the working buffer, OptiX takes another 1 GB, and Iray itself may double the scene data when rendering animations, you may try a simple animation scene taking less than 2 GB of VRAM itself. If this renders fine with OptiX then it's an out-of-VRAM issue; otherwise it's likely a driver issue.
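    That back-of-envelope budget can be written as a quick check. The 1 GB figures and the 2x duplication factor are this post's assumptions, not measured values:

```python
CUDA_BUFFER_GB = 1.0    # assumed CUDA working buffer
OPTIX_GB = 1.0          # assumed OptiX Prime overhead
ANIM_DUPLICATION = 2.0  # assumed scene-data doubling for animations

def fits_in_vram(scene_gb, card_gb, animating=True):
    """Does the scene plus assumed overheads fit on the card?"""
    factor = ANIM_DUPLICATION if animating else 1.0
    return CUDA_BUFFER_GB + OPTIX_GB + scene_gb * factor <= card_gb

print(fits_in_vram(2.0, 6.0))  # True: 1 + 1 + 2*2 = 6 GB just fits a 6 GB 1060
print(fits_in_vram(2.5, 6.0))  # False: 1 + 1 + 5 = 7 GB does not
```

    Under this model, a 2 GB animation scene is right at the limit of a 6 GB card, which is consistent with the suggested test.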

    You may also try rendering with the viewport in Iray mode, since this way Iray doesn't reload the textures for every frame. This should speed things up and may also help with VRAM handling.

    In my system I can render animations fine. I use 4.11 with a 6 GB 1060 card and I keep OptiX on and the viewport in Iray mode. But I usually render the characters separately from the scene, then composite them later. This way I get about one to two minutes per frame, which is not bad for animation.

    The full system specs are in my signature. I also reserve the 1060 for rendering; that is, my monitor is wired to the mobo, so I use the Vega card for the viewport. Maybe this also helps Iray better handle the 1060's VRAM.

    Though I only use daz studio for simple personal projects and I use blender for "real" production.

    Post edited by Padone on
  • marble Posts: 6,159
    edited October 2019
    Padone said:

    Though I only use daz studio for simple personal projects and I use blender for "real" production.

    Yes, I keep coming back to thinking about Blender, but I always find excuses to put off learning that scary node system. I have an 8 GB 1070, so it shouldn't be too much to ask for a simple scene with two characters and a few props to render a few frames of animation. I even have all the character texture sizes reduced by Scene Optimizer. But it still fails in Iray. So I might, at last, have to knuckle down and learn how to transfer scenes and set up materials in Blender. Eevee could probably render my short animations in a few minutes, but I suspect it might take hours to set up the materials.

    I updated the driver to the latest, by the way.

    Post edited by marble on
  • marble said:

    Or I could figure out how to send my scenes to Blender and render with Cycles or Eevee. Or - expensive but perhaps cheaper than a new (high capacity VRAM) GPU - I could invest in Octane.

    There is a *huge* difference between how the Blender renderers treat the scene and how Iray works. Blender renders only what the camera can see; Iray, for unknown reasons, always works with the whole scene, even if you make the items outside the view invisible *AND* mark them not to be rendered.
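    As a rough illustration of the "renders only what the camera can see" point, camera-frustum culling looks roughly like this. This is a deliberately simplified symmetric view-cone test written for illustration, not Blender's or Iray's actual code:

```python
import math

def sphere_in_view(center, radius, fov_deg):
    """Conservative visibility test of a bounding sphere against a
    symmetric view cone; camera sits at the origin looking down +z."""
    x, y, z = center
    if z + radius <= 0:           # entirely behind the camera
        return False
    half = math.tan(math.radians(fov_deg) / 2)
    limit = z * half + radius     # widest allowed offset at this depth
    return abs(x) <= limit and abs(y) <= limit

print(sphere_in_view((0, 0, 10), 1.0, 60))   # True: right in front
print(sphere_in_view((50, 0, 10), 1.0, 60))  # False: far off-axis, cullable
```

    An engine that culls this way never has to upload the off-axis object's geometry or textures; an engine that keeps the whole scene resident pays for them regardless. (Note that reflections and shadows complicate this, as the next reply points out.)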

  • There is a *huge* difference between how the Blender renderers treat the scene and how Iray works. Blender renders only what the camera can see; Iray, for unknown reasons, always works with the whole scene, even if you make the items outside the view invisible *AND* mark them not to be rendered.

    Iray, and I would think Blender, need things from out of immediate view for reflections and shadows. Hidden/non-rendering items* are not sent to the renderer - or if they are, it's a bug and should be reported.

    * Items with 0 Cutout Opacity are sent, since the engine won't know they are invisible until it has evaluated their surface properties.

  • Hidden/non-rendering items* are not sent to the renderer - or if they are, it's a bug and should be reported.

    * Items with 0 Cutout Opacity are sent, since the engine won't know they are invisible until it has evaluated their surface properties.

    I am judging from the memory usage. When I make items invisible (not through opacity) and/or mark them as not rendering, the memory usage is still much bigger compared to the memory usage when I delete the same items. But deleting is somewhat risky if I forget to make a backup copy (I think making auto-backup copies was one of the suggestions I made some time ago).

  • jestmart Posts: 4,379

    Seems like invisible objects' image maps are still using memory.

  • I am judging from the memory usage. When I make items invisible (not through opacity) and/or mark them as not rendering, the memory usage is still much bigger compared to the memory usage when I delete the same items. [...]

    Of the scene in DS or the render? Yes, they will continue to consume memory in DS, but they won't (as far as I know) be passed to Iray or 3Delight.

  • inquire Posts: 1,737
    mavante said:
    inquire said:
    mavante said:
    itinerant said:
    inquire said:
    itinerant said:
    inquire said:

    Again, for the Mac version of DAZ Studio 4.12: Catalina, the new Mac OS, is, it seems, just days away from release. Will 4.12 run in Catalina? I ask because Catalina has supposedly dropped Open GL and Open CL in favor of Metal. If DAZ Studio 4.12 will not work in Catalina, will there be a vesion of DAZ Studio that will work with Catalina, that is, Metal?

     

    I have the latest Catalina beta installed.  The install through DIM downloaded successfully, but failed to install.  I will try the install again using the Manual download later tonight.  (Daz servers are running a bit slow right now.)

    Oh, well please let us know what happens if you try to run DAZ Studio in Catalina. Did you try running a previous version of DAZ Studio in Catalina?

     

    I was able to install 4.12 manually rather than through DIM (in fact, it still shows up as a Product Update in Install Manager).  I have loaded some previously saved scenes and rendered them, both in 3DL and Iray.

    Can you say how you are able to use Iray at all on a Mac running Catalina? I've read many (I've lost count of how many) ostensibly authoritative statements that Apple crushed any and all support for NVidia GPUs after High Sierra 10.13.6.

    Lordy, all this is confusing for mere techno-speak-challenged mortals who wanted to do art, not become computer tech specialists ...

    I think itinerant means that the render is being done by the CPU, not the GPU, in Catalina. I'm a Mac user, too, and I've read that Apple was dropping both OpenGL and OpenCL, in favor of Metal, in Catalina. But perhaps this has not happened, at least, not yet. If that's the case, then YEA!!! 

    But, yes, I've also read that there are no drivers in Catalina that will support Nvidia graphics cards.

    Perhaps someone who works there at DAZ3D will explain these things to us. I hope so, and that's what I'm asking for. I'm not tech savvy, and I don't want to spread anything that is untrue.

    Thanks, Inquire. It seems that either nobody knows, or nobody cares.

    Well, I really hope that someone will explain this and enlighten us.

  • marblemarble Posts: 6,159
    marble said:

    Or I could figure out how to send my scenes to Blender and render with Cycles or Eevee. Or - expensive but perhaps cheaper than a new (high capacity VRAM) GPU - I could invest in Octane.

    There is a *huge* difference between how the Blender renderers treat the scene and how Iray works. Blender renders only what the camera can see; Iray, for unknown reasons, always works with the whole scene, even if you make the items outside the scene invisible *AND* mark them not to be rendered.

    Iray, and I would think Blender, need things from out of immediate view for reflections and shadows. Hidden/non-rendering items* are not sent to the renderer - or if they are, it's a bug and should be reported.

    * Items with 0 Cutout Opacity are sent, since the engine won't know they are invisible until it has evaluated their surface properties.

    What about IRay Section Planes? I had one in the scene that was falling back to CPU and I had it there specifically to hide parts of the scene I didn't want to be sent to the GPU and use VRAM. Didn't seem to help.

  • marble said:
    marble said:

    Or I could figure out how to send my scenes to Blender and render with Cycles or Eevee. Or - expensive but perhaps cheaper than a new (high capacity VRAM) GPU - I could invest in Octane.

    There is a *huge* difference between how the Blender renderers treat the scene and how Iray works. Blender renders only what the camera can see; Iray, for unknown reasons, always works with the whole scene, even if you make the items outside the scene invisible *AND* mark them not to be rendered.

    Iray, and I would think Blender, need things from out of immediate view for reflections and shadows. Hidden/non-rendering items* are not sent to the renderer - or if they are, it's a bug and should be reported.

    * Items with 0 Cutout Opacity are sent, since the engine won't know they are invisible until it has evaluated their surface properties.

    What about IRay Section Planes? I had one in the scene that was falling back to CPU and I had it there specifically to hide parts of the scene I didn't want to be sent to the GPU and use VRAM. Didn't seem to help.

    The section plane is evaluated by Iray - DS just sends the information that it is there - so Iray needs the full data for all clipped objects in order to work out which bits to draw and which to hide from view, or from light paths. Section Planes aren't for saving memory; they are to allow one to place a camera beyond a wall or the like and render it see-through while having it still show in reflections or block light (though by default DS hides the clipped areas from light paths too).

  • marblemarble Posts: 6,159

    Wasn't there some talk about NVidia looking at offloading textures to system RAM like Octane can do?

  • RayDAntRayDAnt Posts: 956
    marble said:

    Wasn't there some talk about NVidia looking at offloading textures to system RAM like Octane can do?

    Yes. Unfortunately it's currently platform-limited to Linux based systems.

  • marblemarble Posts: 6,159
    edited October 2019
    RayDAnt said:
    marble said:

    Wasn't there some talk about NVidia looking at offloading textures to system RAM like Octane can do?

    Yes. Unfortunately it's currently platform-limited to Linux based systems.

    If I understand what I'm seeing in my searches, Blender Cycles now has Out-of-Core capabilities too. That makes Blender an even more attractive alternative, doesn't it? I do hope mCasual releases his mcjteleblender for 2.8 soonish 'cause I feel the need to switch.

    Post edited by marble on
  • RayDAntRayDAnt Posts: 956
    edited October 2019
    marble said:
    RayDAnt said:
    marble said:

    Wasn't there some talk about NVidia looking at offloading textures to system RAM like Octane can do?

    Yes. Unfortunately it's currently platform-limited to Linux based systems.

    If I understand what I'm seeing in my searches, Blender Cycles now has Out-of-Core capabilities too. That makes Blender an even more attractive alternative, doesn't it? I do hope mCasual releases his mcjteleblender for 2.8 soonish 'cause I feel the need to switch.

    Under Linux, Iray has had full support for both out-of-core (aka system-RAM-augmented GPU) rendering and video RAM pooling across a theoretically unlimited number of discrete graphics cards for 4+ years. Since most Iray usage is at the Enterprise level (where Linux-based server farms are the norm), there has so far been very little incentive on the Nvidia end to get this functionality fully working under Windows or Mac OS. But one can hope that pressure from developments like you describe will help make that happen.

    Post edited by RayDAnt on
  • Padone said:

    The DS 4.12.x.x version(s) of iray have RTX support.

    Thank you Richard and @RayDAnt and @f7eer for pointing me to this great news. Especially @RayDAnt for the comparisons between OptiX Prime and RTX, which are impressive. For some reason I was dumb enough to skip this milestone altogether. So Daz Studio 4.12 does use RTX for raytracing.

    How do we know the Iray render is not falling back to CUDA rendering? Is there something in the render log that confirms we are actually using RTX for raytracing? Thanks
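    Not an authoritative answer, but one low-tech way to check is to open the log (Help > Troubleshooting > View Log File) after starting a render and search it for device-related lines. A minimal sketch of that search in Python - the keyword list and the Windows log path in the comments are assumptions for illustration, not confirmed Iray log wording:

    ```python
    import re

    # Hypothetical helper: scan Daz Studio log text for lines that hint at
    # the active Iray render device. The keywords are guesses - the exact
    # wording in the log varies between Iray/Daz Studio versions.
    KEYWORDS = ("CUDA", "OptiX", "RTX", "CPU", "iray")

    def device_hints(log_text, keywords=KEYWORDS):
        """Return log lines mentioning any of the device-related keywords."""
        pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
        return [line for line in log_text.splitlines() if pattern.search(line)]

    # Usage sketch (path shown is the usual default Windows log location -
    # adjust for your own install):
    #   from pathlib import Path
    #   text = Path.home().joinpath(
    #       "AppData/Roaming/DAZ 3D/Studio4/log.txt").read_text(errors="ignore")
    #   print("\n".join(device_hints(text)[-20:]))
    ```

    Lines naming your GPU alongside "CUDA"/"OptiX" (rather than "CPU") near the start of a render are the sort of thing to look for, though someone from Daz or Nvidia would have to confirm the exact messages that indicate RTX acceleration.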

Sign In or Register to comment.