Rendering an animation... needs a NASA computer?

Comments (124)

  • DustRider Posts: 2,880
    kyoto kid said:
    drzap said:
    drzap said:
    ebergerly said:

    Okay, so now we need to complain that DAZ Studio doesn't have a de-noiser feature like Blender :)

    Very cool. Here's a 9-second render with only like 100 samples and de-noising on. And the same render with de-noising off and 256 samples is all noisy and ugly.

    I'm wondering if anyone in this thread knows what OptiX is? If so, they might know what the newly released Nvidia OptiX 5.0 is.

    I think that OptiX Prime acceleration used to be an option for Iray renders, but I can't find it in the render settings for Studio 4.10. Has it been removed, hidden somewhere, or did I just imagine it?

    From a short web search, I think that OptiX may be the lower-level engine that Iray is built on, but I may have got that wrong. If I'm right, version 5.0 probably means they have upgraded the whole thing a bit.

    OptiX 5 is a denoiser. Something to use Titan V tensor cores on. Not sure if Daz Studio will get the AI.

    ...and who here can really afford to drop $3,000 on a GPU card?

    From the info I linked to above, it shows that support for Titan V (Volta) was implemented in OptiX 5, but I haven't been able to find anything specific about the AI denoiser being only for the Titan V (though it probably is their best platform for it). OptiX 5 is simply the next version of their OptiX SDK (optimized path tracing), with the new features/enhancements I noted above.

  • drzap Posts: 795
    kyoto kid said:

    ...and who here can really afford to drop $3,000 on a GPU card?

    Redshift will have the denoiser in its next update.  If the results are what I expect, I'll be dropping that $3,000.  That's my render times cut in half.  The card will pay for itself.

  • ebergerly Posts: 3,255
    edited January 2018

    This denoising thing has me very interested. I found a tech paper written by the guys (from S. Korea) who, I think, developed the denoising algorithm used by the guy who introduced it in Blender. Aside from being a brain-exploding tech paper with incomprehensible math, from what I can extract from it, as well as from some discussions on the Blender site, it seems to work something like this:

    Y'know how ambient occlusion is just a fake shadow, determined by a smart guess based on the geometry? Basically, if the geometry/normals indicate a corner (like where the base of a building meets the ground), then it throws a shadow there. It's not an actual ray-traced shadow, but a fake using an intelligent guess from the normals pass or whatever. At least that's how I understand it.

    Well, it seems this de-noising is similar (if my hunch is correct). Instead of bringing the final render into Photoshop and applying a blur, it's a bit like bringing the render PLUS a bunch of the associated render passes into Photoshop and using the information in those passes to figure out how best to apply the blur.

    I mean, at the end of the day, it's all about figuring out what color (RGB) each pixel should be. So if you can take all the information in the important passes and extract enough to figure out what the color of a pixel should be, you are basically applying a very intelligent post-processing filter. And you can get a much better result without having to actually calculate it with time-consuming ray tracing. Which is why sometimes it works great, but other times the result looks too blurry: the only way to really get the correct color of each pixel is to crank up the samples and render for another 10 hours, using the ACTUAL geometry and materials and lights.

    If you think about it, say you have a sphere sitting on the ground that should have a shadow underneath it, but you have speckly noise: bright pixels that don't belong. If you're smart, you can use the normals to determine that the surface is facing down, that it should be in shadow, that there is no indirect light there (the indirect pass shows nothing), and so on. So you can kinda guess that the outlying bright pixel doesn't belong, and you blur it. I'm guessing that's what de-noising is doing.

    Now I need to figure out what "linear regression" really means, cuz apparently they're using a lot of it :)

    Post edited by ebergerly on
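To make the hunch above concrete, here is a minimal sketch (Python/NumPy, illustrative only) of a cross-bilateral filter: it blurs a noisy beauty pass, but weights each neighbor by how well its normal-pass value agrees with the center pixel, so geometric edges survive the blur. This is not the paper's actual algorithm (that one fits local linear regressions of color against the feature passes); it just illustrates the core idea that auxiliary passes tell the filter where not to blur. The names "beauty" and "normals" are stand-ins, not anything from Iray or Blender.

```python
# Minimal pass-guided denoising sketch: a cross-bilateral filter that blurs
# the noisy beauty pass but refuses to blur across normal-pass edges.
# Expects float arrays.
import numpy as np

def cross_bilateral(beauty, normals, radius=3, sigma_spatial=2.0, sigma_normal=0.2):
    """beauty: (H, W, 3) noisy render; normals: (H, W, 3) normal pass."""
    h, w, _ = beauty.shape
    out = np.zeros_like(beauty)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_spatial**2))  # distance falloff
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            win = beauty[y0:y1, x0:x1]
            # Down-weight neighbors whose normals disagree with the center
            # pixel, so creases (like the contact shadow under a sphere)
            # keep their dark pixels instead of absorbing bright outliers.
            dn = normals[y0:y1, x0:x1] - normals[y, x]
            w_norm = np.exp(-np.sum(dn**2, axis=-1) / (2.0 * sigma_normal**2))
            w_spat = spatial[y0 - y + radius:y1 - y + radius,
                             x0 - x + radius:x1 - x + radius]
            wgt = (w_norm * w_spat)[..., None]
            out[y, x] = (win * wgt).sum(axis=(0, 1)) / wgt.sum()
    return out

# Demo with random stand-ins for real render passes:
beauty = np.random.rand(64, 64, 3).astype(np.float32)
normals = np.random.rand(64, 64, 3).astype(np.float32) * 2 - 1
smooth = cross_bilateral(beauty, normals)
```

A real denoiser adds more guide passes (albedo, depth, per-pixel variance) and, in the paper's case, fits a small linear model per neighborhood instead of using a fixed Gaussian weight, which is where the "linear regression" comes in.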
  • drzap Posts: 795
    edited January 2018
    DustRider said:

    From the info I linked to above, it shows that support for Titan V (Volta) was implemented in OptiX 5, but I haven't been able to find anything specific about the AI denoiser being only for the Titan V (though it probably is their best platform for it).

    Without those tensor cores, it will probably behave just like an ordinary denoiser.  The Pascal cards don't have a lot of compute power for AI.

    Post edited by drzap on
  • kyoto kid Posts: 41,847
    drzap said:

    Redshift will have the denoiser in its next update.  If the results are what I expect, I'll be dropping that $3,000.  That's my render times cut in half.  The card will pay for itself.

    ...crikey, I could build a pretty nice system for 3 grand, even given the spike in consumer GPU prices.

  • drzap Posts: 795
    edited January 2018
    kyoto kid said:

    ...crikey, I could build a pretty nice system for 3 grand, even given the spike in consumer GPU prices.

    My render target is 30 sec/frame (24 fps) on eight GTX 1080 Ti's.  My system only has enough slots for 3 double-wide cards, so I was going to be in the market for Supermicro's 8-GPU server.  Those things are almost 4 grand for the bare-bones box.  Not counting the purchase of seven GPUs.  Not counting the purchase of 2 Xeons and RAM.  Now, if the Titan performs as I expect with the AI denoiser, it should be the equivalent of three 1080 Ti's.  So, magically, I will have room in my workstation for the equivalent of nine 1080 Ti's, thereby exceeding my render target and saving a considerable amount of money and trouble.

    Post edited by drzap on
  • kyoto kid Posts: 41,847

    ...granted, you do animation and, from the sound of it, as more than just a hobby.

    I'm looking to head back to 3DL for my illustration work, particularly with the release of IBL Master, as Iray just takes far too long when one doesn't have an up-to-date GPU.  Given the ongoing shortage of mid- to high-range consumer GPU cards and the resulting high prices being demanded (in some cases more than double what a card cost at rollout), I'm not going to be in the market for a GPU upgrade in the foreseeable future unless I come into a big pile of money.

  • drzap Posts: 795
    edited January 2018
    kyoto kid said:

    ...granted, you do animation and, from the sound of it, as more than just a hobby.

    .... sadly, just a little more than a hobby.  Certainly not for money at this point.  But I have learned that many opportunities come to those who are prepared for them, and unfortunately, I don't have much patience for compromise.  If I'm going to do it, I will swing for the fences.  Lots of noodle and dumpling meals in my future, but as your Lombardi quote says, "....chase perfection."

    Post edited by drzap on
  • kyoto kid Posts: 41,847

    ...on the backside of life here, what with being retired. Oh, I occasionally land a commission for a book cover or other illustration, which gives me a few extra zlotys in the pocket now and then, but for myself, I'm primarily doing this for the love of it and to illustrate my writings.  I already had music taken away from me by advancing arthritis; with this I still have a creative outlet.

  • drzap Posts: 795
    edited January 2018

    "on the backside of life here ..."

    To the contrary, my friend!  As long as you have a love to chase, you are never on the backside.  "Though our outside is withering away, the inside is being renewed from day to day."

    Post edited by drzap on
  • Padone Posts: 4,003
    edited January 2018
    ebergerly said:

    This denoising thing has me very interested.

    Just for the sake of completeness: Iray does have a denoiser filter in the render settings, but AFAIK it just doesn't work at all. No matter what you try, it does nothing. So if anybody is able to get something out of this thing, you're very welcome.

     

    drzap said:

    My render target is 30 sec/frame (24 fps) on eight GTX 1080 Ti's.

    Mine is real-time rendering on an average card. I believe this will be possible with EEVEE. Of course it will not be exactly the same as a "real" render, but given the usual compromises in animation, I feel it will be very close anyway. Toward this goal, I guess an option for DAZ Studio would be to support Iray's real-time mode, even if I don't know how good it is and/or whether it can be integrated into the viewport.

     

    [Attachment: denoiser.jpg, 487 x 455]
    Post edited by Padone on
  • KlaudM Posts: 76

    After some tests I can say that:

    1080p, 250 iterations, Noise Filter enabled (5, 5, 0.5, thanks to Padone)

    These are really good settings; obviously far from the original quality, but absolutely acceptable for the moment, and it takes "only" about 2 minutes per frame.

    Pity that Daz reloads the scene at each frame, otherwise the time would be much lower, almost half.

  • Silver Dolphin

    Hi guys,

    The title reflects what I'm thinking after the last image rendered...

    At this time I have 2x GTX 1080; the software used is DAZ Studio 4.10 Pro with Iray.

    The last image took 1h to render, max 5000 iterations (after 1h it was at 3100), at 4K resolution (3840x2160). The scene is nothing complex: 3 characters in standing poses, a wall in the background and marble on the floor. For light I used a preset from iRadiance HDRI Studio. The view isn't particularly near, just enough to cover half the screen with the characters.

    So, if I wanted to make 5 mins of animation, do I really need to render for 9,000 hrs???!!! (375 days!!!!)
    - 1h for 1 frame, 30 hrs for 1 second (30 fps), 5 mins are 300 seconds... 300 x 30.

    I'm really wondering what type of GPU (or CPU, at this point I don't know what to think) they have at, say, Marvel Studios or similar to animate their 3D models... and then they must also add FX!

    Also I'd like to know how much a VCA server could reduce the time, from 9,000 hrs to...?

    OK, maybe 4K is too much; at 1080p the same image takes 32 mins, so we go from 9,000 hrs to 4,800 hrs... only seven months.

    Buy iClone 7 and 3DXchange, import your Daz models and scenes into iClone, and animate to your heart's content. Iray inside of DAZ Studio is not really set up to do animation! Daz Iray is great for stills but not animation.
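The arithmetic in the quoted post generalizes into a quick estimator that is handy for any of these what-if scenarios. A minimal sketch (Python; the numbers are just the ones quoted above):

```python
# Back-of-the-envelope animation render-time estimate.
def render_time(seconds_of_animation, fps, hours_per_frame):
    frames = seconds_of_animation * fps
    hours = frames * hours_per_frame
    return frames, hours, hours / 24.0  # frames, total hours, total days

# 5 minutes at 30 fps, 1 hour per 4K frame:
print(render_time(300, 30, 1.0))        # (9000, 9000.0, 375.0)
# Same animation at 32 minutes per 1080p frame:
print(render_time(300, 30, 32 / 60.0))  # (9000, 4800.0, 200.0)
```

The 4K case reproduces the 9,000 hours / 375 days in the post, and the 1080p case lands at 200 days, the "only seven months" mentioned above.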

  • Padone Posts: 4,003
    edited January 2018

    Noise Filter enabled (5, 5, 0.5, thanks to Padone)

    Pity that Daz reloads the scene at each frame, otherwise the time would be much lower, almost half.

    AFAIK the noise filter just does nothing: you can turn it off and you'll notice no variation. You can use After Effects or similar software for post-denoising. As for the scene reload, you can try keeping the viewport in Iray mode while rendering; this way it shouldn't reload the scene for each frame.

    Also I agree with Silver Dolphin that iClone 7 is a good option for animation.

    Post edited by Padone on
  • outrider42 Posts: 3,679

    KlaudM said:

    1080p, 250 iterations, Noise Filter enabled (5, 5, 0.5, thanks to Padone)

    Pity that Daz reloads the scene at each frame, otherwise the time would be much lower, almost half.

    There is a trick to stop Daz from reloading the scene for each frame of an animation: before you render, switch the viewport to Iray mode. That's it! Doing this keeps the scene in memory instead of dumping it and starting over for every single frame. A massive time saver!

    Why this hasn't been advertised more is beyond me.

    Now my question: what is the difference in render time without that noise filter setting?

  • Fixme12 Posts: 589

    So, if I wanted to make 5 mins of animation, do I really need to render for 9,000 hrs???!!! (375 days!!!!)
    - 1h for 1 frame, 30 hrs for 1 second (30 fps), 5 mins are 300 seconds... 300 x 30.

    For that reason, I gave up on doing animations. I am not willing to lower render quality for that. A good render takes me at least 1 hour of render time, which means 24 hours for 1 second of a movie, a month of rendering for a 30-second movie. No way, so no. Not for me. I just have one computer.

    For that reason I'll give up most of my 3D hobby; even ZBrush or Modo can't bring me joy anymore, and I'm fully back into spending on hardware synths & Ableton... welcome back, music world.

  • drzap Posts: 795
    edited February 2018

    And the preliminary benchmarks are in... great news for animation renderers!  The Titan V trounces the 1080 Ti in V-Ray and FurryBall.

    V-Ray renders are over 60% faster and, more importantly for me, FurryBall RT is almost twice as fast with Volta power.  This is a dream come true.  Things are finally coming together on the hardware-decision side.  Here is a link: https://www.pugetsystems.com/blog/2017/12/12/A-quick-look-at-Titan-V-rendering-performance-1083/

    Post edited by drzap on
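Taking those quoted speedups at face value, the price/performance question almost answers itself. A rough sketch; the prices here are assumptions (launch MSRPs, not the inflated street prices of early 2018):

```python
# Rough perf-per-dollar comparison using the speedups quoted above.
# Prices are illustrative assumptions, not current market prices.
TITAN_V_PRICE = 3000.0    # USD, assumed
GTX_1080TI_PRICE = 700.0  # USD launch MSRP, assumed

for renderer, speedup in [("V-Ray", 1.6), ("FurryBall RT", 2.0)]:
    ratio = (speedup / TITAN_V_PRICE) / (1.0 / GTX_1080TI_PRICE)
    print(f"{renderer}: Titan V is {speedup:.1f}x faster per card, "
          f"but {ratio:.2f}x the performance per dollar")
```

By this crude measure the Titan V delivers well under half the performance per dollar of a 1080 Ti, which is exactly the objection raised in the next comment; the counterarguments that follow (slots, power, tensor cores, WDDM) are about everything this ratio leaves out.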
  • ebergerly Posts: 3,255
    drzap said:

    And the preliminary benchmarks are in... great news for animation renderers!  The Titan V trounces the 1080 Ti in V-Ray and FurryBall.

    Why spend $3,000, though? Why not just buy two 1080 Ti's? Of course not at these prices, but if you can find some cheaper, I'd think the benefit/cost would be huge over the Titan V, wouldn't it?

     

  • UHF Posts: 518

    So, if I wanted to make 5 mins of animation, do I really need to render for 9,000 hrs???!!! (375 days!!!!)

    OK, maybe 4K is too much; at 1080p the same image takes 32 mins, so we go from 9,000 hrs to 4,800 hrs... only seven months.

    So... if you're wanting to render animations, then you have a different cost model.  Correct?  You need efficiency any way you can get it.  Correct?

    Octane is used in movies.  It's FAST... The render Dusty showed would be about 10 minutes in Octane (I use a single GTX 980 for my regular work).  The latest push in Octane is for 360 VR renders.  There are also no texture RAM restrictions in Octane, so I can do 8.5 GB renders on 4 GB video cards.

    BLR got rid of their render farm in favor of a single video card and Octane; here's some of their work:

  • drzap Posts: 795
    ebergerly said:

    Why spend $3,000, though? Why not just buy two 1080 Ti's? Of course not at these prices, but if you can find some cheaper, I'd think the benefit/cost would be huge over the Titan V, wouldn't it?

    2 reasons:

    1. Two 1080 Ti's = two slots and twice the power usage.  I only have 3 double-wide slots and I don't want to waste them.
    2. Volta has tensor cores, which could be useful with the forthcoming AI denoiser.  GTX cards are practically useless for AI.
    3. Only Titans, Quadros and Teslas can turn off that VRAM-consuming bug in Windows that everyone's complaining about, so I can make use of all 12 GB on the card.

    Oh, that's 3 reasons.

  • ebergerly Posts: 3,255
    edited February 2018
    drzap said:

    3. Only Titans, Quadros and Teslas can turn off that VRAM-consuming bug in Windows that everyone's complaining about, so I can make use of all 12 GB on the card.

    I'm curious about the VRAM-consuming bug. I keep looking for real info about it but can't find anything definite. Where did you see the info on it?

    BTW, wow, you're right about the power usage. Looks like a Titan V and a 1080 Ti use almost exactly the same power (350 watts), so two 1080 Ti's will draw an extra 350 watts over a single Titan V. Guess they must have made some major efficiency advances.

     

    Post edited by ebergerly on
  • drzap Posts: 795
    UHF said:

    Octane is used in movies.  It's FAST... The latest push in Octane is for 360 VR renders.

    BLR got rid of their render farm in favor of a single video card and Octane; here's some of their work:

    Yeah, Octane is some good stuff.  Hopefully it will support the Volta chips soon.  I use Redshift, which is even faster.  Nice animation there.  GPU renderers like Octane are in heavy use in small productions, but not in Hollywood movies yet.  Besides being VRAM-deficient (even with out-of-core render options), GPUs still can't do all of the simulation work that is required by CGI FX.  For example, Octane is not compatible with many of Maya's dynamic simulations.  Octane is barely on the radar in the Maya world (Maya world = Hollywood CGI FX).  I feel the GPU renderers' time is coming, but for now, at least on major FX films, the CPU renderers rule.  The good news is that GPU renderers are ideal for the little guys.  Which is why the Volta is so valuable.

  • drzap Posts: 795
    ebergerly said:

    I'm curious about the VRAM-consuming bug. I keep looking for real info about it but can't find anything definite. Where did you see the info on it?

    The reason for the VRAM suckage is that manufacturers have to comply with the Windows Display Driver Model (WDDM).  Nvidia provides software that allows you to disengage WDDM for cards that aren't driving a monitor.
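The software being referred to is Nvidia's TCC driver mode, which is toggled through nvidia-smi and is only accepted by Titan, Quadro and Tesla cards. A minimal sketch of checking and requesting it from Python; the query fields and the -dm flag below are from nvidia-smi's documented options, but verify them against your driver version before relying on them:

```python
# Inspect (and optionally flip) the Windows driver model per GPU.
# TCC opts a card out of WDDM, freeing the VRAM that WDDM reserves, but the
# card can no longer drive a monitor. Needs an admin prompt and a reboot.
import subprocess

def driver_models():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,name,driver_model.current,driver_model.pending",
         "--format=csv,noheader"],
        text=True)
    return [line.strip() for line in out.splitlines() if line.strip()]

def request_tcc(gpu_index):
    # -dm 1 requests TCC (0 = WDDM); takes effect after a reboot.
    subprocess.check_call(["nvidia-smi", "-i", str(gpu_index), "-dm", "1"])

for row in driver_models():
    print(row)  # e.g. "1, TITAN V, WDDM, WDDM"
```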

  • ebergerly Posts: 3,255
    drzap said:

    The reason for the VRAM suckage is that manufacturers have to comply with the Windows Display Driver Model (WDDM).  Nvidia provides software that allows you to disengage WDDM for cards that aren't driving a monitor.

    Yeah, that's what I've heard on some forums. But I'm looking for actual manufacturer references explaining the issue and the solution, because I've never seen anything official, only some forum posts. I'm not convinced it's real. Even on the GeForce forums, some are saying it's just a misunderstanding of how things work. Not sure who to believe.

     

  • UHF Posts: 518

    drZap:  That video was not done with Octane...  They did however convert their 3D components over to Octane and test it in VR.

  • drzap Posts: 795
    UHF said:

    drZap:  That video was not done with Octane...  They did however convert their 3D components over to Octane and test it in VR.

    Yeah, I understand, but I can clearly see it's not a big film like Hollywood makes.  It's a budget independent film, which is the perfect niche for Octane and Redshift right now.

  • drzap Posts: 795
    ebergerly said:

    Yeah, that's what I've heard on some forums. But I'm looking for actual manufacturer references explaining the issue and the solution, because I've never seen anything official, only some forum posts. I'm not convinced it's real. Even on the GeForce forums, some are saying it's just a misunderstanding of how things work. Not sure who to believe.

    This issue doesn't matter to me.  The only thing that matters to me is my performance in Redshift, and they officially recommend that users disable WDDM for more memory access and better performance.  You can't do that with a GeForce card, so the Titan is for me.

  • drzap Posts: 795
    UHF said:

    drZap:  That video was not done with Octane...  They did however convert their 3D components over to Octane and test it in VR.

    Oh, never mind.  I misunderstood your meaning.  I thought you were showing an Octane render.

  • No one has posted a solution for the WDDM issue on any of the other forums; I just did a Google search. On one Microsoft forum, the company rep danced around the issue: either they don't know about it exactly (his reply was BS), or they do and don't know how to fix it. I'm on Windows 7, so it doesn't bother me, and besides, the GPU I use for rendering Iray is my second one. But they should fix it.

  • ebergerly Posts: 3,255
    edited February 2018

    BTW, I just looked at my Task Manager and GPU-Z to see what the VRAM situation is. Prior to loading Studio, the Dedicated GPU Memory usage was 0 GB out of the total 11 GB VRAM, and that was shown in both apps; GPU "Hardware reserved memory" was only 137 MB (whatever that is). But as soon as I open Studio, both apps show that number jump up to 2 GB out of the 11 GB on the GTX 1080 Ti. Which makes me think that the Studio software is allocating 2 GB on the GPU, not Windows 10. And when I load a relatively small scene with just one G3, that number goes up to 3.3 GB, but only when the Iray preview gives the first grainy image. Which kinda makes sense, since that's when the GPU is doing its thing and grabbing data for the render off its VRAM.

    Seems to me that if Windows WDDM were allocating memory, it would do it prior to opening Studio. Of course, the numbers I'm reading may not mean what I think they mean.

    Post edited by ebergerly on
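One way to cross-check what Task Manager and GPU-Z report is to poll nvidia-smi directly: once before launching Studio, once after loading a scene, and again when the Iray preview starts, then compare the deltas. A minimal sketch (Python, assuming nvidia-smi is on the PATH; memory.used and memory.total are standard query fields):

```python
# Poll per-GPU VRAM so you can see exactly when the 2 GB jump happens.
import subprocess, time

def vram_snapshot():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return [tuple(int(v) for v in line.split(", "))
            for line in out.splitlines() if line.strip()]

while True:  # Ctrl+C to stop
    for idx, used, total in vram_snapshot():
        print(f"GPU {idx}: {used} / {total} MiB")
    time.sleep(5)
```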