Rendering an animation... needs a NASA computer?


Comments

  • ebergerly said:
    drzap said:
    ebergerly said:
    drzap said:
    ebergerly said:
    drzap said:


    3.  Only Titans, Quadros, and Teslas can turn off that VRAM-consuming bug in Windows that everyone's complaining about, so I can make use of all 12 GB on the card.

    Oh, that's 3 reasons.

    I'm curious about the VRAM consuming bug. I keep looking for real info about it but can't find anything definite. Where did you see the info on it? 

    BTW, wow you're right about the power usage. Looks like a Titan V and a 1080 Ti use almost exactly the same power (350 watts). So yeah, two 1080 Tis will give you an extra 350 watts over a Titan V. Guess they must have made some major efficiency advances. 

     

    The reason for the VRAM suckage is that manufacturers have to comply with the Windows Display Driver Model (WDDM).  Nvidia provides software that allows you to disengage WDDM for cards that aren't driving a monitor.

    Yeah, that's what I've heard on some forums. But I'm looking for actual manufacturer references explaining the issue and solution, because I've never seen anything official, only some forum posts. I'm not convinced it's real. Even on the GeForce forums some are saying it's just a misunderstanding of how things work. Not sure who to believe. 

     

  • Hi,

    This thread caught my eye because I'm getting ready to assemble a new W10 render machine with dual 1080 Ti cards.  There do seem to be some tricks to it.  I'm only at the beginning of looking into it. So far I've found NVIDIA's comments on TDR: http://docs.nvidia.com/gameworks/content/developertools/desktop/timeout_detection_recovery.htm It looks to me like I can do the fix on any NVIDIA GPU (a rough sketch of the registry tweak is below).  I may try it out on my existing GTX 1070/W7 machine just for practice. I'm getting that common momentary driver-crash thing now; maybe this will fix it.

    I'm also going to try to run the actual display from the CPU's integrated graphics at the same time as I run the two GPUs for rendering. Evidently this may or may not be possible depending on what options my BIOS makes available.

    I'll know plenty in 2-3 weeks. :-)
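
    For anyone who would rather script it than poke the registry by hand, here's a minimal sketch of the TdrDelay tweak that NVIDIA page describes (the 60-second value is just my example, not an NVIDIA recommendation; run it from an elevated Python prompt and reboot afterwards):

        # Sketch of the TDR tweak from the NVIDIA TDR page linked above (Windows only).
        # TdrDelay = seconds the GPU may stay busy before Windows resets the display
        # driver; the default is 2, which long render iterations can blow past.
        import winreg

        key_path = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 60)

        print("TdrDelay set to 60 seconds; reboot for it to take effect.")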

  • Rottenham said:

    Hi,

    This thread caught my eye because I'm getting ready to assemble a new W10 render machine with dual 1080 Ti cards.  There do seem to be some tricks to it.  I'm only at the beginning of looking into it. So far I've found NVIDIA's comments on TDR: http://docs.nvidia.com/gameworks/content/developertools/desktop/timeout_detection_recovery.htm It looks to me like I can do the fix on any NVIDIA GPU.  I may try it out on my existing GTX 1070/W7 machine just for practice. I'm getting that common momentary driver-crash thing now; maybe this will fix it.

    I'm also going to try to run the actual display from the CPU's integrated graphics at the same time as I run the two GPUs for rendering. Evidently this may or may not be possible depending on what options my BIOS makes available.

    I'll know plenty in 2-3 weeks. :-)

    You misunderstood the issue. The one being talked about is related to memory.

    Your issue with the gfx card and TDR already has a solution: https://www.pugetsystems.com/labs/hpc/Working-around-TDR-in-Windows-for-a-better-GPU-computing-experience-777/

  • laststand6522732 Posts: 866
    edited February 2018

    You misunderstood the issue. The one being talked about is related to memory.

    Thank you. I thought WDDM was a factor in rendering.

     

     

    Post edited by laststand6522732 on
  • Animation, or video in general, has always been a super-costly enterprise if you want quality.  Daz3D along with a single personal computer is definitely in the consumer space.  And that solution, like consumer-grade camcorders, just can't generate output comparable to high-end systems.

    For this reason, I chose to just fuse 3D-rendered scenes into photographs.  "Animation" now is basically just to run simulations.   The final output is always just a single still frame.   This allows me to produce very hi-res stills (30 MP) to match my DSLR.   But oftentimes I will quarter-res that (along with the original photo) for situations where the output will not be used for large prints (e.g. posters).  A 30 MP still can take a long time, but nowhere near the time to do a few seconds of HD/4K.  At 24fps (a holdover, for the most part, from the early days of cinema), you're looking at roughly 48 MP total for 1 second of HD (quick math at the end of this post).  Multiply by 4, of course, for 4K.

    Having said that, I do want to experiment with some animation for broadcast video purposes.  But that will be a few seconds here and there.  And at most in HD.  Shorts, music videos, and anything longer are clearly out.
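
    To put rough numbers on that per-second math (assuming plain 1920x1080 for HD and 3840x2160 for UHD; the 48 MP above is just the round 2-MP-per-frame version):

        # Rough per-second pixel counts at 24 fps.
        fps = 24
        hd_mp = 1920 * 1080 / 1e6      # ~2.07 MP per 1080p frame
        uhd_mp = 3840 * 2160 / 1e6     # ~8.29 MP per 4K UHD frame (4x HD)

        print(f"1 s of HD : {hd_mp * fps:.0f} MP")               # ~50 MP
        print(f"1 s of 4K : {uhd_mp * fps:.0f} MP")              # ~199 MP
        print(f"vs one 30 MP still: {hd_mp * fps / 30:.1f}x the pixels")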

  • UHF Posts: 518
    drzap said:
    UHF said:

    drZap:  That video was not done with Octane...  They did, however, convert their 3D components over to Octane and test it in VR.

    Oh, never mind.  I misunderstood your meaning.  I thought you were showing an Octane render.

    Life of Pi was Octane.

  • kyoto kid Posts: 41,847
    drzap said:
    ebergerly said:
    drzap said:
    ebergerly said:
    drzap said:


    3.  Only Titans, Quadros, and Teslas can turn off that VRAM-consuming bug in Windows that everyone's complaining about, so I can make use of all 12 GB on the card.

    Oh, that's 3 reasons.

    I'm curious about the VRAM consuming bug. I keep looking for real info about it but can't find anything definite. Where did you see the info on it? 

    BTW, wow you're right about the power usage. Looks like a Titan V and a 1080 Ti use almost exactly the same power (350 watts). So yeah, two 1080 Tis will give you an extra 350 watts over a Titan V. Guess they must have made some major efficiency advances. 

     

    The reason for the VRAM suckage is that manufacturers have to comply with the Windows Display Driver Model (WDDM).  Nvidia provides software that allows you to disengage WDDM for cards that aren't driving a monitor.

    ...so what if you are using a 1070 or 1080 Ti just for rendering and not driving a monitor?

  • laststand6522732 Posts: 866
    edited February 2018
    kyoto kid said:

    ...so what if you are using a 1070 or 1080 Ti just for rendering and not driving a monitor?

    I did my best to look into this. The "suckage," which NVIDIA calls incorrect reporting, is detailed here. At this time, nothing can be done about it. WDDM cannot be disabled. As far as I can tell, incorrect reporting will take place regardless of whether or not the GPU is driving a monitor. VRAM and scene size may be issues, as Ivy suggested.

    My own plan is to mitigate the effects of the misreporting using the techniques offered by DustRider and th3Digit.

    Minimizing the GPU load by using the integrated video does seem like a reasonable idea, but the Asus BIOS I'll be using doesn't allow me to configure the primary video source. Shoganai! (Can't be helped.)

    Great thread!

    Post edited by laststand6522732 on
  • So, I fired up my new version of iClone 7 and, lo and behold, the VRAM usage is displayed right in the workspace, in the top left-hand corner, in yellow. So iClone 7 is great for animations; just know that it does need VRAM as well, but you will know right away whether things will work, and the program is completely aimed at animation. Only downside is that you cannot use a 3Dconnexion SpaceMouse for camera work, and this sucks big time.

  • drzap Posts: 795
    edited February 2018
    Rottenham said:
    kyoto kid said:

    ...so what if you are using a 1070 or 1080 Ti just for rendering and not driving a monitor?

    I did my best to look into this. The "suckage," which NVIDIA calls incorrect reporting, is detailed here. At this time, nothing can be done about it. WDDM cannot be disabled. As far as I can tell, incorrect reporting will take place regardless of whether or not the GPU is driving a monitor. VRAM and scene size may be issues, as Ivy suggested.

    My own plan is to mitigate the effects of the misreporting using the techniques offered by DustRider and th3Digit.

    Minimizing the GPU load by using the integrated video does seem like a reasonable idea, but it seems the Asus X299 BIOS I'll be using doesn't allow me to configure the primary video source. Shoganai! (Can't be helped.)

    Great thread!

    Just read that, and that's how it was explained to me: you can't do anything about it if you own a GeForce card.  If you have a Titan or Quadro, you can use the Nvidia SMI utility to disengage WDDM.  The utility is usually installed at C:\Program Files\NVIDIA Corporation\NVSMI 
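
    For reference, a minimal sketch of what that looks like in practice (assuming the default NVSMI install path; the GPU index is just an example, the card can't be driving a monitor, and it has to run from an Administrator prompt — on a GeForce card the second call simply errors out):

        # Sketch: check each card's current driver model, then switch GPU 1 to TCC.
        import subprocess

        SMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

        # Show index, name, and current driver model (WDDM or TCC) for every GPU.
        subprocess.run([SMI, "--query-gpu=index,name,driver_model.current",
                        "--format=csv"], check=True)

        # 0 = WDDM, 1 = TCC; only Titan/Quadro/Tesla cards accept this. Reboot after.
        subprocess.run([SMI, "-i", "1", "-dm", "1"], check=True)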

    Post edited by drzap on
  • drzap said:

    Just read that, and that's how it was explained to me: you can't do anything about it if you own a GeForce card.  If you have a Titan or Quadro, you can use the Nvidia SMI utility to disengage WDDM.  The utility is usually installed at C:\Program Files\NVIDIA Corporation\NVSMI 

    I noticed that the latest iteration of WDDM had made allowance for the driver code itself to take some level of control over WDDM.  Then I noticed a note in the NVIDIA driver release notes suggesting that the latest driver had acquired the ability to take this control.  I was left with the impression that while the issue had not been eliminated, it had been minimized.  I'll see soon enough.  Yes, I read about the Titan's ability to disable WDDM.  The Titan's too rich for me.  I all but went broke ordering two 1080 Tis.  I'll monitor the VRAM usage (a quick way to script that is sketched below), and do whatever is needed to avoid CPU rendering.
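
    In case it helps anyone else, here's a rough sketch of that monitoring — just polling nvidia-smi while the render runs (assumes nvidia-smi is on the PATH; Ctrl+C to stop):

        # Sketch: print per-GPU VRAM usage once a second while a render is running.
        import subprocess, time

        QUERY = ["nvidia-smi", "--query-gpu=index,memory.used,memory.total",
                 "--format=csv,noheader"]

        while True:
            out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
            print(out.stdout.strip())      # e.g. "0, 8231 MiB, 11264 MiB"
            time.sleep(1)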

  • kyoto kid Posts: 41,847
    edited February 2018

    ...so is that for the Titan Xp as well, or just the Titan V?

    At the prices I've seen for 1080 Tis and even some 1070s, a Titan Xp (direct from Nvidia) would certainly be a better buy if WDDM could be turned off.

    Post edited by kyoto kid on
  • laststand6522732 Posts: 866
    edited February 2018
    kyoto kid said:

    ...so is that for the Titan Xp as well, or just the Titan V?

    At the prices I've seen for 1080 Tis and even some 1070s, a Titan Xp (direct from Nvidia) would certainly be a better buy if WDDM could be turned off.

    It appears to be true for all Titans, but I couldn't find a clear read on this. Some articles say "all Titans." Some articles say "certain Titans." The post below, from this site, is interesting. I wonder how much "a bunch of VRAM" is.

    ................................

    Then I found this.  It looks like the WDDM tax on 11 GB will be about 2 GB. It then led me to this, which is the most interesting of all.  If I understand correctly, this is a W10 issue; W7 is unaffected.  Am I reading this right?  Does DirectX 12 cost 2 GB of VRAM? I had no idea this had been going on so long.  I may rethink my OS ideas.
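
    For what it's worth, a quick way to see what a card reports before a render even starts (assumes nvidia-smi is on the PATH) — though keep in mind this is the very reporting NVIDIA calls incorrect, so the WDDM reserve itself may not show up in the "used" column:

        # Sketch: what each card claims is used/free at idle, per nvidia-smi.
        import subprocess

        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total,memory.used,memory.free",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True)

        for line in out.stdout.strip().splitlines():
            name, total, used, free = [f.strip() for f in line.split(",")]
            print(f"{name}: {used} used of {total} at idle ({free} free)")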

     

    Attachment: wddm.jpg
    Post edited by laststand6522732 on
  • laststand6522732 Posts: 866
    edited February 2018

    So, I fired up my new version of iClone 7 and, lo and behold, the VRAM usage is displayed right in the workspace, in the top left-hand corner, in yellow. So iClone 7 is great for animations; just know that it does need VRAM as well, but you will know right away whether things will work, and the program is completely aimed at animation. Only downside is that you cannot use a 3Dconnexion SpaceMouse for camera work, and this sucks big time.

    I evaluated iClone v5 and v6.  There's a lot to like about iClone, but every time I shook hands with Reallusion I had to count my fingers.

    Post edited by laststand6522732 on
  • drzap Posts: 795
    edited February 2018
    Rottenham said:

    So, I fired up my new version of iClone 7 and, lo and behold, the VRAM usage is displayed right in the workspace, in the top left-hand corner, in yellow. So iClone 7 is great for animations; just know that it does need VRAM as well, but you will know right away whether things will work, and the program is completely aimed at animation. Only downside is that you cannot use a 3Dconnexion SpaceMouse for camera work, and this sucks big time.

    I evaluated iClone v5 and v6.  There's a lot to like about iClone, but every time I shook hands with Reallusion I had to count my fingers.

    LOL,  I like the premise of iClone, but I'm not down with their realtime renderer yet.  It's just not good enough for my purposes now.

    Post edited by drzap on
  • wolf359 Posts: 3,931
    edited February 2018

    I have iClone Pro 6.5, with no intention of updating further.

    iClone's original purpose was as a powerful realtime character-motion creation system for creating and retargeting motion to rigs temporarily imported for that purpose, much like Autodesk MotionBuilder.

    In the abstract, I understand fully why Reallusion has decided to develop their own self-sustaining ecosystem of native characters and a content market. However, it is, in my view, becoming prohibitively expensive, and it's not necessary for me to stay current with their core apps, as I have no intention of migrating to their figure & content ecosystem for final output.

    The free Character Creator is a decent start for creating your own custom iClone-native figures by morphing and dressing a base installed figure for use in iClone, but I already have a superior option for custom character creation with Daz Studio & Genesis. And I certainly can't imagine myself ever buying any iClone-native clothing content, as I'm already skilled at modeling my own for Daz Genesis 1, 2, 3, and 8.

    IMHO they really need to merge the functionality of the Character Creator & 3DXchange apps into iClone Pro proper and re-evaluate their pricing.

    The new characters are a huge improvement over the version 5.x realtime avatars, but frankly I don't need them to look any better, as I will only ever use them as proxy animation figures whose motion will ultimately be retargeted to a hi-res Daz figure to be rendered in C4D and perhaps LightWave.

    Post edited by wolf359 on
  • laststand6522732 Posts: 866
    edited February 2018

    ~

    Post edited by laststand6522732 on