Render crashes to CPU


Comments

  • areg5 Posts: 617

    Where do I find the W10 Security & Maintenance log?

  • areg5 Posts: 617
    edited June 2018

    OK. First off, let me explain my setup in even more detail: I run my monitor off of integrated graphics, so all cards are otherwise free. I run EVGA Precision XOC; I don't overclock, I just use it for fan control, and I limit the operating temp to 78 degrees. I ran DDU and uninstalled all video drivers. I attempted installing the latest Nvidia driver, but like I've said, I get a BSOD after a while with that even if I'm not rendering. So I used DDU again to uninstall the driver and reinstalled 391.35 with the clean-install option, because that version has been stable BSOD-wise on my build.

    I took one of the files that has been crashing to CPU and rendered it using each card individually. The 1080 Ti in slot 1 finished to completion in 32 minutes, the 1080 Ti in slot 2 in 36 minutes, and the 980 Ti in slot 3 in 54 minutes.

    Next I tried the scene with all cards. It ran really fast but crashed to CPU in 10 minutes. For the three-card run I had set the 980's slot to x4 rather than Auto in the BIOS, per Gigabyte's user manual; that's what they say to do when the first two slots are occupied by other cards.

    I did see on Tom's Hardware that cards of different generations may not run stably together. Next I'm going to try running just the two 1080 Tis.

     

    Here is my error log from the crash:

     

    Daz Log File

    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.6   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): Kernel [7] failed after 0.005s
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.4   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): Kernel [18] failed after 0.036s
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Kernel [1] failed after 0.016s
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.6   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): an illegal memory access was encountered (while launching CUDA renderer in core_renderer_wf.cpp:807)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.4   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while launching CUDA renderer in core_renderer_wf.cpp:807)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while launching CUDA renderer in core_renderer_wf.cpp:807)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.6   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): Failed to launch renderer
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.4   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): Failed to launch renderer
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.5   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Failed to launch renderer
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): Device failed while rendering
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): Device failed while rendering
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while initializing memory buffer)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while initializing memory buffer)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): Device failed while rendering
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray WARNING - module:category(IRAY:RENDER):   1.2   IRAY   rend warn : All available GPUs failed.
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 Iray INFO - module:category(IRAY:RENDER):   1.2   IRAY   rend info : Falling back to CPU rendering.
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.328 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: CUDA device 2 (GeForce GTX 980 Ti): an illegal memory access was encountered (while initializing memory buffer)
    2018-06-22 10:12:59.344 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.9   IRAY   rend error: CUDA device 1 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.344 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.7   IRAY   rend error: CUDA device 0 (GeForce GTX 1080 Ti): an illegal memory access was encountered (while de-allocating memory)
    2018-06-22 10:12:59.344 WARNING: dzneuraymgr.cpp(307): Iray ERROR - module:category(IRAY:RENDER):   1.2   IRAY   rend error: All workers failed: aborting render
    [the same "an illegal memory access was encountered (while de-allocating memory)" error then repeats for CUDA devices 0, 1, and 2; duplicate lines omitted]

    Security and Maintenance log:
    Problem Event Name:    LiveKernelEvent
    Code:    141
    Parameter 1:    ffffcd8e6c14a010
    Parameter 2:    fffff803d12ffe7c
    Parameter 3:    0
    Parameter 4:    3630
    OS version:    10_0_17134
    Service Pack:    0_0
    Product:    256_1
    OS Version:    10.0.17134.2.0.0.256.48
    Locale ID:    1033

    Extra information about the problem
    Bucket ID:    LKD_0x141_Tdr:6_IMAGE_nvlddmkm.sys_Pascal
    Server information:    e50ea8aa-cd60-4346-95c6-54e681ec4114

     

    Running all 3 cards, my CPU sits at about 78% per Task Manager; with 2 cards it's about 45%. I'm having difficulty running the two 1080 Tis now, so I'm trying turning the power down. It still crashed at 90% power, but when I set the power to 80% on all 3 cards, the render finishes.


    Also noteworthy: I have difficulty finishing a scene with G3 characters unless I turn their resolution down to Base. So that's where I am this morning. I was hoping one of the cards would fail when running solo, but none of them did.
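    For anyone who'd rather script the same per-card power cap than use Precision XOC's slider, here's a rough sketch built around nvidia-smi. The clamp range below is a placeholder, not your card's actual allowed range (query that with `nvidia-smi -q -d POWER`), and setting a limit with `-pl` needs admin rights.

```python
# Rough sketch: capping per-card power draw via nvidia-smi instead of
# Precision XOC. Assumes nvidia-smi is on the PATH and runs with admin
# rights; MIN_W/MAX_W are placeholder values, not queried from the card.

MIN_W, MAX_W = 125, 250  # placeholder clamp range for a 250 W board

def power_limit_cmd(gpu_index, target_watts):
    """Build the 'nvidia-smi -i <index> -pl <watts>' command for one GPU,
    clamping the target so the driver doesn't reject an out-of-range value."""
    watts = max(MIN_W, min(MAX_W, target_watts))
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

# 80% of a 250 W board-power limit, for each of three render cards:
for i in range(3):
    print(" ".join(power_limit_cmd(i, int(250 * 0.80))))
# → nvidia-smi -i 0 -pl 200   (then the same for -i 1 and -i 2)
```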

    Post edited by areg5 on
  • areg5areg5 Posts: 617
    edited June 2018

    New theory: my power supply, although substantial and, per the EVGA power calculator, supposedly enough to run all of my cards, isn't quite up to it. The Cooler Master calculator indicates I would need 1270 watts to run all cards at full power. Since my PSU is 1200 watts, at full power it might hiccup just enough to cause the rendering faults. With all cards tuned down to 80%, they all run to completion and the renders are very fast. The scene I was stuck on, which finished in 32 minutes with 1080 Ti #1, 36 minutes with 1080 Ti #2, and 54 minutes with the 980 Ti, finishes in 12 minutes using all three cards.

    Tuning the power down also tunes down the heat: the 980 Ti runs at 76 degrees, and the 1080 Tis no higher than 73. I might just have my answer. I can only wonder if the driver issues are related as well. When I have the time I'll try updating the driver again; it wasn't giving me rendering issues so much as system-crashing issues, but at some point I'll have to update it, if only to run the upcoming Tomb Raider game.

    Needless to say, when I do my next build, I'm going to get the biggest power supply available. I may want to run 4 cards through it; I would think 1600 watts should do it.
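    The back-of-the-envelope budget behind that theory can be sketched like this. The 250 W GPU figures are NVIDIA's published board powers; the CPU, motherboard, and drive numbers are illustrative guesses, not measurements from this build.

```python
# Rough PSU headroom check. GPU figures are published 250 W board powers;
# the non-GPU numbers are illustrative guesses, not measurements.
gpu_tdp = {"1080 Ti #1": 250, "1080 Ti #2": 250, "980 Ti": 250}
other = {"CPU": 140, "motherboard + RAM": 75, "drives + fans": 60}

def total_draw(power_fraction=1.0):
    """Estimated system draw with GPUs capped at power_fraction of TDP."""
    return sum(w * power_fraction for w in gpu_tdp.values()) + sum(other.values())

PSU_WATTS = 1200
print(total_draw(1.0))                  # 1025.0 — estimated watts at full power
print(total_draw(0.8))                  # 875.0 — estimated watts with an 80% GPU cap
print(total_draw(1.0) <= PSU_WATTS * 0.8)  # False — full power exceeds an 80% safety margin
```

Even with these guesses the full-power estimate sits inside the PSU's rating but outside a conservative safety margin, which is consistent with transient spikes tripping the render while an 80% cap does not.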

    Post edited by areg5 on
  • ebergerly Posts: 3,255
    edited June 2018

    "More is better" seems to be one of the most popular internet tech forum responses to difficult issues, but often it's not the case. 

    My computer has a 1080 Ti running with a 1070, and I have a watt meter continuously measuring the actual power the entire computer draws from the wall outlet. With both GPUs running at 100% I have a tough time getting much past 400 watts. Which means I could add another 1080 Ti and another 1070 (for a total of 2x 1070 and 2x 1080 Ti) and at most it would draw 800 watts.

    I really doubt you're drawing anywhere near 1200 watts unless you're also powering your room air conditioner with that power supply.

    To verify you might want to spend a few $$ on a power meter (I use a Belkin one that's pretty nice, and only like $30). There's nothing like actual numbers to help resolve issues.  

    EDIT: BTW, attached are the actual specs for a 1080 Ti, straight from the horse's mouth (NVIDIA). They say the maximum power usage is 250 watts. Even if you had 3 of them the max would be 750 watts. Their spec for a 1070 says the max is 150, which agrees with my measured 250 + 150 = 400 watts.

    [Attachment: 1080ti.JPG (693 x 496) — NVIDIA spec sheet for the GTX 1080 Ti]
    Post edited by ebergerly on
  • areg5 Posts: 617
    ebergerly said:

    "More is better" seems to be one of the most popular internet tech forum responses to difficult issues, but often it's not the case. 

    My computer has a 1080 Ti running with a 1070, and I have a watt meter continuously measuring the actual power the entire computer draws from the wall outlet. With both GPUs running at 100% I have a tough time getting much past 400 watts. Which means I could add another 1080 Ti and another 1070 (for a total of 2x 1070 and 2x 1080 Ti) and at most it would draw 800 watts.

    I really doubt you're drawing anywhere near 1200 watts unless you're also powering your room air conditioner with that power supply.

    To verify you might want to spend a few $$ on a power meter (I use a Belkin one that's pretty nice, and only like $30). There's nothing like actual numbers to help resolve issues.  

    EDIT: BTW, attached are the actual specs for a 1080 Ti, straight from the horse's mouth (NVIDIA). They say the maximum power usage is 250 watts. Even if you had 3 of them the max would be 750 watts.

    Well, all I can say is that the Cooler Master power calculator takes into account all of the peripherals attached. When I plugged in the numbers with all cards running at 100%, I was over 1200 W. When I plugged in the numbers Precision XOC reports at 80% power (the GPU clocks are a bit slower), it put me just under 1200 W, and it now works. I think the value of a really good power supply is significantly underrated, especially when running multiple cards. I batch render, so I'm not talking about results for a single render. Since I toggled the power to my cards, I've rendered 15 scenes with no difficulties; all three cards are running, and man is it fast.

  • nicstt Posts: 11,715
    ebergerly said:

    Again, start simple. Completely remove the video drivers using DDU. Remove unnecessary components.  

    Don't assume.

    It can be sporadic AND a hardware problem. Maybe it's a certain memory location that's only accessed with certain sized scenes. You have no way of knowing. UNLESS you do as I suggested and test everything, including your GPU VRAM, system RAM, hard drive, etc.  

    Make sure the basics are okay, testing along the way. Let the results guide you; don't be guided by assumptions and hunches. If you're worried about some software getting in the way, uninstall it for now. Can it be related to Avast or Malwarebytes? Of course, it can be related to anything. Heck, not long ago Malwarebytes took over users' computers and grabbed ALL their RAM like a virus. Computers and software are very complicated. I know we love to make everything a simple answer, but often it isn't. So the only way to deal with complexity is to simplify.

    BTW, have you checked the W10 Security & Maintenance log to see what's happened lately to your computer? Data like that can help you figure out what's going on a lot better than guesses. 

     

    +1

    I see it a lot: folks confuse the symptoms with the problem itself (the problem usually causes the symptoms).

  • nicstt Posts: 11,715
    areg5 said:
    ebergerly said:

    "More is better" seems to be one of the most popular internet tech forum responses to difficult issues, but often it's not the case. 

    My computer has a 1080 Ti running with a 1070, and I have a watt meter continuously measuring the actual power the entire computer draws from the wall outlet. With both GPUs running at 100% I have a tough time getting much past 400 watts. Which means I could add another 1080 Ti and another 1070 (for a total of 2x 1070 and 2x 1080 Ti) and at most it would draw 800 watts.

    I really doubt you're drawing anywhere near 1200 watts unless you're also powering your room air conditioner with that power supply.

    To verify you might want to spend a few $$ on a power meter (I use a Belkin one that's pretty nice, and only like $30). There's nothing like actual numbers to help resolve issues.  

    EDIT: BTW, attached are the actual specs for a 1080 Ti, straight from the horse's mouth (NVIDIA). They say the maximum power usage is 250 watts. Even if you had 3 of them the max would be 750 watts.

    Well, all I can say is that the Cooler Master power calculator takes into account all of the peripherals attached. When I plugged in the numbers with all cards running at 100%, I was over 1200 W. When I plugged in the numbers Precision XOC reports at 80% power (the GPU clocks are a bit slower), it put me just under 1200 W, and it now works. I think the value of a really good power supply is significantly underrated, especially when running multiple cards. I batch render, so I'm not talking about results for a single render. Since I toggled the power to my cards, I've rendered 15 scenes with no difficulties; all three cards are running, and man is it fast.

    I would be much more inclined to trust something that measures how much power is actually being drawn from the wall, hence the power meter. Plus, if the two figures turn out to be very similar, you have confirmation, and confirmation is good when turning data into meaningful information.

  • ebergerly Posts: 3,255
    edited June 2018

    I'll leave it to others to speculate why a power supply vendor's calculator might give results far above what you actually observe or what the component vendors' specs say.

    Their calculator tells me my machine should be drawing about 650 watts, when in fact I can barely hit 400 watts on a good day. They assume all my Ryzen cores are operating at max (90% TDP) at the same time my GPUs are at max, and everything else is at max too, and then they add a couple hundred watts for good measure. Ridiculous.

    I'm sure there are some users out there who can get to that usage (rendering while playing a game while working on a database, etc.), but if you're really concerned spend a few bucks on a meter to erase all doubt.    

     

    Post edited by ebergerly on
  • areg5 Posts: 617
    edited June 2018
    nicstt said:
    areg5 said:
    ebergerly said:

    "More is better" seems to be one of the most popular internet tech forum responses to difficult issues, but often it's not the case. 

    My computer has a 1080 Ti running with a 1070, and I have a watt meter continuously measuring the actual power the entire computer draws from the wall outlet. With both GPUs running at 100% I have a tough time getting much past 400 watts. Which means I could add another 1080 Ti and another 1070 (for a total of 2x 1070 and 2x 1080 Ti) and at most it would draw 800 watts.

    I really doubt you're drawing anywhere near 1200 watts unless you're also powering your room air conditioner with that power supply.

    To verify you might want to spend a few $$ on a power meter (I use a Belkin one that's pretty nice, and only like $30). There's nothing like actual numbers to help resolve issues.  

    EDIT: BTW, attached are the actual specs for a 1080 Ti, straight from the horse's mouth (NVIDIA). They say the maximum power usage is 250 watts. Even if you had 3 of them the max would be 750 watts.

    Well, all I can say is that the Cooler Master power calculator takes into account all of the peripherals attached. When I plugged in the numbers with all cards running at 100%, I was over 1200 W. When I plugged in the numbers Precision XOC reports at 80% power (the GPU clocks are a bit slower), it put me just under 1200 W, and it now works. I think the value of a really good power supply is significantly underrated, especially when running multiple cards. I batch render, so I'm not talking about results for a single render. Since I toggled the power to my cards, I've rendered 15 scenes with no difficulties; all three cards are running, and man is it fast.

    I would be much more inclined to trust something that measures how much power is actually being drawn from the wall, hence the power meter. Plus, if the two figures turn out to be very similar, you have confirmation, and confirmation is good when turning data into meaningful information.

    I'm not knocking a power meter; I just don't have one at the moment. It would be a good investment. I'm just saying I had a problem running 3 cards, even though each card checked out fine when used alone. I followed all of the advice given in this forum: I checked all of the logs, I checked the W10 maintenance logs, I tried toggling the BIOS, I tried numerous system restarts, I tried different drivers, I tried turning off anti-virus, I tried other versions of Daz Studio, and I tried running only 2 cards. The only thing that worked for me was limiting the power draw to each card individually. If you look up the specs, each card is rated by Nvidia as needing 250 W. Then you factor in the needs of the motherboard, RAM, processor, fans, DVD drive, and external hard drives. Is it possible that in a complex render the power need can spike beyond what the power supply can produce? In my opinion, it is. Is it possible that at the moment of a crash to CPU your meter might not catch it, or that the spike is transient enough that if you weren't watching you'd miss it? Maybe. I don't know what kind of log these meters keep, or whether you can get a power-use history from them.

    All I'm saying is that my problem seems fixed by lowering the power delivered to the cards. And EVGA Precision XOC is a free app that works with many different brands of cards.
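    On the power-use-history question: a minimal sketch of polling the driver's own readout with nvidia-smi rather than a wall meter. It reports board power only (no CPU, drives, or fans) and can miss spikes shorter than the sampling interval, but it does give you a log. Assumes nvidia-smi is on the PATH.

```python
# Sketch: sampling per-GPU power draw from nvidia-smi's CSV output.
# Board power only; transients between samples are missed, and
# nvidia-smi must be on the PATH for sample() to work.
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=index,power.draw", "--format=csv,noheader"]

def parse_power_line(line):
    """Parse one 'index, power.draw' CSV line, e.g. '0, 198.42 W'."""
    index_s, power_s = [field.strip() for field in line.split(",")]
    return int(index_s), float(power_s.rstrip(" W"))

def sample():
    """One snapshot of (gpu_index, watts) for every card."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    return [parse_power_line(l) for l in out.strip().splitlines()]

print(parse_power_line("1, 212.87 W"))  # → (1, 212.87)
```

Calling sample() in a loop and appending each snapshot with a timestamp to a CSV file would give the power-use history the meters may or may not keep.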

    Post edited by areg5 on