Iray Starter Scene: Post Your Benchmarks!


Comments

  • ScarletX1969 Posts: 587
    edited December 1969

    This is what I do for a living, and so do Dumorian and SimonWM, among others (and Spooky, who knows all ;) ). $600+ worth of graphics cards is a very legitimate (and tax-deductible) expense for a full-time DAZ Published Artist, or anyone else who has 3D as their only job.

    Yeah, I thought so. Most of us who are enthusiasts probably do not have the power you guys possess (or the desire to plop down $500+ on a video card when you're only a hobbyist).

    That being said, you can't argue with the investment or the results.

    I actually could afford a $200 - $250 card, I'm just cheap. LOL!

  • ZarconDeeGrissom Posts: 5,412
    edited March 2015

    SimonWM said:
    You can find the time your render took by going to the log. It will tell you in minutes and seconds.

    lol, except when it does this :ohh:
    Total Rendering Time: 1 days -47.-96 seconds

    and it does not log the amount of convergence.
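    For anyone who gets tired of digging those numbers out by hand, here is a rough sketch of a script that pulls the totals and the per-device stats out of the log (Python; the log path is only an example, and it assumes the stats sit on one line the way most of the logs quoted in this thread do):

    import re

    # Example path only -- point this at wherever your DAZ Studio log actually lives.
    LOG_PATH = r"C:\Users\me\AppData\Roaming\DAZ 3D\Studio4\log.txt"

    # Patterns based on the log lines quoted in this thread.
    total_re = re.compile(r"Total Rendering Time: (.+)")
    device_re = re.compile(r"(CUDA device \d+ \([^)]+\)|CPU \(\d+ threads?\)): "
                           r"(\d+) iterations, ([\d.]+)s init, ([\d.]+)s render")

    with open(LOG_PATH, errors="ignore") as log:
        for line in log:
            m = device_re.search(line)
            if m:
                name, iters, init, render = m.groups()
                print("%-32s %5s iterations, %ss init, %ss render" % (name, iters, init, render))
            m = total_re.search(line)
            if m:
                print("Total: %s" % m.group(1))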
    Post edited by ZarconDeeGrissom on
  • mjc1016 Posts: 15,001
    edited December 1969

    dsexton72 said:
    This is what I do for a living, and so do Dumorian and SimonWM, among others (and Spooky, who knows all ;) ). $600+ worth of graphics cards is a very legitimate (and tax-deductible) expense for a full-time DAZ Published Artist, or anyone else who has 3D as their only job.

    Yeah, I thought so. Most of us who are enthusiasts probably do not have the power you guys possess (or the desire to plop down $500+ on a video card when you're only a hobbyist).

    That being said, you can't argue with the investment or the results.

    I actually could afford a $200 - $250 card, I'm just cheap. LOL!

    There's another group that would think nothing of shelling out that much for a video card...gamers.

  • namffuak Posts: 4,031
    edited December 1969

    mjc1016 said:
    dsexton72 said:
    This is what I do for a living, and so do Dumorian and SimonWM, among others (and Spooky, who knows all ;) ). $600+ worth of graphics cards is a very legitimate (and tax-deductible) expense for a full-time DAZ Published Artist, or anyone else who has 3D as their only job.

    Yeah, I thought so. Most of us who are enthusiasts probably do not have the power you guys possess (or the desire to plop down $500+ on a video card when you're only a hobbyist).

    That being said, you can't argue with the investment or the results.

    I actually could afford a $200 - $250 card, I'm just cheap. LOL!

    There's another group that would think nothing of shelling out that much for a video card...gamers.

    And there's also the odd handful of us with more money than time or sense (or is that a handful of us odd types?) who wouldn't object to upgrading the video. Actually, I'm holding off, pending the DAZ offering Spooky mentioned in one of these Iray threads - and there's always the chance I'd be able to pay for such a card with store credit that resulted from a discounted gift card . . .

  • ZarconDeeGrissom Posts: 5,412
    edited December 1969

    namffuak said:
    mjc1016 said:
    dsexton72 said:
    This is what I do for a living, and so do Dumorian and SimonWM, among others (and Spooky, who knows all ;) ). $600+ worth of graphics cards is a very legitimate (and tax-deductible) expense for a full-time DAZ Published Artist, or anyone else who has 3D as their only job.

    Yeah, I thought so. Most of us who are enthusiasts probably do not have the power you guys possess (or the desire to plop down $500+ on a video card when you're only a hobbyist).

    That being said, you can't argue with the investment or the results.

    I actually could afford a $200 - $250 card, I'm just cheap. LOL!


    There's another group that would think nothing of shelling out that much for a video card...gamers.
    And there's also the odd handful of us with more money than time or sense (or is that a handful of us odd types?) who wouldn't object to upgrading the video. Actually, I'm holding off, pending the DAZ offering Spooky mentioned in one of these Iray threads - and there's always the chance I'd be able to pay for such a card with store credit that resulted from a discounted gift card . . .
    Agreed. I just need to replace the display card now (still counting days, lol), for many OpenGL related reasons. The render card will wait.
    (my 8600gt vs what I want)
    8600GT_Vs_Gt730_005topSelects_102_crop2.png
  • ScarletX1969 Posts: 587
    edited December 1969

    namffuak said:
    And there's also the odd handful of us with more money than time or sense (or is that a handful of us odd types?) who wouldn't object to upgrading the video. Actually, I'm holding off, pending the DAZ offering Spooky mentioned in one of these Iray threads - and there's always the chance I'd be able to pay for such a card with store credit that resulted from a discounted gift card . . .

    What DAZ offering is this?

  • ScarletX1969 Posts: 587
    edited December 1969

    Agreed. I just need to replace the display card now (still counting days, lol), for many OpenGL related reasons. The render card will wait.
    (my 8600gt vs what I want)

    Those GT 730 cards are actually kinda cheap on Amazon.com. Anyone do a benchmark with those or can point me to one?

  • ZarconDeeGrissom Posts: 5,412
    edited March 2015

    dsexton72 said:
    Agreed. I just need to replace the display card now (still counting days, lol), for many OpenGL related reasons. The render card will wait.
    (my 8600gt vs what I want)

    Those GT 730 cards are actually kinda cheap on Amazon.com. Anyone do a benchmark with those or can point me to one?

    E&J has a more crippled one (96 cores; I'm afraid to ask, or be a bother); me, not yet. I'm still counting days.
    There are some OpenGL benchmarks around with the 1GB and 2GB cards, I was not able to find any on the 4GB card.
    http://www.daz3d.com/forums/viewreply/786546/

    Post edited by ZarconDeeGrissom on
  • UHF Posts: 512
    edited December 1969

    dsexton72, as others have said, enthusiasm or profession is the driving force behind all the god-cards (I call most of them Watt-hogs, lol).

    I built my system for CAD & DAW work over a decade ago (2005), and upgraded as necessary. I do have other constraints, so my system is not sporting a Watt-hog card. My interest for the time being is in fanless cards using less than fifty watts. There is a big difference between recording music in a studio, and attempting similar on the flight-deck of a carrier during flight-ops, lol.
    (I was on the ship on the left; trust me, they get a tad noisy when launching planes, lol)

    It would be nice to see some benchmarks from less capable cards. I myself will probably be doing a 384 core GT730 eventually. There is a more lobotomized variant of the 730 out there that I'm not interested in purchasing, tho seeing the score would be at least beneficial to Dumor3D (the Dumor3D CUDA core theory), and myself.

    As for VCA setups, I think it would chew through this bench rather quickly, network bandwidth and latency probably being the biggest part of any degradation in performance. How many K6000 cards per VCA, [goes looking it up], lol.


    I use Asus STRIX for a reason... 0dB.
    http://www.asus.com/ca-en/Graphics_Cards/STRIXGTX970DC2OC4GD5/

    Basically, if I'm running simple software like audio production or a video game like Crysis 3, I'll never hear it. When I render and burn 350 Watts in my PC, I can barely hear them as the fans kick in (and not at a high speed or anything) to keep the temperatures below 70. At full speed, the fans on the STRIX are very loud, but they will also cool the GPU to 40C while rendering at full tilt.

    My biggest concern is keeping CPU fan noise down. I hear those when I'm doing a lot on my PC. Mastering your CPU overclocking/fan control software will go a long way. I'm an Asus man, so AI Suite 3, and GPU Tweak. My PC is overclocked to 5 GHz, and I leave the fans on Turbo (can be loud if the CPU loading is heavy).

    By the way... underclocking an expensive PC will also get you a lot of quiet.

    I have a friend who keeps his PC in the basement, and just runs enough cables down to it to use it.

  • Arnold C Posts: 740
    edited March 2015

    Windows 7, Build 7601, Service Pack 1, 6 GB RAM.
    AMD Athlon(tm) 64 X2 Dual Core Processor 4600+ 2.40 GHz
    MSI GeForce GT 730 2GB DDR3 [N730K-2GD3/OC]
    (Yeah, that's the actively cooled GT 730 with the 128-bit memory interface, which only has 96 CUDA cores.)
    It has already been replaced by a

    Gainward GeForce GT 730 2048MB SilentFX [426018336-3224]
    (passively cooled, 64-bit memory interface, 384 CUDA cores)

    once my favourite salesman became aware of his mistake [I told him :-)].

    The stats are for the MSI GeForce GT 730; I'll retry the render with the Gainward GeForce GT 730 SilentFX once I find the time.
    A first test with the "Material Ball" scene showed that the Gainward is around 25% faster:

    Total Rendering Time: 59 minutes 14.19 seconds

    Unfortunately the render was completed by reaching the maximum number of samples, at between 85.0 and 90.0% convergence.
    I added it for comparison. Seems pretty close to the final.
    From another test render it seems it might be enough to let it render to a Rendering Converged Ratio of 88.0% instead of 95.0%. I guess it depends on the scenery and lighting used. There aren't any noticeable differences, but the 88% one took only about half the time to complete ("13 minutes 56.57 seconds" instead of "23 minutes 56.85 seconds").
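    Working that out from the two quoted times: 23 minutes 56.85 seconds is about 1436.9 s and 13 minutes 56.57 seconds is about 836.6 s, so the 88.0% render finished in roughly 58% of the time of the 95.0% one.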

    Log Entries:
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : Maximum number of samples reached.
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : Device statistics:
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GT 730):
    4672 iterations, 62.486s init, 3484.255s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (1 threads):
    328 iterations, 49.793s init, 3499.841s render

    IrayTestScene.jpg
    Post edited by Arnold C on
  • Arnold C Posts: 740
    edited December 1969

    SimonWM said:
    OptiX Prime Acceleration seems to make a BIG DIFFERENCE!!! New benchmark:

    CPU + Both GPUs + OptiX Acceleration = 3 minutes 6.98 seconds

    It pushed my benchmark UP by over 1 minute!!! That's crazy!!! All my benchmarks have been full scene to completion.

    You could try and set "Instancing Optimization" from "Memory" to "Speed" (in the Render Settings Tab under "Optimization").
    In combination with "OptiX Prime Acceleration" turned on, it's the fastest setting, at least on my poor, weak rig.

  • Richard Haseltine Posts: 95,997
    edited March 2015

    For those wanting to see the results for less powerful GPUs.

    Intel i7 920, 2.66GHz
    12 GB RAM
    nVidia GTS 450 1GB
    Windows 7 64

    CPU

    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (8 threads): 5000 iterations, 25.058s init, 3726.222s render
    Total Rendering Time: 1 hours 2 minutes 33.57 seconds

    CPU + GPU
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTS 450): 3480 iterations, 27.550s init, 1362.842s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (7 threads): 1520 iterations, 24.745s init, 1365.496s render
    Total Rendering Time: 23 minutes 13.76 seconds

    CPU + GPU + OptiX
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTS 450): 3770 iterations, 26.278s init, 1059.576s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (7 threads): 1230 iterations, 24.262s init, 1061.878s render
    Total Rendering Time: 18 minutes 8.30 seconds
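    Converting those totals to seconds for easier comparison: 3753.6 s CPU only, 1393.8 s with the GTS 450 helping (about 2.7x faster), and 1088.3 s with OptiX on top (about 3.4x faster than the CPU alone, and roughly a 22% cut over plain CPU + GPU).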

    Post edited by Richard Haseltine on
  • namffuak Posts: 4,031
    edited December 1969

    dsexton72 said:
    namffuak said:
    And there's also the odd handful of us with more money than time or sense (or is that a handful of us odd types?) who wouldn't object to upgrading the video. Actually, I'm holding off, pending the DAZ offering Spooky mentioned in one of these Iray threads - and there's always the chance I'd be able to pay for such a card with store credit that resulted from a discounted gift card . . .

    What DAZ offering is this?

    Sorry, not Spooky; it was DAZ_Jon here

  • kqbjnp66vk@snkmail.com Posts: 24
    edited December 1969

    dminut said:

    EDIT 2:
    Went ahead and ran the full scene (sphere 8 and 9 visible) on the 980 by itself just for completion as it might make it easier to compare scores against other cards.

    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 980): 5000 iterations, 19.783s init, 305.965s render
    Finished Rendering
    Total Rendering Time: 5 minutes 27.33 seconds

    Ok. Same machine, "new" card. I'd been borrowing my 980 out of the gaming machine while I looked around, and I finally bought another card to replace it. Honestly, I was pretty close to getting a Titan X, but there was some shred of sanity left in me, it seems, and I held off. Decided to go powerful, but lower priced, with an eye towards nVidia's next gen card in 2016. I was just about to pull the trigger on a 970 (hoping its 3.5 + .5 GB memory wouldn't bite me), when I noticed a refurbished 780 with 6 GiB of RAM. As far as I can tell, only EVGA made one of those and there weren't a lot of them. At around $100 more than a 970, I went ahead and grabbed it :)

    Here's the benchies:

    Full scene (8 and 9 enabled) GPU only OptiX Prime Acceleration disabled (to compare against 980 above):
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 780): 5000 iterations, 21.017s init, 333.390s render

    Total Rendering Time: 5 minutes 56.12 seconds

    Not bad for a non-overclocked card, really.

    Same, but with OptiX Prime enabled:
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 780): 5000 iterations, 20.742s init, 224.317s render

    Total Rendering Time: 4 minutes 6.66 seconds

  • LITTLEFisky Posts: 22
    edited December 1969

    Please, can anybody help me? I've installed D|S 4.8 and tried to render something via NVIDIA Iray. A simple scene took about 2 hours of rendering. In the log file it says that I'm using only the CPU.

    There is a part from log:

    Iray INFO - module:category(IRAY:RENDER): 1.1 IRAY rend info : NVIDIA display driver version: 34043
    Iray INFO - module:category(IRAY:RENDER): 1.1 IRAY rend info : Your NVIDIA driver supports CUDA version up to 6050; iray requires CUDA version 6050; all is good.
    Iray INFO - module:category(IRAY:RENDER): 1.1 IRAY rend info : Using iray plugin version 4.0, build 231000.7639 n, 21 Feb 2015, nt-x86-64-vc11.
    Iray INFO - module:category(IRAY:RENDER): 1.1 IRAY rend info : CUDA device 0: "GeForce 840M" (compute capability 5.0, 2048MB total, 1959MB available)
    Iray INFO - module:category(BLEND:RENDER): 1.1 BLEND rend info : blend render (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(IRAY_CLOUD_CLIENT:NETWORK): 1.1 IRAY_C net info : iray_cloud (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(IRT:RENDER): 1.0 IRT rend info : Initializing context.
    Iray INFO - module:category(IRT:RENDER): 1.1 IRT rend info : irt (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(EIAXF:IO): 1.1 EIAXF io info : axf importer (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(ICB:IO): 1.1 ICB io info : cb_importer (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(IRAY_CLOUD_CLIENT:NETWORK): 1.1 IRAY_C net info : iray_bridge_snapshot (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(IRAY_CLOUD_SERVER:NETWORK): 1.1 IRAY_C net info : iray_bridge_server (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(EEMI:IO): 1.1 EEMI io info : .mi exporter (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(EIMI:IO): 1.1 EIMI io info : .mi importer (build 231000.7639, 21 Feb 2015) initialized
    Iray INFO - module:category(IRT:RENDER): 1.6 IRT rend info : Host 1 connected to the cluster.
    NVidia Iray GPUs:
    GPU: 1 - GeForce 840M
    Memory Size: 1.9 GB
    Clock Rate: 1124000 KH
    Multi Processor Count: 3
    Cuda Version: 5.0
    WARNING: dzneuraymgr.cpp(256): Iray WARNING - module:category(IRAY:RENDER): 1.0 IRAY rend warn : There is no GPU available to the iray renderer.
    Iray INFO - module:category(IRT:RENDER): 1.0 IRT rend info : Resource assignment for host 0 has changed.

    Please help :(

  • SickleYield Posts: 7,622
    edited December 1969

    0127 said:
    Please, can anybody help me? I've installed D|S 4.8 and tried to render something via NVIDIA Iray. A simple scene took about 2 hours of rendering. In the log file it says that I'm using only the CPU.

    There is a part from log:

    [...]
    WARNING: dzneuraymgr.cpp(256): Iray WARNING - module:category(IRAY:RENDER): 1.0 IRAY rend warn : There is no GPU available to the iray renderer.
    Iray INFO - module:category(IRT:RENDER): 1.0 IRT rend info : Resource assignment for host 0 has changed.

    Please help :(

    Two hours is not an unusual time for a regular scene at 1000x1300 or larger on a quad core CPU. Turning on OptiX acceleration in your advanced render settings can help a bit.

    It says your graphics card only has 1.9 gb VRAM, so there aren't many renders it will be able to handle without running out of memory and switching over to the CPU (possibly none). Since that's a laptop GPU, though, and it's saying "there is no GPU available to the Iray renderer," it's possible it's also a driver problem. Some people have had to use different Nvidia drivers in order to get Iray to use a GPU on a laptop (I don't know what version or where to get them, you'd have to google it).

  • LITTLEFisky Posts: 22
    edited December 1969

    0127 said:
    Please, can anybody help me? I've installed D|S 4.8 and tried to render something via NVIDIA Iray. A simple scene took about 2 hours of rendering. In the log file it says that I'm using only the CPU.

    There is a part from log:

    [...]
    WARNING: dzneuraymgr.cpp(256): Iray WARNING - module:category(IRAY:RENDER): 1.0 IRAY rend warn : There is no GPU available to the iray renderer.
    Iray INFO - module:category(IRT:RENDER): 1.0 IRT rend info : Resource assignment for host 0 has changed.

    Please help :(

    Two hours is not an unusual time for a regular scene at 1000x1300 or larger on a quad core CPU. Turning on OptiX acceleration in your advanced render settings can help a bit.

    It says your graphics card only has 1.9 gb VRAM, so there aren't many renders it will be able to handle without running out of memory and switching over to the CPU (possibly none). Since that's a laptop GPU, though, and it's saying "there is no GPU available to the Iray renderer," it's possible it's also a driver problem. Some people have had to use different Nvidia drivers in order to get Iray to use a GPU on a laptop (I don't know what version or where to get them, you'd have to google it).

    Thank you so much! GeForce 840M was unchecked xD

    Сохраненное_изображение_2015-3-27_23-4-42.282_.jpg
  • SickleYield Posts: 7,622
    edited December 1969

    *Dies*

    ...Or it could just be that.

  • DAZ_Spooky Posts: 3,100
    edited March 2015

    Due to testing, by default any video card with less than 4GB of RAM is unchecked.

    That is not to say you cannot get use out of it, provided you are careful, but with 2GB there are more likely to be driver conflicts which cause crashes if anything else is also using the video card. (This is especially true on the Mac, which forces an automatic reboot on a driver crash.)

    Post edited by DAZ_Spooky on
  • Arnold C Posts: 740
    edited December 1969

    Found the time to let it render the scene. So here are the benchmarks for the GeForce GT 730.

    GPU only, since my slow Athlon creates a hefty bottleneck trying to help out rendering. Which doesn't help much at all. :-)

    For my renders I set "Instancing Optimization" to Speed and use "OptiX Prime Acceleration", since that's the fastest setup.

    Windows 7 64 Ultimate
    6GB DDR2 RAM
    AMD Athlon 64 X2 Dual Core Processor 4600+ 2.40 GHz
    Gainward GeForce GT 730 2048MB SilentFX [Barcode 426018336-3224]

    (Instancing Optimization: Speed / Optix Prime Acceleration: On / GPU Only)

    Full scene:
    CUDA device 0 (GeForce GT 730): 8646 iterations, 52.315s init, 3617.769s render
    Total Rendering Time: 1 hours 1 minutes 13.89 seconds

    Spheres 8 & 9 disabled:
    CUDA device 0 (GeForce GT 730): 6328 iterations, 50.641s init, 2484.912s render
    Total Rendering Time: 42 minutes 19.25 seconds

    IMO the GT 730 is okay for rendering simple scenes in a reasonable amount of time, but not much more.

    In my area the next best choice would be a GTX 760 (4GB GDDR5, 1152 CUDA Cores) at around 220 €.
    (Less expensive cards lack either the preferable 4GB of RAM and/or the number of processor cores.)
    But I guess then you could even go for a GTX 960 "Phantom" or "Jetstream" (4GB GDDR5, 1024 CUDA Cores) for around 240 €.

  • ScarletX1969 Posts: 587
    edited March 2015

    dsexton72 said:
    Agreed. I just need to replace the display card now (still counting days, lol), for many OpenGL related reasons. The render card will wait.
    (my 8600gt vs what I want)

    Those GT 730 cards are actually kinda cheap on Amazon.com. Anyone do a benchmark with those or can point me to one?

    E&J has a more crippled one (96 cores; I'm afraid to ask, or be a bother); me, not yet. I'm still counting days.
    There are some OpenGL benchmarks around with the 1GB and 2GB cards, I was not able to find any on the 4GB card.
    http://www.daz3d.com/forums/viewreply/786546/

    So, I did some posting. Zarcondeegrissom, I bought a GT 740 Super Clocked video card from Amazon and got it yesterday. When I did this test originally using my GT430 card with 2GB of DDR3 RAM, it took over 2 hours. Was thinking it was the card after reading thru these posts. Installed the new card and it still took over 2 hours, plus I got the NVIDIA OpenGL Error 3 message.

    Before going into panic mode, I decided to stop some processes and kill my screen saver. Let the render run over night.

    Here are my results. Don't laugh at my rig...lol.

    Computer: Dell XPS 410
    OS: Windows 7 Ultimate
    RAM: 8GB
    Processor: Core 2 Quad Q6600 2.40 GHZ
    Video Card: NVIDIA Geforce GT 740 SC (super clocked)

    CPU (3 threads): 1011 iterations
    CUDA device 0 (GeForce GT 740): 3989 iterations

    OptiX Prime Acceleration enabled
    Render time (including spheres 8 and 9): 35 minutes, 4.9 seconds

    Definitely an improvement. Now I'm thinking the GT430 would've come in at an hour. I have an i5 that I use for my renderfarm. Since it supports two video cards, thinking about switching the two and using both the GT430 and GT740 side by side with CPU.

    test_render_of_IRAY.jpg
    Post edited by ScarletX1969 on
  • darkhound1_2c7433f604
    edited March 2015

    Here is some data from my PC with an outdated GPU. Kept it from my previous PC.
    Win 7 64bit
    i5-3570K @4.1GHz
    GTX560 Ti 1GB OCed
    16GB DDR3 1600 MHz RAM

    Complete Scene to 90%
    GPU only - 13min 40sec
    CPU & GPU - 11min 55sec
    CPU only - 41 min

    Monitor connected to internal GPU.

    From what I have seen here, the results are not consistent with other posts. My GPU seems to be a lot faster than it should.
    And my CPU is too slow. Very strange.

    Post edited by darkhound1_2c7433f604 on
  • DAZ_Spooky Posts: 3,100
    edited December 1969

    Here is some data from my PC with an outdated GPU. Kept it from my previous PC.
    Win 7 64bit
    i5-3570K @4.1GHz
    GTX560 Ti 1GB OCed
    16GB DDR3 1600 MHz RAM

    Complete Scene to 90%
    GPU only - 13min 40sec
    CPU & GPU - 11min 55sec
    CPU only - 41 min

    Monitor connected to internal GPU.

    From what I have seen here, the results are not consistent with other posts. My GPU seems to be a lot faster than it should.
    And my CPU is too slow. Very strange.

    Your CPU is an i5, not sure how many cores without looking it up :) but with anything but an i7, the number of threads available is just the number of cores, so for rendering, all other things being equal, the i5 will render at roughly half the speed of an i7.

    The 500 series are not bad cards, they just don't have enough RAM for most scenes, especially if they are also running monitors.

  • DustRider Posts: 2,691
    edited December 1969

    Thought I'd try this while setting up the new/replacement laptop. I'm quite satisfied with the performance so far, especially for a laptop. Wish I could have gotten the GTX 980M, but it would have been $300 more :bug:

    Specs:
    Model: ProStar 177SM-A
    i7 4710MQ 2.5 GHz (4 cores)
    32 GB RAM (DDR3 1600)
    GTX 970M 6 GB

    GPU = 6 min 27 seconds
    CPU = 33 min 55 seconds (only went to 90% with this one)
    CPU + GPU = 6 min 43 seconds

  • nicstt Posts: 11,714
    edited December 1969

    Someone was curious how a 970 compared to a 980, and as I was curious myself, I thought I'd run some benchmarks. Cost-wise, it seems the 970 offers the best value; I'm curious about the 960 with 4GB though, and I was initially going for that but considered the extra CUDA cores worth the extra £70. The full 4GB of memory is available on the card; I'm curious, though, how the issues that NVidia reported affect performance on the 970 and 960.

    The scene was allowed to complete in all instances. What is amazing is how quickly they all get to about 75 percent.

    Strix 970 - I like quiet.
    GT 640 2GB GDDR 3 - passively cooled
    CPU i7 4770K (3.9 GHz) - water cooled (I don't run it over-clocked as the increase in performance isn't worth the reduced life for me. I do really, really like quiet. :D )
    16GB memory

    A Titan X or 980Ti is beginning to look tempting though.

    Scene
    Spheres: All

    GPU ONLY - Optix for gpu only
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 5000 iterations, 14.304s init, 376.238s render
    Total Rendering Time: 6 minutes 31.89 seconds

    GPU ONLY - Optix for gpu and cpu only
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 5000 iterations, 14.090s init, 376.400s render
    Total Rendering Time: 6 minutes 31.72 seconds

    GPU ONLY - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 5000 iterations, 13.926s init, 376.267s render
    Total Rendering Time: 6 minutes 31.47 seconds

    GPU x2 & CPU and all Optix
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 1 (GeForce GT 640): 347 iterations, 15.194s init, 342.989s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 4041 iterations, 14.534s init, 344.041s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (6 threads): 612 iterations, 13.318s init, 344.842s render
    Total Rendering Time: 5 minutes 59.86 seconds

    GPU x2 & CPU - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 1 (GeForce GT 640): 332 iterations, 14.494s init, 335.300s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 4071 iterations, 14.938s init, 335.453s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (6 threads): 597 iterations, 13.321s init, 336.851s render
    Total Rendering Time: 5 minutes 51.65 seconds
    (Second time)
    Total Rendering Time: 5 minutes 20.24 seconds
    (Third time)
    Total Rendering Time: 5 minutes 24.32 seconds

    GPU & CPU - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 4294 iterations, 14.548s init, 319.011s render
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (7 threads): 706 iterations, 13.300s init, 320.748s render
    Total Rendering Time: 5 minutes 35.28 seconds
    (Second time)
    Total Rendering Time: 5 minutes 45.86 seconds
    (Third time)
    Total Rendering Time: 5 minutes 31.55 seconds

    Scene
    Spheres: 8 & 9 deleted

    GPU Only (970) - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 0 (GeForce GTX 970): 5000 iterations, 14.193s init, 341.497s render
    Total Rendering Time: 5 minutes 56.92 seconds

    GPU Only (640) - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CUDA device 1 (GeForce GT 640): 5000 iterations, 14.007s init, 3096.679s render
    Total Rendering Time: 51 minutes 51.91 seconds

    CPU Only - Optix - not enabled
    Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU (8 threads): 5000 iterations, 13.485s init, 1879.276s render
    Total Rendering Time: 31 minutes 33.96 seconds

    I watched SickleYield's video, and she reported that some say there is a speed-up and others report a slow-down when adding the CPU; that may relate to the number of threads the CPU has available. Note mine uses 6 when working with both GPUs and all 8 when rendering CPU only. A CPU with fewer threads will switch processes more, and the overhead from that may be what outweighs any gains?

    I was further curious when testing the difference between both cards and CPU and just the 970 and CPU; so much so that I reran the tests after I obtained the first results.
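    A rough way to see how much each device actually contributes is iterations per second. Just a quick sketch (Python), using the numbers copied from the "GPU x2 & CPU and all Optix" log above and ignoring the slightly different init times:

    # Iterations and render seconds copied from the "GPU x2 & CPU and all Optix" run above.
    devices = {
        "GeForce GT 640": (347, 342.989),
        "GeForce GTX 970": (4041, 344.041),
        "CPU (6 threads)": (612, 344.842),
    }

    for name, (iterations, seconds) in devices.items():
        print("%-18s %6.2f iterations/second" % (name, iterations / seconds))

    # Roughly: GTX 970 ~11.7 it/s, CPU ~1.8 it/s, GT 640 ~1.0 it/s,
    # so the 970 did about 80% of the 5000 iterations in that run.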

  • ZarconDeeGrissom Posts: 5,412
    edited December 1969

    Thanks for the test dsexton72. I have had this new card in the computer for about a week, and am dreading pushing it. First off tho.

    GT430 and GT740 side by side with CPU.

    There are a few CUDA core versions that don't play with other versions. I don't recall what the GT430 has, tho the GT730 is "CUDA 3.5", and is one of the ones that only work with other cards of its kind. Unless there has been a fix for that, that I am unaware of.
    http://www.daz3d.com/forums/viewreply/785620/
    Note:
    The following device configurations are not supported:
    Mixing Fermi- and Kepler-based devices
    Combining Compute Capability 3.0 and Compute Capability 3.5 devices
    Switching at runtime between different GPU generations, for example, switching from a Compute Capability 3.0 to a Compute Capability 2.0 device.
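    If anyone wants to check what compute capability the cards in their box actually report before mixing them, here's a rough sketch that asks the CUDA driver directly (Python via ctypes; just a sketch that assumes an NVIDIA driver is installed, using the standard driver-API calls cuInit, cuDeviceGet, cuDeviceGetName and cuDeviceComputeCapability):

    import ctypes
    import sys

    # Load the CUDA driver library (the name differs per OS).
    libname = "nvcuda.dll" if sys.platform == "win32" else "libcuda.so.1"
    cuda = ctypes.CDLL(libname)

    def check(status):
        # Driver API calls return 0 (CUDA_SUCCESS) when they work.
        if status != 0:
            raise RuntimeError("CUDA driver call failed with status %d" % status)

    check(cuda.cuInit(0))

    count = ctypes.c_int()
    check(cuda.cuDeviceGetCount(ctypes.byref(count)))

    for ordinal in range(count.value):
        dev = ctypes.c_int()
        check(cuda.cuDeviceGet(ctypes.byref(dev), ordinal))

        name = ctypes.create_string_buffer(100)
        check(cuda.cuDeviceGetName(name, len(name), dev))

        major, minor = ctypes.c_int(), ctypes.c_int()
        check(cuda.cuDeviceComputeCapability(ctypes.byref(major), ctypes.byref(minor), dev))

        print("GPU %d: %s (compute capability %d.%d)"
              % (ordinal, name.value.decode(), major.value, minor.value))

    If I have the architectures right, a Fermi card like the GT430 should come back as 2.x and the Kepler GT 640/730/740 as 3.x, which is exactly the kind of mix the note above warns about.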

    My GT730 issue: it's making disturbing sounds. Whenever I do something in the view-field, the workload of the GPU goes up, and sounds come out of my speakers (clicks, buzzes, etc). I once again threw an oscilloscope into the computer to watch the power, and it looks good where I can get the probes. So I can only guess I got a bad card, or it pulls WAY MORE than 23 watts when under load. So I can only guess that when the card is stressed with OpenGL work, it is causing really nasty power fluctuations on the rail the sound-card also happens to be on.

    Oddly, this only happens in Studio in the view-field. It does not happen with web pages, file-manager, etc. :coolhmm:

  • prixat Posts: 1,585
    edited December 1969

    Zarcon, have you got a CineBench OpenGL score on that card?

    23Watts! I'm pushing 120W under load on an old 550Ti :-)

  • Takeo.Kensei Posts: 1,303
    edited December 1969

    My GT730 issue: it's making disturbing sounds. Whenever I do something in the view-field, the workload of the GPU goes up, and sounds come out of my speakers (clicks, buzzes, etc). I once again threw an oscilloscope into the computer to watch the power, and it looks good where I can get the probes. So I can only guess I got a bad card, or it pulls WAY MORE than 23 watts when under load. So I can only guess that when the card is stressed with OpenGL work, it is causing really nasty power fluctuations on the rail the sound-card also happens to be on.

    Oddly, this only happens in Studio in the view-field. It does not happen with web pages, file-manager, etc. :coolhmm:

    The 23 W may be correct at idle but certainly not under load. The GT 730 is a slight evolution of the GT 630 with better TDP and updated software support. But under load you should at least go over 70W

  • mjc1016 Posts: 15,001
    edited December 1969


    My GT730 issue: it's making disturbing sounds. Whenever I do something in the view-field, the workload of the GPU goes up, and sounds come out of my speakers (clicks, buzzes, etc). I once again threw an oscilloscope into the computer to watch the power, and it looks good where I can get the probes. So I can only guess I got a bad card, or it pulls WAY MORE than 23 watts when under load. So I can only guess that when the card is stressed with OpenGL work, it is causing really nasty power fluctuations on the rail the sound-card also happens to be on.

    Oddly, this only happens in Studio in the view-field. It does not happen with web pages, file-manager, etc. :coolhmm:

    Sounds more like RF type interference than power problems, to me.

  • ZarconDeeGrissom Posts: 5,412
    edited April 2015

    prixat said:
    Zarcon, have you got a CineBench OpenGL score on that card?
    23Watts! I'm pushing 120W under load on an old 550Ti :-)

    A week, and no 3DMark scores, I am slipping, lol. Hold the press, cancel the launch sequence, NoGo, NoGo, NoGo, lol. Yea, I wanted to do that the first day it was here, and got distracted with PSU tests, lol. Here you and everyone else go, 8600GT vs GT730 Zone Edition (with actual 4GB RAM on the card, not faked).


    The 23 W may be correct at idle but certainly not under load. The GT 730 is a slight evolution of the GT 630 with better TDP and updated software support. But under load you should at least go over 70W

    Nope, it looks like somewhere around 23 watts under load, tho the computer is pulling over twenty-seven times that from the wall (630 watts nominal, lots of hard drives and other stuff), excluding the displays and other sound-processing stuff.

    Sounds more like RF type interference than power problems, to me.

    The only EMI I had was with a V-reg next to the USB ports on the back of the motherboard. I fixed that a month or so ago. Got a PCI USB plug, and tapped 5V from the HDD rack at the bottom of the computer. And yes, I tossed a Dinner-mint at it as well, with a schottky diode.
    http://www.daz3d.com/forums/viewreply/747444/
    Yea, I don't mess around when it comes to noise in my power rails.
    http://www.daz3d.com/forums/viewreply/801006/

    It's an OpenGL sound effect that the old card's OpenGL didn't support.
    Make it STOP, Make it STOP, Make it STOP... :lol:

    funny-pictures-cat-wants-baby-to-stop.jpg
    8600GT_vs_GT730_ZoneEdition_R15.png
    Post edited by ZarconDeeGrissom on