Minimum Spec For nVidia Card To Use Iray ?

3dcheapskate Posts: 2,728
edited November 2015 in The Commons

Edit2: Found another useful thread Iray: minimum card spec that's worked for you - http://www.daz3d.com/forums/discussion/59760/iray-minimum-card-spec-that-s-worked-for-you

Edit: After seeing nicstt's comments a few posts down I googled "rendering times for various IRAY enabled devices site:daz3d.com" and found these three threads which look helpful:

nVidia GeForce specs - http://www.geforce.com/hardware

 - - - (original OP starts here) - - -

At the bottom of the DAZ Studio product page ( https://www.daz3d.com/daz_studio ) it says...

Windows®

32 bit

  • ...
  • OpenGL 1.6 compatible graphics card with at least 128 MB RAM (Hardware accelerated OpenGL 2.2, or higher, compatible recommended with 512MB RAM)
  • DirectX 9 (used for audio processing only)

64 bit

  • ...
  • Hardware accelerated OpenGL 1.6 compatible graphics card with at least 512 MB RAM (OpenGL 2.2, or higher, compatible recommended)
  • DirectX 9 (used for audio processing only)

Iray Render Engine: 64 bit only. NVIDIA video card with 4+GB vram and 4 CUDA cores recommended.

 

Now 'recommended' is usually a few steps higher than 'minimum spec', so does anybody know the min spec nVidia card for using Iray ? I've had no luck finding info on the nVidia website either.

I'm wondering whether my several-years old Dell Inspiron 15 with built-in nVidia GeForce GT525M will be up to the job.

Anybody know the answer?

 

Post edited by 3dcheapskate on

Comments

  • mjc1016 Posts: 15,001

    As far as it goes...ANY machine that is running 64 bit Windows can run Iray, in CPU mode. 

    And ANY Nvidia card, that has CUDA cores and can hold the scene can run it in GPU mode. 

    The problem is...the scene must fit in the card's memory to be able to use the card.  My question is, how much DEDICATED memory does that mobile 525 have? 

    A real practical minimum is 2 GB. A 1 GB card will work, but what it can hold in the way of a scene is very little - that limits the scene size to 'pretty darn small'...basically portrait type renders. 4 GB will hold a moderately complex scene.
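To see why those sizes ring true, some quick texture arithmetic helps. This is a rough sketch under stated assumptions: it treats textures as sitting uncompressed in VRAM, and it ignores geometry, the frame buffer, and Iray's own working space, so real usage will be higher.

```python
def texture_mib(width, height, channels=4, bytes_per_channel=1):
    """Uncompressed size of a single texture map, in MiB."""
    return width * height * channels * bytes_per_channel / 2**20

# One 4096x4096 RGBA map:
print(texture_mib(4096, 4096))       # 64.0 MiB
# A clothed figure can easily carry a dozen maps that size:
print(12 * texture_mib(4096, 4096))  # 768.0 MiB
```

At 768 MiB for the maps alone, a 1 GB card has almost nothing left over, which is why portrait-type renders are about the practical ceiling there.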

  • I'd like to know as well, for the next time I add new bits and bobs to my system.

    The critical parameters seem to be onboard card memory type and speed, and number of CUDA cores. My even-more-ancient-than-yours GT240 only has 1GB of DDR3 RAM and 96 cores, and it refuses to even try rendering the simplest of test scenes. I've read recommendations of not less than 4GB of GDDR5 RAM and the more cores the merrier... or at least faster. Apparently if you shop around you can find 4GB cards with 500+ cores for a not too exorbitant price. Time to save my pennies and rummage down the back of the sofa. 

  • namffuak Posts: 4,406

    I'd like to know as well, for the next time I add new bits and bobs to my system.

    The critical parameters seem to be onboard card memory type and speed, and number of CUDA cores. My even-more-ancient-than-yours GT240 only has 1GB of DDR3 RAM and 96 cores, and it refuses to even try rendering the simplest of test scenes. I've read recommendations of not less than 4GB of GDDR5 RAM and the more cores the merrier... or at least faster. Apparently if you shop around you can find 4GB cards with 500+ cores for a not too exorbitant price. Time to save my pennies and rummage down the back of the sofa. 

    I started with a GT 740 - 384 cores, 4 GB of Vram; it works, it does improve render speed over straight cpu, and it runs around $100 US. But - big caveat here - if this is your ONLY video card and it drives your monitor(s) (it can do up to 4) - your video response will be atrocious while using it for renders. I couldn't even play solitaire properly. There are other cards with 3 or 4 GB and a varying number of cores. Oh - and if you have a slot available - the 740 is a single-slot card and if you can keep the 240 to drive the monitor - well, at least the monitor updates won't be terribly slow.

  • mjc1016 Posts: 15,001

    Basically, a 440 is the lowest card I know of that works...a 1 GB card won't hold a big scene, but when a scene does fit, it is a heck of a lot faster than CPU alone.

     

  • 3dcheapskate Posts: 2,728
    edited September 2015

    mjc1016 said:

    ...ANY machine that is running 64 bit Windows can run Iray, in CPU mode...

    I eventually found that information from the DAZ Studio 4.8 page in my product library page - via the "View Product Store Page" link ( https://www.daz3d.com/get_studio ), "Read more" ( https://www.daz3d.com/technology ), "Latest features" ( http://docs.daz3d.com/doku.php/public/software/dazstudio/4/new_features/start ).

    What's the performance hit for iray CPU rendering ? I note that in your later post you said that "...it is a heck of a lot faster than CPU alone."

    mjc1016 said:

    ...The problem is...the scene must fit in the card's memory to be able to use the card.  My question is, how much DEDICATED memory does that mobile 525 have? ...

    Finally tracked that info down (hidden in plain sight as usual, NVIDIA Control Panel > System Information) - 96 CUDA cores, DirectX 11, Total Available Graphics Memory=2769MB, but Dedicated Video Memory=1024MB DDR3 (the other 1745MB appears to be shared system memory?)

    So it looks like it's worth a shot, but there'll be definite scene-size limitations. Since all my scenes are PZ3 (250MB seems a sort-of average-ish size) does anybody have a ball-park figure for what to expect if I convert them to DUF ?

    ~ ~ ~

    After years of bizarre never-resolved problems with graphics (e.g. Blender freezing for 5 seconds every now and again) on my dual graphics (Intel and nVidia) laptop, I read somewhere that there are known problems with dedicated additional graphics processors built into laptops. Can't recall the URL where I found that little gem, or the details. But I'm wondering if that's going to rear its head again.

    Post edited by 3dcheapskate on
  • nicstt Posts: 11,715
    edited September 2015

    I use a GT640 with 2GB of RAM as a display card (although not for much longer), and if I use it for rendering, it is slower than the CPU (not overclocked). If you do render overclocked, it will have a detrimental effect on the life of your machine - excessive rendering on consumer-grade builds takes a toll, as the parts aren't designed for that use.

    A simple scene rendered to 75% convergence; all settings other than where to render were the same.

    CPU(i7 4770) Total Rendering Time: 7 minutes 2.1 seconds

    GT640 Total Rendering Time: 10 minutes 32.83 seconds

    For comparison I used the card I have for rendering; soon it will be my display card, as I'll be upgrading to a 980ti.

    GTX 970 Total Rendering Time: 1 minute 41.66 seconds
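Plain arithmetic on the timings quoted above gives the ratios:

```python
def to_seconds(minutes, secs):
    return minutes * 60 + secs

cpu    = to_seconds(7, 2.1)     # i7 4770, CPU only
gt640  = to_seconds(10, 32.83)  # GT 640
gtx970 = to_seconds(1, 41.66)   # GTX 970

print(round(cpu / gtx970, 2))   # 4.15 - the 970 is roughly 4x the CPU
print(round(gt640 / cpu, 2))    # 1.5  - the GT 640 takes 1.5x as long as the CPU
```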

    There is a thread somewhere that compares rendering times for various IRAY enabled devices.

    Whether on a CPU or a GPU, the more cores there are, and the faster and more recent their architecture, the faster the render. There is also the fact that, as we are talking CUDA, the cards need to be Nvidia; there was talk some years back of AMD implementing CUDA, but it never amounted to anything, so you need an Nvidia card to render on the GPU. Of course, a dual Xeon system could also be an extremely good option, and such systems usually last longer under that kind of strain - you do pay for it, though.

    But to specifically answer your question: a decent (preferably top-of-the-range) i5 with 16GB of memory, a 970 GPU, and a gold-rated PSU.

    An SSD is nice, as is more memory (32 or 64GB) and a better CPU - an i7. Two 970s: one for display, and when not using the machine you can also use it to assist the rendering. A 980ti, maybe. And a beefy PSU - make sure you over-provision to some extent.

    Set your budget; spend it first on CPU, Motherboard, GPU and PSU.

    Don't underestimate the importance of a good PSU; if one fails it shouldn't take components with it, but cheaper ones are more likely to fail, and failure is a big risk to other components. Cheaper PSUs are also less efficient, so they cost more to deliver the same amount of power to the components.

    EDIT:

    I presume you have a good mouse, keyboard and monitor; don't underestimate these; all three can affect your health.

    Post edited by nicstt on
  • My GT 630m has 2GB VRAM and works with Iray, but I still prefer to use the i7 processor in my laptop for rendering. I've assigned DAZ Studio to the NVidia GPU for display purposes, since it's faster than the integrated HD 4000.

  • Rogerbee Posts: 4,460
    edited September 2015

    I have been researching graphics cards myself and have settled on a GTX960 with 4gb of VRAM. This has 1024 CUDA cores and is ample for the type of thing I do. I'm also on a budget and the 960 is smack bang where my budget lies, which is just on the £200 mark. There are many manufacturers, but I've narrowed it down to 2, Asus and MSI. I have an Asus AMD card right now and it has been flawless since I first installed it, so I am tempted by a Strix. I'll read up some more and make more decisions.

    As we near Autumn in the UK, it's probably going to be a good idea to wait till after Christmas to get one as there may be more deals around, I'll just have to keep my eyes open...

    CHEERS!

    Post edited by Rogerbee on
  • nicstt Posts: 11,715

     

    Rogerbee said:

    I have been researching graphics cards myself and have settled on a GTX960 with 4gb of VRAM. This has 1024 CUDA cores and is ample for the type of thing I do. I'm also on a budget and the 960 is smack bang where my budget lies, which is just on the £200 mark. There are many manufacturers, but I've narrowed it down to 2, Asus and MSI. I have an Asus AMD card right now and it has been flawless since I first installed it, so I am tempted by a Strix. I'll read up some more and make more decisions.

    As we near Autumn in the UK, it's probably going to be a good idea to wait till after Christmas to get one as there may be more deals around, I'll just have to keep my eyes open...

    CHEERS!

    I have a 970 Strix, and am considering the 980ti Strix; I've no complaints about it, and it is absolutely silent for normal use - still very quiet under load. I keep my eye on things, and turn up the system's radiator fan when I'm afk or it starts to warm up inside.

  • Rogerbee Posts: 4,460

    I have 7 cooling fans in my case and they keep things at a nice even temperature, even on the longer Iray renders. I reckon anything else will go in just fine. It'll be a case of who has the best deal as both the cards I'm interested in have got good reviews. The recommended PSU is 400w and I have a 750w so there will be no problems there.

    CHEERS!

  • nicstt Posts: 11,715

    It isn't really about the number of fans; the situation to aim for is positive or negative airflow, and for that you only need more air being pulled in, or more pushed out. Both have their advantages and disadvantages. I go for negative pressure, meaning that air is sucked in everywhere; I just have to ensure I keep the inside dust-free, as air gets in other ways than through the filters. An odd number of fans doesn't necessarily mean one flow dominates, as it is about the volume of air they move.

  • It's also a matter of how well the inside of the case is designed to allow air flow everywhere with no "dead" volumes. Doesn't do much good to have a bunch of fans apart from the dedicated CPU/GPU cooling systems, if they're laid out so that some of them are blowing against others; there needs to be a pretty much one-way flow all through the case. It's even possible a bad fan layout could make things worse, recirculating hot air in some places and not sucking it out of others. Hopefully it's not such a problem these days, but I remember there used to be regular overheating reports due to inefficient design of cheap tower and mini-tower cases.

  • kyoto kid Posts: 41,851
    edited September 2015

    ...the case I have (which was not cheap) is large, which allows for a lot of breathing space inside. It also has a large exhaust fan on the left side panel, right next to where the GPU sits. Sadly this case is no longer available, and most of what I've seen today has gone to that ridiculous window on the left side, which serves no practical purpose at all.

    For the best basic performance at a reasonable price, I would go with what Rogerbee is looking at, the 4GB GTX960.

    For the level of scenes I do, I need a Titan X, as even a 6GB 980 TI is not enough, and the only 8 GB solution Nvidia offers is in their professional Quadro line (2.5 times as expensive as a Titan X). I'm not about to toss $650 at a GPU that will only render about half to two thirds of my scenes before dumping to the CPU.

    Post edited by kyoto kid on
  • 3dcheapskate Posts: 2,728
    edited September 2015

    After reading nicstt's comments I did another few searches and found some other DAZ threads (probably including the one referred to?) - so I've posted those links at the top of the OP.

    Also found this GeForce page http://www.geforce.com/hardware which has links for Desktop GPUs and Notebook GPUs (that reminds me that an "M" suffix like on my GPU indicates "Mobile", i.e. for laptops - so it seems like I'm not the only one running DAZ Studio / Poser on a laptop).

    From what I've read, as far as a minimum-spec nVidia is concerned, 2GB vRam* is probably it (you might be lucky with 1GB, but the odds are against you). And I assume that for a min-spec any memory type (is DDR3 the "lowest"?) would do. The number of CUDA cores doesn't seem to be a major min-spec factor - even the ancient GeForce G100 has 8 of them, double the DAZ "Recommended" spec (but its 512MB of GDDR2 would make it useless).

    nicstt's comments about CPU rendering being faster than GPU on his system were interesting. My system (laptop) is bottom-of-the-range i5 quad-core with 4GB, so I reckon that'll be (much) slower than GPU (if it works) for me.

     

    *I assume that vRam = mjc1016's "DEDICATED memory" = nVidia Control Panel's "Dedicated video memory" ?

     

    Post edited by 3dcheapskate on
  • 3dcheapskate Posts: 2,728
    edited September 2015

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     

    nVidia1.png
    524 x 355 - 50K
    nVidia2.png
    552 x 440 - 56K
    nVidia4.png
    215 x 382 - 15K
    nVidia3.png
    337 x 63 - 14K
    Post edited by 3dcheapskate on
  • nicstt Posts: 11,715
    edited September 2015

     

    It's also a matter of how well the inside of the case is designed to allow air flow everywhere with no "dead" volumes. Doesn't do much good to have a bunch of fans apart from the dedicated CPU/GPU cooling systems, if they're laid out so that some of them are blowing against others; there needs to be a pretty much one-way flow all through the case. It's even possible a bad fan layout could make things worse, recirculating hot air in some places and not sucking it out of others. Hopefully it's not such a problem these days, but I remember there used to be regular overheating reports due to inefficient design of cheap tower and mini-tower cases.

    This is true, which is why I use the method I do; it sucks in air through all the tiny gaps, which ensures that those dead spaces at least have some movement. And no matter how well designed the case, a poorly installed system can introduce dead spaces; they can be hard to spot - another advantage of my method, as by default all air is moved. The more there is in a case, the more chance of dead space; the irony there is that the more gfx cards you have, the more chance of dead spaces. As you say, fans can be an issue, which is why it is important to ensure you have either positive or negative airflow.

    It is normal for system builders to provide systems with positive pressure; this ensures that air is pushed out of the unsealed gaps, thus reducing the need for the consumer to clean the inside. This doesn't mean there won't be issues, as not all system builders have a clue; and adding in components, or even replacing one, can change the internal dynamics.

    Post edited by nicstt on
  • Rogerbee Posts: 4,460
    edited September 2015

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     

    In order for Iray to use the GPU in DS, you will have to have it ticked. For pure GPU, tick the GPU and untick the CPU. You can have both CPU and GPU ticked, if you don't have much VRAM this is a wise option, if you run out of memory on the card you can then take up the remainder with the CPU. How much VRAM does your card have!?

    CHEERS!

    Post edited by Rogerbee on
  • Rogerbee Posts: 4,460
    nicstt said:

     

    It's also a matter of how well the inside of the case is designed to allow air flow everywhere with no "dead" volumes. Doesn't do much good to have a bunch of fans apart from the dedicated CPU/GPU cooling systems, if they're laid out so that some of them are blowing against others; there needs to be a pretty much one-way flow all through the case. It's even possible a bad fan layout could make things worse, recirculating hot air in some places and not sucking it out of others. Hopefully it's not such a problem these days, but I remember there used to be regular overheating reports due to inefficient design of cheap tower and mini-tower cases.

    This is true, which is why I use the method I do; it sucks in air through all the tiny gaps, which ensures that those dead spaces at least have some movement. And no matter how well designed the case, a poorly installed system can introduce dead spaces; they can be hard to spot - another advantage of my method, as by default all air is moved. The more there is in a case, the more chance of dead space; the irony there is that the more gfx cards you have, the more chance of dead spaces. As you say, fans can be an issue, which is why it is important to ensure you have either positive or negative airflow.

    It is normal for system builders to provide systems with positive pressure; this ensures that air is pushed out of the unsealed gaps, thus reducing the need for the consumer to clean the inside. This doesn't mean there won't be issues, as not all system builders have a clue; and adding in components, or even replacing one, can change the internal dynamics.

    All the research I did suggested intake fans at the front and side, with exhaust fans at the top and rear. It seems to work for me.

    CHEERS!

  • nicstt Posts: 11,715
    edited September 2015

    After reading nicstt's comments I did another few searches and found some other DAZ threads (probably including the one referred to?) - so I've posted those links at the top of the OP.

    Also found this GeForce page http://www.geforce.com/hardware which has links for Desktop GPUs and Notebook GPUs (that reminds me that an "M" suffix like on my GPU indicates "Mobile", i.e. for laptops - so it seems like I'm not the only one running DAZ Studio / Poser on a laptop).

    From what I've read, as far as a minimum-spec nVidia is concerned, 2GB vRam* is probably it (you might be lucky with 1GB, but the odds are against you). And I assume that for a min-spec any memory type (is DDR3 the "lowest"?) would do. The number of CUDA cores doesn't seem to be a major min-spec factor - even the ancient GeForce G100 has 8 of them, double the DAZ "Recommended" spec (but its 512MB of GDDR2 would make it useless).

    nicstt's comments about CPU rendering being faster than GPU on his system were interesting. My system (laptop) is bottom-of-the-range i5 quad-core with 4GB, so I reckon that'll be (much) slower than GPU (if it works) for me.

     

    *I assume that vRam = mjc1016's "DEDICATED memory" = nVidia Control Panel's "Dedicated video memory" ?

     

    Yes to the dedicated vRAM. Which might mean that it actually uses DDR main system memory, as opposed to having a dedicated gfx card with its own memory onboard.

    I would avoid less than 4GB, unless your budget is absolutely tight; a fairly simple scene will take up around 2GB, and by increasing resolution I've taken a scene using just over 2GB to 3.8GB of the card's memory. A graphics card should generally use GDDR3 or GDDR5; any card advertised as using plain DDR is either a printing error or should be avoided at any price. Although there is quite a performance difference between GDDR3 and GDDR5 (in part because the cheaper memory is paired with slower components, thus exacerbating the potential difference), with 5 being the current fastest, I would only choose 3 over 5 if it meant getting 2GB instead of 1GB.

    I would hesitate to recommend using a laptop for rendering; rendering on a consumer system is going to wear it out quicker, unless it's only occasional. So presuming you push it to its limits on a regular basis, you're wearing out something that costs more for the same spec than a desktop. I'm not really going to comment on performance v cost v wear-and-tear beyond this, as I don't know enough about laptops in general to give good advice.

    You could build a relatively inexpensive desktop system, which would provide better rendering for lower cost; the only caveat I would add is if you would need a keyboard, mouse and monitor - that would certainly affect the cost, but it would still be more cost-effective, especially if you consider the ease and cost of upgrades and repairs. I hate repairing laptops; desktops are generally fun; sort of...

    Don't think I've made any errors in what I've written; yes this is a disclaimer. :)

     

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     


    It can be a problem deciding which it is using; knowing the card you have, look at the render times you are getting and compare with CPU-only (by selecting just the CPU in the render panel in Daz); use the same scene and you should then be able to determine which it is using.
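A cross-check that doesn't rely on timing: DAZ Studio writes a render log (reachable via Help > Troubleshooting > View Log File), and the Iray lines in it name the device being used. A minimal sketch for pulling those lines out - the sample below is illustrative, as the exact log wording is an assumption here, so just search for "Iray" and skim what turns up:

```python
def iray_lines(log_text, keyword="Iray"):
    """Return the log lines that mention Iray (device selection, fallbacks, timings)."""
    return [line for line in log_text.splitlines() if keyword in line]

# Illustrative sample, not the exact wording of a real log:
sample = ("Loaded image V4.jpg\n"
          "Iray INFO - using CUDA device 0: GeForce GT 525M\n"
          "Total Rendering Time: 7 minutes 2.1 seconds\n")
print(iray_lines(sample))
```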

    Post edited by nicstt on
  • Rogerbee said:

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     

    In order for Iray to use the GPU in DS, you will have to have it ticked. For pure GPU, tick the GPU and untick the CPU. You can have both CPU and GPU ticked, if you don't have much VRAM this is a wise option, if you run out of memory on the card you can then take up the remainder with the CPU. How much VRAM does your card have!?

    CHEERS!

    According to the nVidia control panel 2769MB total, but only 1024MB DDR3 Dedicated Video Memory.

    With both GPU and CPU checked the Iray render progresses as expected, but the nVidia systray icon tells me there's no nVidia activity.

    With GPU ticked but CPU unticked the rendering progress window appears momentarily, although it's so quick that I can't see what's on it, and then vanishes.

    (The same thing happens with a random fairly simple scene (V4 + clothing), and with a scene that's empty except for a cube primitive)

    So does this mean that my system is automatically falling back to CPU rendering ?

     

  • Rogerbee Posts: 4,460

    At the time I bought my AMD GPU, Asus seemed to be the only brand that had GDDR5 as standard, the rest seemed to have GDDR3. However, in just a year, it seems that GDDR5 is the norm right across the board. This can only be a good thing. I don't doubt that at some point 4gb will become the minimum memory size, you're hard pushed to find that many under 2gb now. I found an old 3D World from 2009 yesterday and they reviewed a pair of NVidia Quadros, the top end one had 4gb of VRAM, but the rest of the specs were way down on what the equivalent GTX has now.

    CHEERS!

  • Rogerbee Posts: 4,460
    edited September 2015
    Rogerbee said:

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     

    In order for Iray to use the GPU in DS, you will have to have it ticked. For pure GPU, tick the GPU and untick the CPU. You can have both CPU and GPU ticked, if you don't have much VRAM this is a wise option, if you run out of memory on the card you can then take up the remainder with the CPU. How much VRAM does your card have!?

    CHEERS!

    According to the nVidia control panel 2769MB total, but only 1024MB DDR3 Dedicated Video Memory.

    With both GPU and CPU checked the Iray render progresses as expected, but the nVidia systray icon tells me there's no nVidia activity.

    With GPU ticked but CPU unticked the rendering progress window appears momentarily, although it's so quick that I can't see what's on it, and then vanishes.

    (The same thing happens with a random fairly simple scene (V4 + clothing), and with a scene that's empty except for a cube primitive)

    So does this mean that my system is automatically falling back to CPU rendering ?

     

    If you've only got 1gb of DDR3 VRAM then any scene will eat that, I'm no expert, but, I think that yes, the system is falling back to the CPU very quickly. If you want to continue with Iray I would say that a significant upgrade is most definitely on the cards.

    CHEERS!

    Post edited by Rogerbee on
  • I'd like to know as well, for the next time I add new bits and bobs to my system.

    The critical parameters seem to be onboard card memory type and speed, and number of CUDA cores. My even-more-ancient-than-yours GT240 only has 1GB of DDR3 RAM and 96 cores, and it refuses to even try rendering the simplest of test scenes. I've read recommendations of not less than 4GB of GDDR5 RAM and the more cores the merrier... or at least faster. Apparently if you shop around you can find 4GB cards with 500+ cores for a not too exorbitant price. Time to save my pennies and rummage down the back of the sofa. 

    I have the same GT240 card and it renders very well. Of course it falls back to CPU rendering, but that's OK I guess. With 8GB RAM I can still do other things on my computer while rendering - not as much as when there's no render process going on, I know, but still....

    However, at the end of this month I'm going to get myself a new computer with a newer card. I don't know yet if I can afford the 960 4GB card, but I know it will have more power than this oldie. And it'll have 32GB RAM then, so I expect no problems rendering either.

     

    Love, Jeanne

  • Rogerbee Posts: 4,460

    I'd definitely go for the 960, I have 16gb of RAM and I think the card will work well with it, so it should be fine with 32gb.

    CHEERS!

  • I really want to go for the 960 but my finances do not really listen to what I want...

  • Rogerbee Posts: 4,460

    Neither do mine, which is why I'm waiting till after Christmas when I know I'll have more money and there may be better deals.

    CHEERS!

  • Bejaymac Posts: 1,942

    It's surprising just how much you can get away with on a 1 GB GT430 and its 96 CUDA cores, even when it's being used to run the monitor. The attached pic has all three generations of Genesis; Genesis and G3F aren't using Iray shaders, and the lighting is a Studio-lighting-based HDR set to finite sphere w/ ground. In dual mode it took less than 10 minutes to render on this ancient Vista box.

    Gen 3.jpg
    587 x 623 - 154K
  • Rogerbee said:

    I now have DS4.8 installed and running. So how do I tell whether it's using the GPU or falling back to use the CPU for iray renders?

    I assume I have to do something in (either) the nVidia control panel and/or the DS4.8 nVidia render settings ?

    It seems to be using just the CPU at present - probably because I haven't told the DS render settings about the nVidia card ?

    Screenshots attached - so what do I need to do? (I assume the nVidia control panel should be left at "Auto-select", and that in DS4.8, under Render Settings > Advanced > nVidia Iray > Photoreal Devices, I should tick GeForce GT525M?)

     

     

     

    In order for Iray to use the GPU in DS, you will have to have it ticked. For pure GPU, tick the GPU and untick the CPU. You can have both CPU and GPU ticked, if you don't have much VRAM this is a wise option, if you run out of memory on the card you can then take up the remainder with the CPU. How much VRAM does your card have!?

    CHEERS!

    According to the nVidia control panel 2769MB total, but only 1024MB DDR3 Dedicated Video Memory.

    With both GPU and CPU checked the Iray render progresses as expected, but the nVidia systray icon tells me there's no nVidia activity.

    With GPU ticked but CPU unticked the rendering progress window appears momentarily, although it's so quick that I can't see what's on it, and then vanishes.

    (The same thing happens with a random fairly simple scene (V4 + clothing), and with a scene that's empty except for a cube primitive)

    So does this mean that my system is automatically falling back to CPU rendering ?

     

    You can open your Task Manager to see how high your CPU usage goes.  When I used my GT 630m for rendering in DAZ Studio, my CPU usage stayed less than 50%.  Using Blender Cycles, I found that my GPU renders with CUDA cores just about as fast as my i7 @ 2.4 GHz in a simple test on that particular render engine.  Since it is a laptop, I don't think it's a good idea to make both CPU and GPU run at the maximum for any length of time.  I use either one or the other, but not both.

  • nicstt Posts: 11,715
    edited September 2015
    Rogerbee said:
    nicstt said:

     

    It's also a matter of how well the inside of the case is designed to allow air flow everywhere with no "dead" volumes. Doesn't do much good to have a bunch of fans apart from the dedicated CPU/GPU cooling systems, if they're laid out so that some of them are blowing against others; there needs to be a pretty much one-way flow all through the case. It's even possible a bad fan layout could make things worse, recirculating hot air in some places and not sucking it out of others. Hopefully it's not such a problem these days, but I remember there used to be regular overheating reports due to inefficient design of cheap tower and mini-tower cases.

    This is true, which is why I use the method I do; it sucks in air through all the tiny gaps, which ensures that those dead spaces at least have some movement. And no matter how well designed the case, a poorly installed system can introduce dead spaces; they can be hard to spot - another advantage of my method, as by default all air is moved. The more there is in a case, the more chance of dead space; the irony there is that the more gfx cards you have, the more chance of dead spaces. As you say, fans can be an issue, which is why it is important to ensure you have either positive or negative airflow.

    It is normal for system builders to provide systems with positive pressure; this ensures that air is pushed out of the unsealed gaps, thus reducing the need for the consumer to clean the inside. This doesn't mean there won't be issues, as not all system builders have a clue; and adding in components, or even replacing one, can change the internal dynamics.

    All the research I did suggested intake fans at the front and side, with exhaust fans at the top and rear. It seems to work for me.

    CHEERS!

    If it works, then all is fine. :)

     

    I forgot to mention to OP.

    Download GPU-Z; it allows you to monitor your card. Just select the card being used (or which you think is being used); the Sensors tab has a memory usage reading. This should be more than a few MB during a render; it might show a few MB, as opposed to none, if you have a scene stopped but not saved/closed.
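If you'd rather not install GPU-Z, nvidia-smi (installed with the NVIDIA driver) reports the same memory-used figure: `nvidia-smi --query-gpu=memory.used --format=csv` prints it as CSV, and adding `-l 1` repeats it every second. A small sketch for parsing that output - the sample value is made up:

```python
import csv
import io

def parse_memory_used(csv_text):
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv` output into MiB values."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    # First row is the header ("memory.used [MiB]"); data rows look like "843 MiB".
    return [int(cell.split()[0]) for row in rows[1:] for cell in row if cell.strip()]

sample = "memory.used [MiB]\n843 MiB\n"
print(parse_memory_used(sample))  # [843]
```

If the figure stays near zero while a render runs, Iray isn't using the card.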

    Post edited by nicstt on
  • Rogerbee Posts: 4,460

    That it does; it was GPU-Z and a screaming CPU fan that convinced me I needed better cooling.

    CHEERS!
