Using a second GPU for display

Just wanted to get a bit of insight. I am going to install a second GPU for display only, but I have a few questions. First, I have an older GPU with 2GB on board and a newer one with 1GB; how much difference would that make for display only? Next, what exactly do I need to do inside DAZ Studio to select the second GPU for display? I cannot find where the selection is made to use one GPU for display and one for rendering, unless the option menu only shows up after both cards are installed. Last, are there any other considerations that I simply don't know about? Thanks for the assistance.

Comments

  • Richard Haseltine Posts: 107,953

    The GPU(s) used for display are determined by the OS, but I'm not sure what you are doing this for: Windows still reserves memory for possible displays, and even if it didn't, neither a 1GB nor even a 2GB GPU would do much for Iray (I doubt they'd manage even a single nude figure with simple materials in DS 4.12.x.x).

  • kwannie Posts: 870

    Thanks Richard,

    What I am trying to do is use a dedicated card for just the display. I have read many times here in the forums that this would take some of the display burden off the main GPU used for rendering.

    I have read that once you have two cards you can choose in DS which card (GPU) you want to use for what. I don't intend the 1GB video card to assist the main GPU with rendering; I just want to use the second card in DS for display. Obviously this would be the card my monitor is attached to for use in my other Windows apps. I am just trying to understand whether there is a process in DS to select one card for display only.

  • tj_1ca9500b Posts: 2,057
    edited April 2020

    I use two GPUs, one AMD and one Nvidia.  The AMD side is actually an integrated GPU (Ryzen 2400G APU), so it only has about 1-2GB of system RAM set aside for it.

    How this works is that I have my monitor hooked up to the motherboard graphics port, but of course you could hook your monitor to the second discrete GPU instead.

    The advantage of this is that I can do other tasks on my computer while Daz Studio is busy crunching away at an Iray render, assuming the render is not CPU-only and the CPU box in Daz Studio's render settings is unchecked. If a render goes CPU-only, the system bogs down a lot, so I try to avoid CPU-only renders. Plus they are much slower to begin with.

    Other viewport modes use the 2nd GPU, but for the Iray viewport mode Daz Studio still uses the Nvidia GPU.  The Iray viewport mode is generally more sluggish for me to work with in the first place, so I only switch to Iray mode for texture checks and final adjustments of the scene I'm working on.  Even when I had dual GTX 1080s driving the viewport and my desktop environment, I'd often switch to texture shaded, hidden line, etc. to improve viewport performance.

    The other advantage of this is if you are running two or more instances of Daz Studio simultaneously on the same computer.  When doing this, you can have one instance crunching away at a render on the 'dedicated' Nvidia GPU, while using the second instance (in non-Iray viewport modes) to work on something else while you wait for your render to complete.

    The other advantage of multiple instances is when you need to grab an asset from one of your other scenes via 'save as scene subset' for the scene you are currently working on, but don't want to stop working on your current scene while you wait for that other scene to load.  In short, I love the ability to run two instances of Daz simultaneously; it definitely improves my workflow.

    I remember reading not too long ago that, in the Daz 4.12 beta at least, multiple Daz Studio instances were no longer working.  Was this ever fixed?  The computer I usually use is running an older version of Daz Studio (4.10), so this currently isn't an issue for me.  Of course, I don't get to use the latest goodies like strand based hair, but the plan was to build a new computer sometime this year, with a newer Nvidia graphics card and the latest version of Daz Studio installed.  The COVID thing has prompted me to delay my build...

    Post edited by tj_1ca9500b on
  • rrward Posts: 556

    I run three GPUs in my system: one for the display and two for rendering. Windows will use whichever GPU the monitor is plugged into as the main display GPU. Put the weaker GPU on the monitor, as you don't need all that horsepower for the display. The main purpose of having a dedicated GPU for rendering and a dedicated GPU for the display is to let you do other things while the render is processing.

  • kwannie Posts: 870

    tj,

    I understand all the advantages that you mentioned. I was just curious whether there were any particular boxes to check in the preferences menu or somewhere else in DS to select one video card as the one for rendering, and whether there is a box to check for a display-only video card. Like Richard said, the system is responsible for managing how the display card is used, but do I need to inform DS somewhere what my intentions are with the 2 separate cards?

    I have read a few times where people have said you need to choose which card you want to use for rendering; I just don't really know where you do the choosing.

  • kyoto kid Posts: 41,847
    edited April 2020

    ...selecting which devices are used for rendering is done on the Hardware tab of Advanced Render Settings.

    That said, 4 GB of VRAM is borderline, as it can handle a single clothed figure with hair in only a very simple setting. I have a 4GB card in my assembly system for test/proof rendering of sub-scene elements and characters I design.  For full scene rendering I have a second system that has both a 2GB GPU to drive the display and a 12GB Titan X dedicated only to rendering.

    Post edited by kyoto kid on
  • outrider42 Posts: 3,679

    You go into the render settings and click on Advanced. From there you will see checkboxes for any Iray-capable GPUs you have installed, and there are two different lists: one selects which GPU(s) actually render, and the other selects which GPU runs the viewport in Daz Studio itself. To save as much VRAM as possible, you do not want your rendering GPU to also be used for the viewport. So set the 1GB card under 'interactive devices' and the 2GB card under 'photoreal devices'.
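
    If you want to double-check what the driver reports for each card before assigning them, here's a minimal Python sketch using the pynvml bindings (pip install pynvml). It just lists names and total VRAM; note the index order here is not guaranteed to match the order Daz Studio shows:

        # Minimal sketch: list every Nvidia GPU the driver exposes with its total VRAM.
        # Assumes the pynvml package and an Nvidia driver are installed; the device
        # index order may not match Daz Studio's device list.
        from pynvml import (
            nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
            nvmlDeviceGetHandleByIndex, nvmlDeviceGetName, nvmlDeviceGetMemoryInfo,
        )

        nvmlInit()
        try:
            for i in range(nvmlDeviceGetCount()):
                handle = nvmlDeviceGetHandleByIndex(i)
                name = nvmlDeviceGetName(handle)
                if isinstance(name, bytes):  # older pynvml versions return bytes
                    name = name.decode()
                mem = nvmlDeviceGetMemoryInfo(handle)
                print(f"GPU {i}: {name}, {mem.total / 1024**2:.0f} MB total VRAM")
        finally:
            nvmlShutdown()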

  • kwannie Posts: 870

    Thank you outrider!!! This is the information I have been wanting to acquire!!! I guess I wasn't clear that I was only trying to choose between the 1GB and 2GB card for the display. I have an Nvidia 1080 already installed that I will use for rendering. Thanks for getting me straight everyone! I do appreciate it.

  • outrider42 Posts: 3,679
    edited April 2020

    No problem. Do keep in mind they must be supported to show up. If they are really old, they may not, and Daz Studio may not work properly if the display GPU is not supported. Anything from the 500 series and older will not work in Daz Studio anymore. The 600 and 700 series (Kepler) are on the chopping block: they currently work, but are expected to be pulled from support in the next full version of Iray, which could come in 2020. So if your card is one of those, just be aware that its time is coming up soon.

    As a display card, 1GB is super low. I would go with the 2GB for display, and even that might still cause issues with how much memory things use these days. Your browser can easily eat that up, and you will probably be unable to watch any 4K or even 1440p videos. So if you use your computer for other things when you are not using Daz, this is something to be very aware of.

    Post edited by outrider42 on
  • kwannie Posts: 870

    I already have DAZ 4.12 and a 1080 GPU installed and functioning, and I am not going to install any more or different cards for rendering. All I was trying to do is establish the process for adding another video card to plug my monitor into, so that the 1080 would not have to pick up the tab for my display needs inside or outside of DS. I assume that after I install the 1GB card it will show up where you indicated and I will simply leave it unselected for rendering. That should do it... I think.

  • kenshaw011267 Posts: 3,805

    Windows detects which card a monitor is plugged into to determine video output; DS has nothing to do with it. Be aware that a 1 or 2GB card will struggle with video playback at 1080p, given both the age of the cards and the very low VRAM. In particular the 1GB card will be pretty awful.

  • outrider42 Posts: 3,679

    Yeah, it's easy to do this. After all, the whole reason Windows reserves some VRAM on all GPUs is that it makes this easier. It is not out of bounds to simply plug the monitor into the 1080 when you are done with Daz, because Windows 10 is designed with this exact possibility in mind.

    The issue we are having is that 1GB is just pitiful for anything in 2020. Yes, you can connect a display to that card to keep the 1080 free, but can Windows 10 even run well on 1GB of VRAM? I don't think it can, certainly not very well. And even more of a concern is, again, whether Daz Studio can use this GPU. What models are the GPUs in question here? You said you have a 2GB card and a 1GB card; what are they? A 1GB GPU has to be very old. If it is a 500 series, many of which had 1GB of VRAM, it will not work for this task. So if you have a 500 series... this whole conversation is moot.

    A third option would be using your CPU as the display driver instead of a GPU, if it has an integrated GPU. Many Intel chips do. Depending on what CPU you have, it might actually be better than the GPU you are talking about, because VRAM is no longer a concern at all, as long as it can handle the task, and many modern Intels can. (Daz only runs on one thread anyway.) If this is possible, then just plug the monitor into the motherboard's output.
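
    If you aren't sure whether Windows even sees an integrated GPU, a quick sketch like this lists every display adapter Windows knows about, discrete or integrated. It's Windows-only and shells out to the stock wmic tool, which is deprecated on newer builds but still shipped with Windows 10:

        # Sketch: list all display adapters Windows sees (discrete and integrated).
        # Windows-only; relies on the built-in wmic tool, which is deprecated on
        # newer Windows builds but still present on Windows 10.
        import subprocess

        result = subprocess.run(
            ["wmic", "path", "win32_VideoController", "get", "Name"],
            capture_output=True, text=True, check=True,
        )
        for line in result.stdout.splitlines():
            line = line.strip()
            if line and line != "Name":  # skip the header row and blank lines
                print(line)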

  • kenshaw011267 Posts: 3,805

    I don't think it matters if it is a Fermi or older card for display purposes. DS displays through Windows, and that would work on pretty much any card. 1GB just won't go very far on a 1080p display, so video playback or games would be awful.

  • kyoto kid Posts: 41,847
    edited April 2020

    ...yeah, I thought that was only for Iray support. The only possible issue it might have is if you use the Iray view mode.

    Post edited by kyoto kid on
  • outrider42 Posts: 3,679

    Could you use Studio's viewport with Pascal when Pascal launched (before Daz Iray got Pascal support)? I didn't have a Pascal then, so I don't know.

    Nvidia has stopped updating drivers for Fermi altogether. So anything that requires a newer driver will be unusable on Fermi, like dForce (though you wouldn't want to try that anyway). Fermi also only supports OpenCL 1.1; which version does the current version of Studio use? These could all still be roadblocks. And of course, 1GB of VRAM.
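
    For the OpenCL question, rather than guessing from the architecture you can ask the card directly. A small sketch with the pyopencl package (pip install pyopencl), assuming a working OpenCL runtime is installed:

        # Sketch: print the OpenCL version string each device reports.
        # Assumes pyopencl and a working OpenCL runtime (e.g. the one bundled
        # with the Nvidia driver) are installed.
        import pyopencl as cl

        for platform in cl.get_platforms():
            for device in platform.get_devices():
                # device.version looks like "OpenCL 1.2 CUDA"
                print(f"{device.name.strip()}: {device.version}")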

    Also, some GT 730 and 710 cards were based on Fermi, and some low-end 600s as well. So Fermi is not strictly limited to the 500 series.

  • kyoto kid Posts: 41,847

    ...the 750 Ti I have is one of the early Maxwell cards (along with the 745, 750 and 760), so for at least the foreseeable future, it along with my Titan X should be OK.

  • kenshaw011267 Posts: 3,805

    outrider42 said:

    Could you use Studio's viewport with Pascal when Pascal launched (before Daz Iray got Pascal support)? I didn't have a Pascal then, so I don't know.

    Nvidia has stopped updating drivers for Fermi altogether. So anything that requires a newer driver will be unusable on Fermi, like dForce (though you wouldn't want to try that anyway). Fermi also only supports OpenCL 1.1; which version does the current version of Studio use? These could all still be roadblocks. And of course, 1GB of VRAM.

    Also, some GT 730 and 710 cards were based on Fermi, and some low-end 600s as well. So Fermi is not strictly limited to the 500 series.

    I'm sure the Iray viewport won't work on Fermi cards; the Iray viewport uses Iray to render the viewport.

  • tj_1ca9500b Posts: 2,057
    edited April 2020

    Just wanted to note that my Ryzen 2400G APU, even though it uses just 'regular' RAM, does just fine in Windows 10, even with 4K video playback.  The Radeon/Windows software scales the RAM use automatically, up to 2GB of system RAM if needed.  In this particular usage case, it is NOT recommended to set a fixed amount of RAM in the BIOS (at least for my setup), as most people just haven't seen any benefit from doing so, at least for Ryzen + Vega.  I have 32GB of system RAM installed, BTW...

    I wouldn't recommend AAA gaming on it at uber resolutions, but for more 'typical' use it just hasn't been an issue.  Note that I don't do AAA gaming; I'm more of a 'turn based' kinda guy, so my Ryzen 2400G handles the titles I play quite well actually.

    As for video editing, I don't do much of that at all, but if you DO a lot of 4K+ video editing, you probably should be using a dedicated card with lots of VRAM to begin with...

    Post edited by tj_1ca9500b on
  • kwannie Posts: 870

    Thanks for all the suggestions. I found out that I was better off just leaving the system as is. When I tried to install a second card I could not get the driver to install without wiping out the driver for the first card, and since it was only a 1GB card I just gave up on the venture. I do have onboard graphics capability, but the only output is DVI and neither of the adapters that I have will fit it; such is life.

  • kwannie Posts: 870

    By the way... through all this I ended up reinstalling a few things with DIM, including DS 4.12, not the Beta. Maybe I'm wrong, but it looks like the OptiX option was put back in, and now I can open multiple instances of DS again.

  • kenshaw011267 Posts: 3,805

    OptiX does nothing in 4.12; that's why the checkbox is gone in the beta.

  • kwannie Posts: 870

    OK, dumb question: did the Beta come out after 4.12?

  • Leana Posts: 12,748

    The current beta is more recent than the current general release.

  • kwannie Posts: 870

    Ahh, OK, that's why I could open up more than one instance of DAZ. The Beta apparently screwed that capability up. Thanks!

  • Richard Haseltine Posts: 107,953
    kwannie said:

    Ahh, OK, that's why I could open up more than one instance of DAZ. The Beta apparently screwed that capability up. Thanks!

    The beta manages the ability to run multiple instances, to avoid issues that did occur with the older versions.

  • Leana Posts: 12,748

    You can do it in the beta too but not directly. Having several instances open with the older versions could cause problems in some cases, so they changed the system.

    See here for details of the new system: https://www.daz3d.com/forums/discussion/comment/5112696/#Comment_5112696

  • kwannie Posts: 870

    So exactly what new goodies were added in the Beta since 4.12? It looks to me like the timeline, IK chains, and Strand-Based Hair were all there in 4.12.

  • Leana Posts: 12,748
    edited April 2020

    Current general release is 4.12.0.86.

    Betas released since then have integrated new versions of Iray, improvements to the timeline and dForce, the new launch system, and various bug fixes (including fixes for the Mac version to work with Catalina)...

    Post edited by Leana on
  • kwannie Posts: 870

    Got it..........Thank you for the insight.

  • outrider42 Posts: 3,679

    Yes, the 750 Ti was actually the very first consumer Maxwell GPU; it was a really odd little release. So yeah, it will be fine even after Fermi gets the axe, and I would imagine Maxwell will hang on for a while. However, the GTX 760 is a Kepler card, not a Maxwell. I am pretty sure the 760 is actually a rebranded 670 with a slightly tweaked clock speed. The 670 was a cut-down 680, so the 760 uses the same chip as the 680. The GTX 770 is basically the 680 rebranded.

    The way to check is to look at the GPU's chip code. The first two letters designate which architecture is used: the 700 series has cards with GF, meaning Fermi, GK for Kepler, and GM for Maxwell. There are a couple of Fermi cards in the 700 series, such as some GT 730 variants. It's pretty confusing, because what is newer versus what is older is all over the place. If you're not sure which chip your card uses, you can also check its CUDA compute capability, as in the sketch below.
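
    Here's what I mean, as a minimal sketch with pynvml (pip install pynvml): the CUDA compute capability maps cleanly onto the families, with Fermi at 2.x, Kepler 3.x, Maxwell 5.x, and Pascal 6.x (there is no 4.x). Note the compute-capability call may be missing on very old driver/pynvml combinations:

        # Sketch: map each GPU's CUDA compute capability to its architecture family.
        # Fermi = 2.x, Kepler = 3.x, Maxwell = 5.x, Pascal = 6.x (there is no 4.x).
        # Assumes a reasonably recent driver and pynvml; older ones may lack this call.
        from pynvml import (
            nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
            nvmlDeviceGetHandleByIndex, nvmlDeviceGetCudaComputeCapability,
        )

        ARCH = {2: "Fermi", 3: "Kepler", 5: "Maxwell", 6: "Pascal", 7: "Volta/Turing"}

        nvmlInit()
        try:
            for i in range(nvmlDeviceGetCount()):
                handle = nvmlDeviceGetHandleByIndex(i)
                major, minor = nvmlDeviceGetCudaComputeCapability(handle)
                arch = ARCH.get(major, "unknown")
                print(f"GPU {i}: compute capability {major}.{minor} -> {arch}")
        finally:
            nvmlShutdown()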

    The GTX 600 series has numerous Fermi cards in its lineup; in fact, nearly half of the 600 series is Fermi. Only the higher-end cards are true Kepler.

    This was a strange period for Nvidia with all these rebranded cards. There are some Fermi cards that were used in 3 different generations of product lines, which is crazy. Thankfully since the 900 series, Nvidia has mostly stopped doing that and each series has been exclusively one arch.

    That doesn't mean Nvidia is above releasing a confusing product line. Now we have the 1600 series to go along with the 2000 series, there are Ti versions and Super versions of cards, and some cards even have versions of each!

    kwannie said:

    Thanks for all the suggestions. I found out that I was better off just leaving the system as is. When I tried to install a second card I could not get the driver to install without wiping out the driver for the first card, and since it was only a 1GB card I just gave up on the venture. I do have onboard graphics capability, but the only output is DVI and neither of the adapters that I have will fit it; such is life.

    If the motherboard only has a DVI out, it may not be a great idea to use it that way, as that could be a pretty old CPU. But if you want to try it, there are DVI to DisplayPort or HDMI adapters, and they are pretty cheap.
