Are two Nvidia Titan X cards enough for GPU rendering in 2024?

Is a dual Nvidia Titan X setup (non-SLI) enough for Daz Studio these days? I have one and can afford a second one soon. Is that enough for GPU rendering these days? I'm pretty sure they're still supported, and I can't afford anything else. For context, I use VDB volumes, HDRIs, imported OBJ files, lights, and PBR Daz Studio content (G9, etc.).

Comments

  • Alternatively, a Quadro plus an M40 24GB Tesla is an option. I got that kind of setup working a few years back. I had a Tesla GPU server that died; it held three M40s. My current setup can hold two M40s and a full-height Quadro easily.

  • Richard Haseltine Posts: 96,888

    Does the one you have work? Adding a second will let the iterations be shared, speeding up the process, but each GPU still has to be able to render its iterations solo.

  • What does "one" mean? Sorry for the stupid question.

  • If you mean the Titan X, then yes, it works. I just need to know if one is enough given what I mentioned, or if I need to get a second one or a Quadro/Tesla setup.

  • Also, if one or two Titan X cards are enough, will scene-optimization plugins be useful for me? I use Render Queue by ManFriday.

  • Richard Haseltine Posts: 96,888

    As I said, each card still has to render the scene separately, so another card won't let you handle more complex scenes; it will just get the job done more quickly.

  • Gordig Posts: 9,169

    Richard Haseltine said:

    As I said, each card still has to render the scene separately, so another card won't let you handle more complex scenes; it will just get the job done more quickly.

    Does SLI not pool VRAM like NVLink?

  • Richard Haseltine Posts: 96,888

    Gordig said:

    Richard Haseltine said:

    As I said, each card still has to render the scene separately, so another card won't let you handle more complex scenes; it will just get the job done more quickly.

    Does SLI not pool VRAM like NVLink?

    I don't believe so; my recollection is that only the top-end cards with NVLink supported memory sharing for materials.

  • Gordig said:

    Richard Haseltine said:

    As I said, each card still has to render the scene separately, so another card won't let you handle more complex scenes; it will just get the job done more quickly.

    Does SLI not pool VRAM like NVLink?

    Nope, it just splits the workload between the GPUs for video output.

  • edited March 29

    Richard Haseltine said:

    Gordig said:

    Richard Haseltine said:

    As I said, each card still has to render the scene separately, so another card won't let you handle more complex scenes; it will just get the job done more quickly.

    Does SLI not pool VRAM like NVLink?

    I don't believe so; my recollection is that only the top-end cards with NVLink supported memory sharing for materials.

    Started with Pascal, specifically the P100; it never got adopted in the 10-series consumer cards.

    The 20 series was a mix, with about half having it and the rest not.

    The 30 series dropped it to just the 3090 and 3090 Ti.

    The 40 series doesn't support it at all.

    Post edited by DrunkMonkeyProductions on
  • Yeah, you can use them, for now.

    Be aware there are two different Titan X cards on the market: one Maxwell, one Pascal.

    If it's the Maxwell, it's the same chip as your M40s (GM200), and is going to be a negligible improvement in performance, as they're only ~100MHz apart in clock and memory speed.

    The Pascal version will give better performance than the Maxwell, due to its higher core count, faster clocks, and newer generation.

    My P40 (slightly faster than the Titan) benchmarks (see the benchmark thread) at 8m 26.99s, my M40 at 11m 6.2s, and the two together at 5m 41.4s.
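
    Those timings are consistent with iterations being split across the cards: each GPU's throughput (iterations per second) adds, so ideal combined time follows the parallel-resistor formula. A quick sanity check in Python (a sketch using the figures above; the ideal-scaling formula is my assumption, not something Iray guarantees):

```python
# Combined render time if two GPUs split iterations perfectly:
# throughputs add, so times combine like parallel resistors.
def seconds(minutes, secs):
    return 60 * minutes + secs

p40 = seconds(8, 26.99)   # P40 alone
m40 = seconds(11, 6.2)    # M40 alone
both = seconds(5, 41.4)   # measured with both cards

ideal = 1 / (1 / p40 + 1 / m40)   # ~288s if scaling were perfect
efficiency = ideal / both         # how close the measured run gets
print(f"ideal {ideal:.0f}s, measured {both:.0f}s, efficiency {efficiency:.0%}")
```

    The measured two-card time lands at roughly 84% of ideal, which fits the earlier point: a second card speeds things up, but not perfectly.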

    The real issue for the Maxwell series is going to come later this year: if Nvidia keeps to its usual schedule and the rumors are true, it's about time for driver support to end.

    Might want to start looking into something newer instead of sinking money into this.

  • milliethegreat Posts: 155

    DrunkMonkeyProductions said:

    Yeah, you can use them, for now.

    Be aware there are two different Titan X cards on the market: one Maxwell, one Pascal.

    If it's the Maxwell, it's the same chip as your M40s (GM200), and is going to be a negligible improvement in performance, as they're only ~100MHz apart in clock and memory speed.

    The Pascal version will give better performance than the Maxwell, due to its higher core count, faster clocks, and newer generation.

    My P40 (slightly faster than the Titan) benchmarks (see the benchmark thread) at 8m 26.99s, my M40 at 11m 6.2s, and the two together at 5m 41.4s.

    The real issue for the Maxwell series is going to come later this year: if Nvidia keeps to its usual schedule and the rumors are true, it's about time for driver support to end.

    Might want to start looking into something newer instead of sinking money into this.

     I can't afford newer. I'm on a fixed income. I can't afford $300+ GPUs, which is what you're saying I have to get. Or implying, rather.

  • milliethegreat Posts: 155

    I have about $290-$300 per month to spend on hardware, assets, and plugins, so newer is COMPLETELY UNOBTAINABLE!!!! Sooooo... how good (or crappy) is my i9-14900K for CPU rendering, since $300+ GPUs are not a thing for me?

  • milliethegreat Posts: 155

    milliethegreat said:

    DrunkMonkeyProductions said:

    Yeah, you can use them, for now.

    Be aware there are two different Titan X cards on the market: one Maxwell, one Pascal.

    If it's the Maxwell, it's the same chip as your M40s (GM200), and is going to be a negligible improvement in performance, as they're only ~100MHz apart in clock and memory speed.

    The Pascal version will give better performance than the Maxwell, due to its higher core count, faster clocks, and newer generation.

    My P40 (slightly faster than the Titan) benchmarks (see the benchmark thread) at 8m 26.99s, my M40 at 11m 6.2s, and the two together at 5m 41.4s.

    The real issue for the Maxwell series is going to come later this year: if Nvidia keeps to its usual schedule and the rumors are true, it's about time for driver support to end.

    Might want to start looking into something newer instead of sinking money into this.

     I can't afford newer. I'm on a fixed income. I can't afford $300+ GPUs, which is what you're saying I have to get. Or implying, rather.

    What I find weird about that is that Maxwell is still supported by the latest generation of Unreal Engine for things like Nanite, Lumen, and software ray tracing (for hardware ray tracing you need at least Pascal). I have a Titan XP, but someone earlier in this thread told me it doesn't have enough VRAM for Unreal Engine, so I never bothered to swap cards, though I do have it lying around. I have Scene Optimizer. Guess I could get a P40 Tesla next month and keep my Titan XP for when Maxwell is no longer supported, maybe?

  • milliethegreat Posts: 155

    milliethegreat said:

    DrunkMonkeyProductions said:

    Yeah, you can use them, for now.

    Be aware there are two different Titan X cards on the market: one Maxwell, one Pascal.

    If it's the Maxwell, it's the same chip as your M40s (GM200), and is going to be a negligible improvement in performance, as they're only ~100MHz apart in clock and memory speed.

    The Pascal version will give better performance than the Maxwell, due to its higher core count, faster clocks, and newer generation.

    My P40 (slightly faster than the Titan) benchmarks (see the benchmark thread) at 8m 26.99s, my M40 at 11m 6.2s, and the two together at 5m 41.4s.

    The real issue for the Maxwell series is going to come later this year: if Nvidia keeps to its usual schedule and the rumors are true, it's about time for driver support to end.

    Might want to start looking into something newer instead of sinking money into this.

     I can't afford newer. I'm on a fixed income. I can't afford $300+ GPUs, which is what you're saying I have to get. Or implying, rather.

    Just wondering, do you pair your P40 with a Quadro or with your Titan? And will Pascal support end later this year too? A P40 I can afford, plus a cheap used P-series Quadro for display output off eBay; the two are definitely sub-$300 together from US sellers. I'm in the USA, just FYI.

  • Kaze Posts: 51

    Those cards are still pretty good if you have them. They probably won't be power efficient, but you should be able to render just fine. CUDA core count has always mattered most for render times, and those have a good amount. VRAM is VRAM: the more the merrier, but those should have a good amount. Just be prepared for the heat.

  • milliethegreat Posts: 155

    Kaze said:

    Those cards are still pretty good if you have them. They probably won't be power efficient, but you should be able to render just fine. CUDA core count has always mattered most for render times, and those have a good amount. VRAM is VRAM: the more the merrier, but those should have a good amount. Just be prepared for the heat.

     Sooooo... should I absolutely get a P40, or stick with my Titan XP? Someone on this thread said earlier that the Titan XP doesn't have enough VRAM. Note I have the Scene Optimizer plugin for Daz. But I'm asking which GPU is enough to get the job done, whether it's 100% necessary to get the P40, and whether support for the P40 ends when Maxwell GPU support ends (the P40 is Pascal, like my Titan XP). Also, can I pair a P40 with my Titan XP and get it to work with Daz, even if it's not used in the rendering process but for output, dForce, etc., since they're the same chipset, or is a Quadro a must?

  • PerttiA Posts: 9,479

    The Titan XP has 12GB of VRAM, and that is enough, even with the process taking an additional 1GB of VRAM for emulating the RTX features.

    No news has been given about the end of Pascal support so far.

  • Kaze Posts: 51

    The main reason I could see someone wanting to switch from the Titan X to the P40 would be the increase in VRAM. Though if you do choose to run the Titan X and the P40 in the same system rendering the same scene, you will be limited to the lowest VRAM total of all your cards. This means that if you run the Titan X at 12 GB and the P40 at 24 GB, then at most you can support a scene that would use 12 GB of VRAM. I don't suspect you would want to stop using the older Titan X cards; they are still usable, and 12 GB is enough to render a good number of objects in your scene. With that said, I would ditch the idea of getting the older cards and go with the RTX 3060 12 GB. Its core count is only slightly lower than the Titan XP and the P40, plus it is a newer generation of card, so you won't have to worry as much about cards that stop working because of dropped support.

    Another thing to point out is that the RTX 3060 cards are much more efficient at a similar core count, which means that over time you could save a good amount on energy usage. Pretty much all power consumption ends up as heat dissipated into your room. The TDP of the cards you are looking at is about 250 W, while the 3060 has a TDP of about 170 W. This also means the additional load on your power supply would be smaller.

    You can find a used RTX 3060 for cheaper than the Titan cards and the P40, so it's also more economical.
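
    To put rough numbers on the efficiency argument (a sketch; the 8 hours/day of rendering and $0.15/kWh electricity price are my assumptions, not figures from the thread):

```python
# Monthly electricity cost of a GPU running at roughly its TDP.
def monthly_cost(tdp_watts, hours_per_day, price_per_kwh=0.15):
    kwh = tdp_watts / 1000 * hours_per_day * 30   # 30-day month
    return kwh * price_per_kwh

titan_or_p40 = monthly_cost(250, 8)   # ~250 W class cards
rtx_3060 = monthly_cost(170, 8)       # ~170 W TDP
print(f"${titan_or_p40:.2f} vs ${rtx_3060:.2f} per month")
```

    Under those assumptions the gap is a few dollars a month, so the efficiency point matters most for long, frequent renders.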

  • PerttiA Posts: 9,479

    Kaze said:

    The main reason I could see someone wanting to switch from the Titan X to the P40 would be the increase in VRAM. Though if you do choose to run the Titan X and the P40 in the same system rendering the same scene, you will be limited to the lowest VRAM total of all your cards. This means that if you run the Titan X at 12 GB and the P40 at 24 GB, then at most you can support a scene that would use 12 GB of VRAM.

    No, if the scene fits in both cards, rendering will use them both.
    If the scene would take more than 12GB to render, only the P40 will be used.
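
    The rule PerttiA describes can be sketched like this (my own illustration of the per-card fit rule; the card names and sizes are just examples):

```python
# Each selected GPU joins the render only if the whole scene fits in
# that card's own VRAM; without NVLink, memory is not pooled.
def cards_used(scene_gb, cards):
    return [name for name, vram_gb in cards.items() if scene_gb <= vram_gb]

rig = {"Titan X": 12, "P40": 24}
print(cards_used(10, rig))   # fits both -> ['Titan X', 'P40']
print(cards_used(16, rig))   # only the P40 has room -> ['P40']
```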

  • Kaze Posts: 51

    PerttiA said:

    Kaze said:

    The main reason I could see someone wanting to switch from the Titan X to the P40 would be the increase in VRAM. Though if you do choose to run the Titan X and the P40 in the same system rendering the same scene, you will be limited to the lowest VRAM total of all your cards. This means that if you run the Titan X at 12 GB and the P40 at 24 GB, then at most you can support a scene that would use 12 GB of VRAM.

    No, if the scene fits in both cards, rendering will use them both.
    If the scene would take more than 12GB to render, only the P40 will be used.

     Oh sorry, you're right. You could use the P40 by itself, but if you wanted to try using both you would be limited to 12 GB.

  • Gordig Posts: 9,169

    Kaze said:

    PerttiA said:

    Kaze said:

    The main reason I could see someone wanting to switch from the Titan X to the P40 would be the increase in VRAM. Though if you do choose to run the Titan X and the P40 in the same system rendering the same scene, you will be limited to the lowest VRAM total of all your cards. This means that if you run the Titan X at 12 GB and the P40 at 24 GB, then at most you can support a scene that would use 12 GB of VRAM.

    No, if the scene fits in both cards, rendering will use them both.
    If the scene would take more than 12GB to render, only the P40 will be used.

     Oh sorry, you're right. You could use the P40 by itself, but if you wanted to try using both you would be limited to 12 GB.

    No, a scene requiring more than 12GB would use only the P40; a scene under that would use both. The idea that you're "limited" to the VRAM of the smallest card is a common misconception.

  • Kaze Posts: 51

    That's what I'm saying. If you want to use both in a render, you can't use more than the lowest. If you want to use more VRAM, you can only use the card with more VRAM for the render. Am I not saying this right?

  • Kaze Posts: 51

    I think I see where the confusion came from in what I wrote. When I said you could use the P40 by itself, I meant you can utilize the full 24 GB on the P40 if you use it without the Titan X during a render operation: no VRAM limitations, even with both a Titan X and a P40 installed in your system. If you want to use the Titan X and the P40 together during a render, then your scene must fit within the VRAM of each card; any card that cannot fit the scene will not be used for that render. It won't stop you from trying to render if you need more VRAM than one of your cards has, though I don't know if it will automatically switch to only using the cards with enough capacity. I have mostly avoided mixing lower-capacity with higher-capacity cards, so I haven't tried this scenario to see how Daz Studio responds. I do know that you are able to specify which GPUs to use.

    It's been a while since I last checked the VRAM behavior on my card. Last I remember, it falls back to CPU rendering if the scene would consume more VRAM than is available on all the cards. I believe that is still the case, right?

    Hoping that my message is clearer this time.

  • PerttiA Posts: 9,479
    edited April 4

    @Kaze, the confusion in your message comes from saying "If you want to use".
    The process is automatic: all supported Nvidia cards that are installed in the system and selected in DS will be used, provided the scene fits in each individual card's VRAM.

    One word of caution, though: one should only use cards that use the same driver.

    Post edited by PerttiA on
  • Kaze Posts: 51

    If all cards are selected, it will follow the automatic path you are describing. You still have the choice of manually selecting specific cards, so if you want to use only the P40, you can select just the P40.

    I don't know what the overhead is when Daz Studio has to determine which card to use based on VRAM. Still, you should be able to skip some of it, if you already know the scene won't fit on all of your cards, by limiting Daz Studio to the cards that have enough VRAM. Last I remember, Daz Studio will try to fill the VRAM with the scene, and if it finds there isn't enough on a single-GPU setup, it will fall back to CPU rendering if you allow it. I suspect something similar happens with multiple GPUs of varying VRAM amounts. You can avoid that load attempt on the card that doesn't have enough VRAM by specifying the card you know has the right capacity.

    It has been some time since I went through that scenario, though. If Daz Studio has changed so that it knows whether you have enough VRAM before trying to load the scene, then by all means keep all the cards selected and let the automatic path pick the right ones. At least when I last hit the low-VRAM scenario, Daz Studio was smart enough to remember to use the CPU after the first attempt; restarting Daz Studio would clear that. If similar logic is in place, I imagine it would default to the same cards it determined within your session and lose that info when you restart, so this would be most helpful when you start up Daz and do that first render.

    I didn't think about the drivers. Very good point. Do you know what happens when someone tries?

  • PerttiA Posts: 9,479

    Kaze said:

    I didn't think about the drivers. Very good point. Do you know what happens when someone tries?

    If the cards do not use the same driver, they may be using different library versions, which may not be compatible with each other.

    There have been cases where even having the Iray developer tools installed has caused problems because of version conflicts.

  • Kaze Posts: 51

    Ah, I see. That could lend itself to a level of chaos, running into many unhandled exceptions that could crash Daz. A user could lose a lot of unsaved work in that case.

    I can foresee that when the minimum driver version moves past what the cards support, you will not only be stuck on that version of the Nvidia driver, but also on the version of Daz that still supports that driver.

    It sounds like it may be safer to adopt newer cards that are not too close to Maxwell, if the rumors about dropped support are true.

    I suspect getting at least into the RTX line of cards may stave off obsolescence a bit longer.

  • milliethegreat Posts: 155

    If I'm not mistaken, there are checkboxes in Render Settings under the part that lists your CPU and GPUs, and if you select only the P40, or the P40 and CPU, it will only use those. Am I wrong?
