Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2026 Daz Productions Inc. All Rights Reserved.
Comments
Thanks, but I've already placed the order. I didn't really want two mid-range cards anyway.
Thanks Saxa, I hope I'll see as big a difference as you have. :)
Both have 11GB - remember that memory is not pooled; each card loads the full scene. The two cards would work in parallel, but I'm not sure - now that we have RTX-enabled versions of Iray - whether two 1080Ti's would be faster or slower than a single 2080Ti.
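The "memory is not pooled" rule above can be sketched in a few lines. This is a hypothetical illustration, not Iray's actual scheduler: the function name, card names, and scene sizes are made up for the example.

```python
# Hypothetical sketch of the "memory is not pooled" rule: every card must
# hold the FULL scene, so VRAM does not add up across cards.

def cards_that_can_render(scene_mib, cards):
    """Return the cards whose individual VRAM fits the whole scene.

    cards: dict of name -> VRAM in MiB. A card whose VRAM is smaller than
    the scene simply drops out; it never borrows memory from the other.
    """
    return [name for name, vram in cards.items() if vram >= scene_mib]

# Two 1080 Ti cards: 11 GiB each, NOT 22 GiB combined.
rig = {"1080ti_a": 11264, "1080ti_b": 11264}

print(cards_that_can_render(9000, rig))   # both fit -> they render in parallel
print(cards_that_can_render(12000, rig))  # neither fits, despite 22 GiB total
```

The second call is the key point: a 12 GB scene falls back to CPU even though the two cards together hold 22 GB.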
I think it would depend on the scene. In scenes with lots of surfaces or lights, where ray bouncing would be a big part of the render, the 2080ti would likely be as fast if not faster than 2 1080ti.
I've compared my 2070 to my 1080ti in interior scenes and the 2070 is either as fast or slightly faster than the 1080ti. Moving up 2 more steps in the product stack I'm pretty confident the 2080 could beat 2 1080ti's in at least some scenes.
Now if you can really get 1080ti's for $400 I'd have to seriously consider the 1080ti. It's not like the card isn't great.
Well, if they don't, there is at least hope that other engines might one day be developed. I just assumed it was a marketing ploy to sell cards; I admit that was speculation, which we all know is forbidden.
It's always a tough call whether to wait or not. And we do not know for a fact what Ampere will bring, we just have rumors. So if you need a better card right now, that kind of makes the decision for you. I'm sure you'll be very happy; the jump from a 980ti to a 2080ti is pretty large, both in speed and VRAM.
I was about to say there is also a chance that Ampere might offer more VRAM at the top, but that's my own speculation. For one, 11 and 12 GB have been the top for a while, and then the Titan RTX jumped all the way to 24GB. Both of the future consoles by MS and Sony are going to ship with very large pools of VRAM available to them, which makes me think that PC will follow suit, purely because no high-end PC gamer would be happy that a console beats them at anything, even if most games do not need that much VRAM. It would be insulting for a console to have more VRAM than top GPUs.
Anyway, I don't remember if you mentioned it, but don't forget you can use both of these GPUs at the same time for even more performance, as long as the scene fits on both cards. Since you have a 1000 W PSU, you have plenty of juice to supply them if your motherboard can run both cards. It might get a little warm if your room is small, but the choice is yours. I was using a 970+1080ti for a while.
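A quick back-of-the-envelope check of the power-budget claim above. The wattage figures are rough assumptions (board TDPs, plus a generous allowance for the rest of the system), not measured values:

```python
# Rough power-budget sanity check for a dual-GPU rig.
# TDP figures below are approximate assumptions, not measurements.
PSU_WATTS = 1000

draw = {
    "GTX 980 Ti": 250,            # assumed board power
    "RTX 2080 Ti": 260,           # assumed board power (Founders spec)
    "CPU + board + drives": 200,  # generous allowance for everything else
}

total = sum(draw.values())
headroom = PSU_WATTS - total
print(f"estimated load {total} W, headroom {headroom} W")
# -> estimated load 710 W, headroom 290 W
```

Under these assumptions a 1000 W supply leaves comfortable headroom for transient spikes.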
Thanks very much for the support, we'll see how things are in a few days when it gets here. This isn't going to bankrupt me, but it's never easy to push that final purchase button when that much money is on the line. ;)
Outrider, thanks, I need to research dual-card setups a little to see what the advantages and proper setups are. I don't want the 980ti to lessen the effectiveness of the 2080ti in rendering, gaming, or anything else, but I also don't want to have to worry about which card any particular thing is utilizing. I went through a week of frustration early last year when I couldn't figure out why my Oculus Rift wasn't working properly; turned out it was plugged into the card I owned before the 980ti. Durr.
There's nothing to it. Iray can use any GPU you have connected, and there is no penalty for pairing the 980ti with a more powerful card. If the scene does not fit the 980ti, then the 980ti will simply not be used and will stay idle during rendering. If the scene does fit, it will render along with the 2080ti and you should see a decent gain in speed. There are some exceptions: if you make a scene that is right at the 6GB limit of the 980ti, it is possible that the card will not properly drop out like it should. I had this happen a few times with my 970+1080ti - one half of the render would come out transparent. This only happened when the scene was borderline to my 970's VRAM limit. That was a while back, and it is possible that newer versions of Iray do not have this glitch. Otherwise, everything was fine.
For gaming, just use the 2080ti - make sure your monitor is plugged into it, and do the same with your VR headset. Windows 10 is designed so that you can swap the connections between GPUs while still running without causing a crash. This is part of the whole "Windows 10 is stealing my VRAM" complaint you have seen on these forums; that behavior was designed for exactly this purpose. You will probably want the 2080ti in the first GPU slot to get the most out of it, though this depends on your motherboard. Most motherboards give the first GPU the full 16 PCIe lanes, while other GPUs only get 8. I want to point out that this ONLY affects gaming, not Daz Iray. My aging motherboard actually has two x16 slots, so for gaming it does not matter at all which slot the GPU is in, and my own testing confirms this. But that is my motherboard; yours may well be different.
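If you want to check the lane width each card actually negotiated without opening the case, `nvidia-smi` can report it. Below is a small sketch that parses its CSV output; the query field names are my assumption from nvidia-smi's `--query-gpu` options (run `nvidia-smi --help-query-gpu` to confirm on your system), and the sample output is made up for illustration:

```python
# Sketch: read per-GPU PCIe link width from nvidia-smi CSV output.
# Assumed command (field names may vary by driver version):
#   nvidia-smi --query-gpu=index,name,pcie.link.width.current --format=csv,noheader

def parse_link_widths(csv_text):
    """Parse nvidia-smi CSV lines of 'index, name, width' into {index: width}."""
    widths = {}
    for line in csv_text.strip().splitlines():
        index, _name, width = [field.strip() for field in line.split(",")]
        widths[int(index)] = int(width)
    return widths

# Hypothetical output from a board with an x16 slot and an x8 slot:
sample = "0, NVIDIA GeForce RTX 2080 Ti, 16\n1, NVIDIA GeForce GTX 980 Ti, 8"
print(parse_link_widths(sample))  # {0: 16, 1: 8}
```

On a board like the one described above (two true x16 slots), both GPUs would report 16.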
The issue you may run into is the physical size of these cards. That 980ti is probably pretty big itself, but the 2080ti is possibly even bigger, so depending on just how big they are, that could derail any ideas of using both. In my case, I eventually bought a second 1080ti, so one is EVGA and the other is MSI. The MSI is a monster, and I didn't quite expect it to be as big as it is. This forced me to arrange them in a specific way: the MSI is too large to be in the 1st slot, because the EVGA below it would not have enough space to breathe. I tried it, and both cards went above 80C with Daz, which is not ideal to me (and gaming would have been worse - games actually make the card hotter than Iray will). So I had to place the EVGA in the first slot with the MSI below it. Now the EVGA hits 77C and the MSI only hits 65C in Iray. If I game on the EVGA, it can hit 84C; the MSI will hit about 75C in games. So I ended up using my MSI as my gaming card from the 2nd slot. But that only works for me because my motherboard has x16 in the second slot; otherwise that would not have worked.
So hopefully the cards will physically fit together well enough in your case to make it all work. If you are unsure about your motherboard, just look at it - it often says right on the board what each PCIe slot can do. On my board it says "PCIEX16_2" next to the 2nd GPU slot. Otherwise you can check your motherboard documentation.
What's funny is I bought this motherboard never thinking I would actually use that 2nd slot, but I wanted to be safe. That was before I got into Daz Studio. So I am quite thankful I picked a board that worked out that way.