Almost 2080ti time
SnowSultan
Posts: 3,773
I might finally be ready to pull the trigger on a 2080ti, but I wanted to run it by those of you who might be more familiar with detailed specifics and how they relate to rendering rather than gaming. Here is the card I'm considering:
ASUS ROG Strix GeForce RTX 2080TI 11GB
My case can handle it, although I'll have to remove unused drive bays (this card is huge) and I have a 1000w power supply which should be plenty even at load. In the reviews I've read, this card is always mentioned as running cooler than others and has two HDMI ports, which would be helpful in my case. It would be replacing a reliable but rather outdated 980ti and used almost exclusively for working in DAZ Studio (Iray) and Substance Painter. I'm actually not quite as concerned with rendering speed as I am with decent real-time Iray previewing and the expanded memory for handling complex scenes with large textures.
If anyone has any objections to this card and my system being joined at the expense of a good part of my checking account, please speak now or forever hold your peace. Thanks very much for any information or suggestions.

Comments
I'd say that this is the de facto standard for straddling the line between performance and price. It's a VERY solid card and you can't go wrong! Even though your 980ti isn't super old, you'll see some amazing performance gains with it being RTX and all. Thumbs up from me!
I think it goes more towards price than performance. I think the 2070 Super is the sweet spot for performance/price.
I have a 2070 super and it's decent for interactive Iray to see what is happening in the scene (as long as the scene is not too crowded)
But if you have the money for it or you get a good deal, 2080TI will be sweet.
Strix is a very well thought of line. I have no experience with the 2080ti, but my 2070 is a Strix and I've been very happy with it. Tbh, I've owned many different companies' cards over the last decade and never been truly unhappy with any of them. I do stick to cards that are close to MSRP; I don't pay $50 extra for a factory OC that might add 2 fps in some game I almost never play.
And for Daz what really matters are the CUDA cores and that doesn't change with OC anyway.
Well, Nvidia is getting what it hoped for by letting DAZ have Iray for free.
congratulations and good luck on your new purchase
So I'm here looking for the outfit used in Orestes Burnt Lands and obviously am lost, which often happens to me when searching this forum for sumtin'. So about the 2080ti: placement is important. If you get two and you want some speed, you will need an NVLink for bi-directional communication. Recently, after 8 months, my RTX 2080s stopped working as a pair and decided they didn't like being so far apart, so now I am awaiting a new NVLink because they are now closer together and the old NVLink is too wide :(. I have SLI on my dual 1080 Tis, but this would be way too slow on 2080s. You should also invest in a minimum of 32GB RAM.
So nobody thinks it is worth hanging on for a few months for the 30xx range? I know there is always a delay before new cards are supported (especially with Iray), but there is talk of more VRAM across the range, lower power consumption and lower prices, not to mention significant speed gains.
I have a 1070 and sold my iMac in order to buy a 2070 Super but was put off by all the problems with IRay and 4.12 so I'm holding back for now.
Well, it took almost a year for the 20X0 series to get proper DAZ support, so I'm guessing most people are not waiting.
Thanks for the replies so far. I've waited a very long time to upgrade my card and am definitely not rushing into this. I even doubted I could justify spending this much on a card because I wasn't making art often enough, but I'm starting to improve my methods and think I could make more with this sort of an upgrade.
Marble, I've thought about that constantly for at least a year and put off buying a card twice because of it. There's just something about what I read and hear that's giving me the impression that the 30xxs will be good, but not a massive improvement or more reasonably priced over the 2080tis; not to mention it took a while for 2080tis to be more stable and reliable after their release date. If I'm wrong and the 30xxs are great, maybe I can consider adding one further down the line.
Kenshaw: Yeah I don't really want a heavily OCd card either, it's mainly the 2 HDMI ports and the improved cooling that I like about this one. I read some comparison charts of the various 2080tis, and this one kept popping up as one of the more stable, cool, and still slightly less expensive of the line.
NVlink has no effect on 2080ti's used for rendering.
On 30xx cards, you'll be waiting at least till June for the 3080ti and longer for the lower tier cards (and that assumes Nvidia follows past history). Usually by CES, which is going on right now, we've had fairly substantial leaks on the new generation. On Ampere we've gotten little more than vague rumors. I wouldn't count on Ampere this summer or even this fall.
I beg to differ: right now, with the NVLink removed and returned to the vendor, Daz only sees one card. Here is some info on NVLinks, SLI links and such.
If you're just going to render things and going for pure value, I recommend Gigabyte 2070s on eBay. If you get two of them, you outperform a 2080ti by 60% AND save about 200 bucks. I got one and it's going strong for two months now (been rendering every day). The only thing is it has 3GB less memory (although I heard DAZ might implement shared GPU memory soon).
sources:
https://www.pugetsystems.com/labs/articles/V-Ray-NVIDIA-GeForce-RTX-2070-2080-2080-Ti-GPU-Rendering-Performance-1242/
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-RTX-2070/4027vs4029
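To put rough numbers on that claim, here is a back-of-envelope sketch. The street prices and the assumption that a single 2070 renders at about 80% the speed of a 2080 Ti are illustrative figures for the sake of the arithmetic, not measurements pulled from the links above:

```python
# Back-of-envelope check of the "two 2070s vs one 2080 Ti" claim.
# Prices and the single-card relative speed are assumptions, not benchmarks.

price_2070, price_2080ti = 500, 1200   # assumed street prices in USD
rel_perf_2070 = 0.8                    # assumed 2070 speed relative to a 2080 Ti

pair_perf = 2 * rel_perf_2070          # Iray scales close to linearly across GPUs
speedup_over_2080ti = pair_perf - 1.0  # fraction faster than a single 2080 Ti
savings = price_2080ti - 2 * price_2070

print(f"Two 2070s: ~{speedup_over_2080ti:.0%} faster, ${savings} cheaper")
```

Under those assumptions the pair comes out about 60% faster and $200 cheaper, which matches the claim; the real numbers depend on the actual deals you find.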
I don't think I would call the 2080ti a price-to-performance standard. That's a $1000+ card there. No x80ti has ever been a price-to-performance product. However, it is the fastest single GPU for consumers, and once you factor in the RT cores it can nearly double the 1080ti in some Iray benchmarks. The Titan RTX is only very slightly faster and offers 24GB VRAM, but it is also more than double the price.
Take a look at my sig. There you will find a nicely made benchmark thread that now has all of the RTX cards in it, including all of the Super versions and even the Titan RTX. You can take the test yourself and compare your card directly to what is listed so far.
However, I would suggest waiting if you can. You have already waited this long, so why not a little longer? Your 980ti goes back to 2015 or so, so you are somebody who doesn't upgrade too often. I would imagine you being quite bummed dropping $1000+ now and then, in 6-9 months, finding it completely outclassed, possibly at a better price. The next generation is coming, and yes, it is possibly 9 months away from now. It might be a little less, but 9 months I think is more likely. While Ampere is mostly speculation right now, I do think it is quite reasonable to expect that Ampere will offer huge performance gains over Turing when you consider what is going on.
Ampere will be made on a 7nm node, which offers nearly twice the density of Turing's 12nm. This not only allows for greater performance, but also greater efficiency. AMD has made huge strides jumping to 7nm with both its Ryzen CPU and Radeon GPU lines. Nvidia has been ahead of AMD all this time in both performance and power draw, so it stands to reason that Nvidia will also make a similarly large leap by jumping to 7nm.
But the node is not the only factor. Turing was the first generation of hardware based ray tracing, and frankly it is more of a test than anything else. Gamers have given Nvidia a lot of flack for ray tracing performance (it tanks frame rates a lot compared to not using it). So Nvidia will likely focus even more on ray tracing. And that means better performance for Iray.
Also, Nvidia will have great incentive to push hard. AMD is making moves and "big Navi" is set for 2020. Nvidia will not allow AMD to possibly take the performance crown, not even for a week or two. Just look back at how Nvidia teased "Super" right at the exact same time AMD announced the 5700XT. Nvidia will always be trying to upstage AMD at every turn and that will not change in 2020.
Oh, and one more thing. While it took forever for Daz to add Turing support to the general release, and another forever to add RT core support, I believe waiting for support for each new generation is a thing of the past. You see, the old OptiX Prime that Iray used needed to be recompiled for every new architecture. However, the new Iray with full OptiX 6 does NOT need this. This means that, barring a complete change in architecture, Iray will probably work with Ampere out of the box on day one with Nvidia drivers. It might not be at optimal performance, and an update would help with that specifically, but the cards will still work. Even before the new Iray, Turing worked with the beta on Turing's release without any update; it just had no RT core support. I believe an update that added Volta support is what helped Turing in that case; either way, it still worked.
Thanks David, but I'd prefer one card if possible, and if DAZ does ever allow shared GPU memory, I'll just put the 980ti back in alongside whatever card I end up with.
Outrider, those are some good points and I am putting a lot of thought into waiting, at least for a little bit. Some of my concerns are based more in common sense than in anything I'm reading: the idea of a 3080ti being both more powerful and cheaper than a 2080ti sounds a bit like wishful thinking, akin to those who believe the PlayStation 5 will handle 60fps in true 4K. And even if the new generation is released in 9 months, we're likely to encounter both trouble in the initial batches and very limited supply due to cryptominers buying them in bulk at launch. I have a feeling it might be close to a year before I could get a 3080ti, and I don't know if I can wait quite that long. I could be wrong about all of that, and I might at least wait a little longer for a sale, an included game, or a rebate.
I've looked at the benchmark thread you mentioned before, it's extremely helpful and actually helped me decide on eventually getting a 2080ti a while back.
Going to throw this out there as an RTX 2080Ti owner-user.
Luv my GPU. Moving from GTX1070 best thing ever did.
BUT, I did that in March 2019. My biggest issue with the 2080Ti is the 11GB memory. Under the new Iray (BVH structure building) regime, 11GB is not actually 11GB anymore. Seems more like 8GB when I compare to a scene I ran on my GTX 1070 under pre-DS 4.12.*.*. That's because of different memory computations/demands, apparently (in part, see Ray D'Ant's many comments in the DS Beta thread).
Where I am going with this is: IF the 3k series offers another leg up with memory, then that extra wait is going to be worth it. All depending on your $ availability.
I am at the point where I will buy a Titan if the 3k series doesn't increase memory. So it all depends on how "heavy" your scenes are. Dropping to CPU is just hair-pulling, scream-worthy. And that's speaking civilly. And that's after having spent a pile of $$ on higher-end parts. Think we'll know for sure what the new specs are in 10-11 months.
So if you want to do more complex scenes, I would pass on the RTX 2080Ti if your budget can only take one hit. But I can say that from the luxury of enjoying an RTX 2080Ti now. And yes, it is sweet. Would add extra words to describe this, but TOS seems more... so this is all you get.
That is interesting, I'd read a little about how memory usage has changed with the new Studio Beta but didn't know that. Couple things though:
- Do you really think the 30xx series will offer more than perhaps 12 gig on their ti class? Don't we have to spend well over $2500 right now to get anything more than 12 gig of VRAM? I would really find it tough to believe that the 3080ti would offer something like 16 gig for anywhere near the price of a 2080ti, and I really can't justify going over about $1300 on a video card either now or in a year.
- If Studio does ever support shared VRAM across video cards (does any 3D software currently support this?), do you think you'll have to have two identical cards to do so?
Wouldn't want to offer my impression as fact about memory usage, but I am left with that strong impression, given the matrix that Ray D'Ant described to make RTX calculations work, along with a link to an Nvidia presentation showing that memory optimization is on their list of things to improve. So it seems very likely that memory usage is increased under RTX (for now anyway).
Do I think for sure that the 30XX series will have more memory? Sorry, no idea. Gotta believe at some point the makers decide to ratchet up the entry level of memory. Anyone who has been around computers for any time knows first-hand we have come a long, LONG way from what min specs used to be. lol. It feels like it's time again for the next step up, but maybe that's just crazy wishful thinking on my part.
If I had to decide right now: even if they only add 2 more GB to the cards, that would be worth the wait given the crazy prices. Would guess it depends on the economics of the memory increase and what their marketing department feels is the best way to structure product tiers. Which is a way of saying, it's all guessing right now.
Am sorry, but I don't have anything more to offer beyond that, as I don't know anyone who works at the Nvidia tech lab.
As for your 2nd question: more often than not, your first and biggest-baddest card in one way or another defines what the limits are. Sure, they could break that barrier, but again we are guessing as to what the terms are.
You will enjoy an RTX 2080Ti for sure. I can usually render out 1x G8F at SubD4 with HDRI in just over 1 minute to 95% converged at 1220x1220px, with DS 4.12.0.86 and driver 441.66. But there is a fair chance they do finally bump up memory to get us early adopters to upgrade again. How fair that chance is, is up for serious chatting. Just wanted to post my feelings, since it seems like rendering is a big part of your life, and I thought I'd add how I feel about my card and whether I would buy one now.
Not an easy decision. But when is it? lol. Good luck!
Did we look at the same benchmark?
The 2070 Super is very close to the 2080TI performance-wise, and as performance/dollar it is almost double.
I just wonder if the huge 754mm² die of the 2080 Ti has any effect on the longevity of the card,
or does it only matter for increased production costs due to a higher chance of defects per GPU?
You can differ all you want, but plenty of people have multiple cards in their systems (I do) and don't need SLI or NVLink. As a matter of fact, SLI and NVLink require identical cards (NVLink is just SLI on consumer cards like the 2080ti), while people routinely use mismatched cards in DS. Nvidia actually recommends against using SLI with Iray.
Yea, 4 here... no NVLink/SLI bridge in sight.
What makes you think Daz gets Iray for free? I would imagine that they pay some kind of license fee for most or all the non-Daz add-ons included with DS.
My original post referred to having two identical cards. I was not referring to two mismatched cards. So we are both right, as we are each discussing two different situations.
RAM is arguably more important.
So sacrificing some CUDA cores for more RAM would be the way to go imo, but that isn't really an issue with the current series; more cores and RAM go hand-in-hand.
... But this nicely segues into my next point: if the card you can afford will be mostly idle because your scenes require more RAM, then spend your cash elsewhere or save up more.
... And then of course, only compare CUDA cores from the same generation.
Hello SnowSultan, I recently bought myself exactly this card in the OC version.
However, I upgraded from 16GB RAM to 64GB at the same time.
So keep that in mind when I tell you now about my experience with rendering.
Before this card, I used a GeForce GTX 980 AMP! EXTREME from Zotac in my rig.
After I upgraded, I made a test with a render that took me more than 14 hours to render with my old rig, with the new rig the image was done after 1 hour and 8 minutes.
Now, images that would have taken me somewhere between one and two hours are done in around 2 minutes with one G8F and around 11 minutes with three G8F.
All in all, I do not regret having bought the upgrades.
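For what it's worth, the timings in the post above work out to roughly a 12x speedup. A quick sanity check, taking "more than 14 hours" as exactly 14 (and remembering the RAM upgrade and any CPU fallback on the old rig also factor in):

```python
# Speedup implied by the render times reported above
# (old GTX 980 rig vs. new RTX 2080 Ti rig).
old_minutes = 14 * 60        # "more than 14 hours", taken as 14h flat
new_minutes = 1 * 60 + 8     # 1 hour 8 minutes

speedup = old_minutes / new_minutes
print(f"Roughly {speedup:.1f}x faster")
```

So even as a lower bound, that's better than a 12x improvement on that scene.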
Alex, the 2080ti has twice the CUDA cores of the 2070, that makes a big difference in what we do here.
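For reference, Nvidia's published core counts for the two cards (4352 for the 2080 Ti, 2304 for the 2070) put the ratio just under 2x:

```python
# CUDA core counts from Nvidia's published specs for the reference cards.
cores_2080ti = 4352
cores_2070 = 2304

ratio = cores_2080ti / cores_2070
print(f"{ratio:.2f}x the CUDA cores")   # just under 2x
```

Iray render time doesn't scale perfectly with core count (clocks and, on RTX, the RT cores matter too), but it's a reasonable first-order guide within one generation.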
Thanks Saxa, your opinions are some of the only ones around here that I've yet to disagree with. ;)
Thank you for that information AbyssalEros, it's good to hear actual comparisons of render times using the exact same card. I have 32GB of RAM, so I should be OK for a while.
I'm probably going to order it in the end. Outrider made some good points that I thought a lot about, but I ended up asking myself whether this card would make a difference in my art, not whether it would be outdated in a year or whether I'd kick myself for not waiting for a bigger sale. I'm pretty confident it will: I have two large projects I have to get done this year, one possibly including some animation; the two HDMI ports on this card would let me use my Oculus Rift again and get it out of the bottom drawer after almost a year; and chances are that if I had to render a gigantic scene with a ton of characters, I'd do it in sections or use canvases anyway. I bought a TV last year and six months later it was outdated, but it still serves me well and looks great. The cost is probably worth the frustration I will avoid every time Studio reverts to CPU rendering, or having to render twice and composite in order to show HD morphs on a figure.
Thanks to everyone who shared their opinions, I appreciate them and your time. Take care.
No. NVLink is still just SLI on consumer cards, and Nvidia specifically recommends it be turned off for Iray.
LOL, that was my feeling when I bought last March. The goal was to enjoy the making of art more. And it makes a big difference. Enjoy if you do get it!
Honestly, I'd suggest a pair of used 1080Tis over it. It'll give you more RAM and more speed, and while it'll draw more power, your PSU is complete overkill and could handle it with ease.
These days you can get 2 of them for 800 bucks compared to a single 2080Ti for 1200. Hell, if you're willing to go full stupid and have the space, get three...