Oh great, another RTX question...
Starkdog
Posts: 176
in The Commons
Hello all,
I recently got an email offer for summer savings on new PCs. I see several with the new RTX cards, but I'm having trouble figuring out which is the best bang for the buck.
Here's what I'm looking at:
RTX 3060 12GB GDDR6
RTX 3060 Ti 8GB GDDR6
RTX 3070 8GB GDDR6
My gut is telling me to go with the 3060, as it has 12GB of GDDR6 compared to 8GB; however, the 3070 has roughly 1,000 more CUDA cores.
I've looked through the other threads, and now I'm even more confused. Based on the experts' opinions, which is the better card to go with if I'm mainly doing Iray renders, dForce simulations, and modeling in Hexagon as well as Blender?
Thanks, -David

Comments
I could be mistaken, but I don't think any GPU will help you with modeling. For Iray renders, I would favor VRAM over CUDA cores, since it doesn't matter how many CUDA cores your GPU has if the scene exceeds your VRAM.
Go with the most VRAM. It doesn't matter how many CUDA cores the card has if the render falls back to the CPU.
Prices are still nutty! And at those prices go with the 12GB VRAM card.
8GB of VRAM will not be enough for long. Based on how VRAM consumption has increased in the past year, I would estimate that this time next year the 8GB version will be comparable to today's 6GB version as far as amount of VRAM is concerned.
....a 3060 is still faster than a Maxwell Titan-X (which it is basically comparable to in VRAM and CUDA cores) given that it has RTX cores, Tensor cores, GDDR6, and is PCIe 4.0 compatible. At MSRP it would be a heck of a bargain, but prices right now are almost at what the Maxwell flagship card cost new back in 2015.
I have a 4GB Nvidia GTX 1650 I was given just before the 3000-series RTX cards released, and it runs out of VRAM quite easily, though that tends to happen when I switch to Filament rather than when using Iray. It has run out on a few Iray scenes, though.
At these high prices and with such scant availability, why not watch all of the 3000-series RTX cards and just buy the very first one within $100 of MSRP? Especially if you are prone to buyer's regret.
...well, for one thing it's something of a crapshoot. You can get on wait lists and hope you get that email (and are around when it hits). It's also best to know what you need before you put your name on a list. If you have only 16 or even 32 GB of RAM, a 3090 is a waste of money, as the programme and process will likely crash well before you could ever reach the card's full potential.
Basically, as others have said, you need to make sure your system memory can support the card's maximum VRAM (2X minimum, 3X preferable). So for a 3060, 32 GB would be the bare minimum, while 64 GB would give you ample room (not many 48 GB motherboards around these days, given that DIMM slots generally come in sets of 2, 4, or 8, and you're better off populating all of them for more efficient operation). You also need to take into account memory reserved for the OS and system utilities, as well as whatever else you may have running at the time (like a browser or other graphics programmes).
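The 2X/3X rule of thumb above works out to a quick calculation. A minimal sketch (the helper name and numbers are just for illustration of this forum rule, not any official guideline):

```python
# Rough system-RAM sizing per the rule of thumb above: at least 2x the
# card's VRAM, with 3x preferable. All figures are in GB.
# Hypothetical helper illustrating the post's rule, nothing official.

def recommended_system_ram(vram_gb: int) -> tuple[int, int]:
    """Return (bare_minimum, comfortable) system RAM in GB for a given card."""
    return (2 * vram_gb, 3 * vram_gb)

# The three cards from the original question:
for card, vram in [("RTX 3060", 12), ("RTX 3060 Ti", 8), ("RTX 3070", 8)]:
    lo, hi = recommended_system_ram(vram)
    print(f"{card}: {vram} GB VRAM -> {lo} GB minimum, {hi} GB preferable")
```

In practice you would round up to a standard DIMM configuration (e.g. 32 or 64 GB), since odd totals like 36 GB aren't how slots are populated.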
I'm running with 24 GB (the maximum my old X58 board can support) and a Titan-X, and I have already seen system memory loads of up to 17 GB (and that is with no other programmes running, just Daz, W7, and whatever system processes are needed), while the VRAM load is around 4.5 to 5 GB.
For now I'm content with my Titan-X, as my motherboard only has PCIe 2.0 expansion slots (one generation below the card); I'm not even sure the board would support a PCIe 4.0 card. I'm also still running the Daz 4.12.0.47 Beta, so I'm not concerned about Nvidia ceasing W7 driver support, as that version has an older build of Iray. (I have all driver installers up to the 471.11 release archived.)
Agreed, my graphics card has 8GB of VRAM and I can hardly fit two Genesis 8 characters in an Iray Daz Studio scene without dropping to rendering on the CPU.
My approach is completely different: I don't look too much at VRAM and the other technical specifications, although they do make a difference when it comes to speed, as long as a scene fits into VRAM.
It all depends on your goal. E.g. you can very easily run out of VRAM with a huge and complex scene, regardless of how expensive your GPU was.
To illustrate what I mean, I attached (see the Dropbox link below) the first 38 seconds of a 6:30-minute movie from an unfinished project I began 7 years ago. It was a first prototype and a test of camera movements, my very first attempt with Carrara and Daz Studio, so please bear in mind it was never meant to be a perfect movie; it was just a first shot and is far from perfect. But how would this scene fit even in a GPU with 24GB of VRAM? If you want to create serious and very complex scenes, the answer is that it all depends on layers! With layers you can create a very complex scene 100GB in size, but no GPU today is capable of holding that.
https://www.dropbox.com/s/ax9rezsbndxfw6p/FullStage-00.00.00.000-00.00.38.200-seg1.mp4?dl=0
First bunch of posts are right on, extra CUDA cores won't matter if the scene won't fit in VRAM. For mainly doing Iray renders, the 12 GB card.
Or you can save up and buy a 24GB 3090 FE when it becomes more readily available. I'm on a fixed income, and even I pretty much saved up for 7-8 months for one, and it's 100% worth it!
That is the best option, but only at the $1500 MSRP. I'd totally buy the 3090 at $1500 or less, or the equivalent next-generation RTX with 24GB or more. That means waiting until, well, who knows?
It is important to remember that everybody does things differently. While I personally would not want to go back to anything with less VRAM than my 1080ti's 11GB, this is up to the buyer. Maybe all they want to do is put a single figure in a scene?
Besides, I have put 10 Genesis 8 characters into a single scene on my 1080tis and successfully rendered them. I would imagine you can get 6 or 7 on an 8GB card. It is possible; it just depends on how everything is set up. Obviously having more VRAM gives you more options, but I will not insist that someone buy an 8GB card, as I don't know what they are doing with it.
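That "6 or 7 on an 8GB card" guess is just proportional scaling from the 1080 Ti data point. A back-of-envelope sketch (the overhead figure is an assumption; real per-figure cost varies hugely with textures and subdivision):

```python
# Back-of-envelope figure-count estimate, scaling the poster's own data
# point (10 Genesis 8 figures rendered on an 11 GB 1080 Ti) down to an
# 8 GB card. The 1 GB overhead reserve is an assumed value for
# illustration; actual scene/driver overhead differs per setup.

BASELINE_FIGURES = 10   # figures that fit, per the post above
BASELINE_VRAM = 11.0    # GB on a 1080 Ti
OVERHEAD = 1.0          # GB assumed reserved for scene/driver overhead

per_figure = (BASELINE_VRAM - OVERHEAD) / BASELINE_FIGURES  # GB per figure
fits_in_8gb = int((8.0 - OVERHEAD) / per_figure)
print(f"~{per_figure:.2f} GB per figure -> roughly {fits_in_8gb} figures in 8 GB")
```

With these assumed numbers the estimate lands right in the post's 6-to-7 range, which is the point: the count scales with free VRAM, not CUDA cores.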
...I'm the same after working on a Titan-X. Either a 3060, which has a slight edge in specs and performance, or, for more VRAM at the very least, an A4000 with 16 GB.
Good Lord! I read today that they're now talking about an RTX 4000 for Q4 next year.
People are always talking about what comes next. Some have been talking about 4000 before the 3000 series even launched, LOL.
Nvidia, Intel, and AMD have road maps that they show at conferences outlining some of their plans for several years in the future. These plans give a hint to when they plan to launch, and sometimes offer codenames.
Tech always moves forward. Once upon a time, new generations of GPUs could launch just one year apart; it was fairly normal. CPUs still do this: both AMD and Intel have released new CPUs on a roughly yearly schedule for many years. GPUs have mostly moved to a two-year release cadence now. Don't forget Nvidia launched the 2000 Super series around the middle of Turing's life, so they can do refreshes between generations.
AMD was trying to get more aggressive and intended to release its next GPUs more frequently, though current rumors suggest that RDNA3 has been pushed back a little.
Consider this: Ampere is closing in on its first year, and 2022 will mark its second. So logically a new generation should arrive in 2022 to keep pace.
Personally I think 4000 (or whatever it is called, Nvidia likes to be funny sometimes), will correct many of the shortcomings of Ampere.
The bigger question, though, is how will the market look in 2022? Will there be any kind of return to normal? Or will a whole new crypto boom take place in 2022 and cause GPU prices to spike yet again? I personally expect crypto to be as strong as ever next year. I think what will eventually happen is that gamers and other hobbyists will band together to try to get crypto regulated once and for all, so that the masses can finally buy hardware again. I am being completely serious.
Yeah, I lucked out on January 5th; that's when I got mine via Best Buy through hotstock. In spite of 3090s dropping at least twice per month, you still have to be fast, otherwise they get snagged pretty quickly!
As far as I know, Best Buy is the only store where you can get a 3090 FE at MSRP!
@ outrider42,
...yeah, if crypto is still going strong, then indeed I can see this whole debacle happening again, leaving us in the lurch with obscene prices. What "grinds my gears" about this is that crypto was originally intended to be "untraceable," which primarily benefits criminal elements.
My other concern is for those of us with older tech on a tight budget (such as myself). We cannot afford the latest shiny state-of-the-art hardware, so we will be left behind. I'm currently running a Maxwell Titan-X, which has already been given "deprecated" status by Nvidia for Iray. The next step is full abandonment. I cannot even afford a 3060 (which has the same basic specs and performance) at the prices scalpers are asking (I've seen prices upwards of 3X the $329 MSRP). With the Maxwell architecture now deprecated for Nvidia's Iray engine, it probably won't be long before my Titan-X becomes a paperweight.
Like you mentioned, I refuse to backtrack on VRAM from what I have now. I need it to be either the same or an improvement over what I already have, and I definitely do not look forward to going back to glacial CPU rendering.
Best Buy is actually the place I came closest to buying from, as a matter of fact. I had one in my cart, but it seemed to hang, and when I refreshed it was gone! That happened two different times! I'll try again in winter 2022!
When I went to buy mine, it hung too, but that was by design: it eventually let me into the cart, where I PayPal'd it as fast as I could and found out it actually went through! My guess is that it's an anti-bot measure. And you won't have to wait 'til winter, as they have been dropping at least twice per month; just watch the hotstock link...
So the next time it hangs, wait a bit until it goes through!
OK, thanks. I'll stick to that method then.