Building the Perfect Beast
I've been running Daz Studio on an old machine for several years now. It was a screamer back in the day; now it's a dog.
I was hoping I could get some input on building a fantastic machine that eats Daz Studio for lunch. If you were to build a computer for the purpose of making Studio run as well as it can, what would that system be?
I'd like to be able to simulate dForce hair in less than a minute... can that be done? I'd love to be able to grab a couple of grouped characters and move them around without a lag between adjusting the translation dial and the group moving. I want the program to run fast and smooth.
Any suggestions on the system I should build? Let's pretend money is no object; I want this to be the ULTIMATE DAZ STUDIO MACHINE.

Comments
Start with 64GB of RAM and an Nvidia RTX 30xx GPU that has 12GB or more of VRAM.
A perfect beast at this time, with all the silicon chip shortages, will set you back around $5.5K to $6K:
1. Nvidia RTX 3090 GPU
2. X570 AM4 motherboard
3. AMD Ryzen 9 5950X CPU
4. 1000W 80+ Platinum PSU
5. 128 GB DDR4 3600MHz system RAM (64 GB is not enough to push the 3090 to its limit)
6. 360mm triple-fan AIO (all-in-one CPU cooler)
7. A mid-tower case with good airflow and cooling capabilities, and enough room to accommodate 2 radiators in case you want to go with liquid cooling
8. A 1 TB NVMe Gen 4 M.2 drive for apps, plus one or two more 2 TB drives for content
Both the CPU and GPU run very hot, so you can watercool both of them as an option (add another $1000 for that option).
Others can chime in with their suggestions.
And after you've built it, this beast will be perfect for at least 6 months... because another perfect beast will come along by then.
"And after you've built it, this beast will be perfect for at least 6 months... because another perfect beast will come along by then."
So true!
...sadly, the price of a 3090 right now pretty much equals or possibly even exceeds the cost of all the other components combined, so prepare to have some deep pockets.
If money is no object, then I would go with a Threadripper and a TRX40 board, and throw in 256 GB (8 DIMMs x 32 GB each) and 2x RTX 3090s. Renders and dForce sims should be fast, but I can't say much for moving a bunch of high-poly-count stuff around simultaneously in the viewport. IIRC, that's mainly down to single-core CPU speed. When working with big scenes, you just have to hide enough stuff until object movement is fluid enough for you.
...or, with that much graphics driving power, use Iray view mode, as that is GPU based.
I don't think it would help. If anything, it would be slower trying to move a bunch of geometry-heavy stuff around in the viewport while using Iray view mode, because Daz still needs to process the changes in the scene.
Make sure you watercool both those RTX 3090s, because with both side by side there isn't enough air to cool them, and they will start to thermal throttle a few minutes into a full load.
...wasn't sure if using the GPU to do the calculations would help.
Yeah, very frustrating for those like myself who often create fairly involved scenes. A little trick I came up with to speed things up a bit was to put small low-poly cubes as placeholders where I wanted scene elements to be placed (sort of like "blocking" for a stage production), and then, after inserting say a prop or pre-posed character, pasting it to the desired cube and deleting the cube.
Maybe one of the things to put on the Daz 5 (or 4.16?) "wishlist", Daz finally going multithreaded for scene building.
I often use 3DUniverse's scene tools to turn things on and off, that helps.
I really appreciate the responses here... so the more cores the better for moving geometry around, and the GPU isn't calculating that, eh?
JV is pretty spot on.
Speak of the devil (Perfect Beast)
Hi Ly!
I prefer Intel as well, but at this moment the Core i9-11900K looks like a pussycat next to the Ryzen 9 5950X and is just not beastly enough. Perhaps in a year with the new 12900K... but at present the Ryzen 9 is the king of the beasts in the consumer CPU market.
Cannot stress enough that if you have a Titan RTX or 3090 with 24GB of VRAM, you will need 128GB of system RAM to take advantage of it. 64GB won't be enough. I learned that the hard way.
...well, that would be the ideal situation; however, the Daz program only makes use of a single CPU core for all functions other than rendering on the CPU in either Iray or 3DL.
Same here, my mobo is 3 years old and maxed out at 64 GB, so I just have to be content with 17 Vickies in a scene instead of 20.
Yes, this is extremely important if you want to be able to use all of the VRAM the 3090 has. Otherwise you will come up short, depending on the content of your scene. You might use up all of your 64GB of RAM but only hit 17GB of VRAM. If you have no plans to actually use this much memory, then it doesn't matter. But if you do want to use it, you will need AT LEAST 128GB of RAM.
You can also go much further. After all, you did say beast, correct? There is the A6000, which is the pro GPU that packs 48GB of VRAM. You can Nvlink two of them together to get up to 96GB. And since you would want 128GB RAM with a 3090, well, you would want a whole lot more for 48 or 96GB of VRAM. Basically 256 to 512GB...or more.
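The sizing rule floating around this thread (a 24GB card wants 128GB of system RAM, 48GB wants 256GB, 96GB of NVLinked VRAM wants 512GB) works out to roughly 4-5x the VRAM, rounded up to the next common memory-kit size. Here's a quick sketch of that rule of thumb; the 5x ratio is an assumption pulled from the anecdotes above, not an official Daz or Nvidia figure:

```python
def recommended_ram_gb(vram_gb, ratio=5):
    """Suggest a system RAM size (GB) for a given VRAM size, using the
    thread's rough ratio, rounded up to the next power-of-two kit size."""
    target = vram_gb * ratio
    size = 16
    while size < target:
        size *= 2
    return size

print(recommended_ram_gb(24))  # 3090 / Titan RTX -> 128
print(recommended_ram_gb(48))  # single A6000     -> 256
print(recommended_ram_gb(96))  # NVLinked A6000s  -> 512
```

Treat the output as a planning floor, not a guarantee; actual usage depends on scene content.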
The A6000 is around $5,000-$6,000, so grab a few of them, however many will fit into your case. Or you can buy a mining rack and use it as a base for your render rig; then you could fit a good dozen A6000s in there.
Dual NVLinked A6000s and 512 GB of memory... yeah, you wouldn't have to worry much about dumping to the CPU or having to optimise.
Now I just need the $18,000 to build the system.
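For what it's worth, that $18,000 figure roughly checks out against the numbers quoted above. A hypothetical back-of-the-envelope tally, where the GPU price is the midpoint of the $5K-$6K quoted earlier and the base-system figure is my own assumption for the Threadripper board, CPU, 512 GB RAM, PSU, and case:

```python
A6000_PRICE = 5500   # midpoint of the $5K-$6K quoted in this thread
BASE_SYSTEM = 7000   # assumed: TRX40 board, Threadripper, 512 GB RAM, PSU, case

def rig_cost(num_gpus, gpu_price=A6000_PRICE, base=BASE_SYSTEM):
    """Rough total cost of a multi-GPU render rig under these assumptions."""
    return base + num_gpus * gpu_price

print(rig_cost(2))   # dual NVLinked A6000s -> 18000
print(rig_cost(12))  # the "mining rack" idea -> 73000
```

Prices are the thread's rough figures, not current market prices.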
A while back, after years of just barely limping along to do even a simple Iray render (forget about LAMH or USC items), I just cut to the chase and went with these folks. Their clientele pretty much speaks for itself:
https://www.boxx.com/why-boxx/whats-in-the-boxx
You can spend more of course, but considerably less than $18K. Use their "Build Your System" pages to play What-If.
Top notch, USA made components. Tech support answers calls on the 2nd or 3rd ring. No high pressure sales.
Although a definite consideration, you shouldn't consider $$$ as the primary cost; the real cost is always time. That'll become evident when a scene that was taking you hours to reach an acceptable percentage renders fully in 20 minutes.
...unfortunately their configuration editor keeps crashing with a "Failed to create Session" error even with advert blocking off.
I'd like to suggest a full tower case for all those HDDs you're gonna end up filling with all those renders/scene files!
But yeah, that looks like my upgrade path for sure! My 3900x has served me well, but it's time to move forward!
A full tower case is not really ideal for rendering:
1. A full tower is too large to put on a desk, and some cases need the bottom and lower part of the case clear for intake; if the case is on the floor, it'll take in more dust.
2. You will need more fans to get the air flowing to cool all the components because of the volume of space in a full tower case, and with more fans spinning, the noise level increases.
3. Most cables are made for mid-tower cases, so you may need to get extension cables in some full towers.
Most mid-tower cases nowadays can accommodate 5-9 HDDs. Most folks don't use mechanical HDDs nowadays anyway; NVMe and SSD drives are now the preferred storage method.
My Lian Li Lancool II Mesh (small mid tower) can accommodate six 2.5" and three 3.5" drives, and my motherboard has room for 2 NVMe M.2 drives, so that's a total of 11 drives. If that's not enough, build a NAS.
Mine is a full tower, sits on the desk and has optimal airflow because of the extra room inside and is silent by design. I found a few years back that a full tower suited my needs more than not and have used one on the last 2 builds.
A Stadia version of the renderer would be really nice.
For me, I would not say a full tower is not ideal for rendering; I would say it means you need to plan and adjust accordingly. A full tower allows more heat to be moved out of the case (after that, you still need to do more). To me, this is totally worthwhile.
Have a 2nd computer with a Thermaltake Tower 900.
(1) My 1st full tower case is on the desk. The 2nd, bigger full-sized case is on an oak file-cabinet dolly right next to the desk, way off the floor.
(2) Bought lotsa fans, all with low dB ratings, which are still quiet even with lots running. Almost silent. However, the new AMD CPU with its heat abundance and 2 fans is the noisiest by far. Still good compared to what I read.
(3) KBM cables are fine in length. The PSU cable is fine with a good surge-protected power bar. Headphone cable length and controller length are not a prob. Have the latest-version HDMI, so that is the only custom length and quality cable. Have my own Cat5 cables, so length there is not an issue.
So to me, temp management is way more important for a rendering rig.
Just to add another opinion.
Certainly! Depending on one's aversion to custom watercooling and tech knowledge, you could also undervolt the 3090 so you're not generating as much heat. It's ridiculous how much power these cards require.
...my old P-193 is classified as a "mid tower" but is actually in between mid and full height (leaning more towards "full"). I have two desks arranged in an "L" configuration in a corner of the room, with a large cube table between them. Currently the P-193 is on the other desk until I can get a riser for the cube table so it's at the same level for ease of cleaning and servicing. The system is rather heavy, mainly in that the case is for the most part all-steel construction, but the desk it is currently on is very solid (as I have carpeting, I would never put a computer on the floor). It also has 7 fans, including a large 200mm intake fan on the left side panel (all intake fans have mesh filters). In spite of this, it runs pretty quiet. The CPU cooler I have is huge and has a heat-pipe system that looks like a V-8 engine manifold. If I could get my hands on one, I'd have no issue with installing a 3090 or an extended ATX MB.
As is, my Titan X runs at about 61-65°C during rendering with the GPU fan speed set at 70%. Being an older case, supporting venting for liquid cooling would be a bit of a challenge.
I've been undervolting my GPUs, first when I had the two 2080 Tis (even those run quite hot under load), and you don't want them to cook for 2-4 hrs. I got rid of those and upgraded to a single RTX 3090. Even though the 3090 ran a bit cooler than the 2080 Tis, the VRAM temps consistently went over 100°C. I undervolted the 3090 and replaced the thermal paste and thermal pads, and managed to keep the VRAM around 80°C. I don't have the $$ to watercool it atm, so I just have to be content with those numbers.
...yeah, the 3090 is nice with 24 GB; however, the RTX A5000's TDP is 120W lower than a 3090's, meaning less heat output. Of course the tradeoff is the higher base price, but then there's no need for exotic cooling.
Still considering a 16 GB RTX A4000 as a compromise; it consumes about the same power as my old 1 GB GTX 460.
I have a system coming with a Kingpin 3090 and I don't think I'll be using it for rendering anything over 10-15 minutes for quick preview renders until I get it broken in and see how things run. I'll leave the long renders for the A-series card. I have the personal preference of not wanting to deal with custom/exotic water cooling and the added maintenance of draining/replacing fluid. There's just too much that can go wrong, which is not a good thing for someone such as myself with little to no experience with those kinds of cooling systems.
You might be interested to know: my Aorus RTX 3090 GPU with 3 PSU connectors runs at 43°C average and 47°C max during a 3-hour AC Valhalla session with everything maxed out (MSI Afterburner readings). Did not check VRAM temps, if that's very different from just the GPU. Am I missing something by not checking VRAM vs GPU? No undervolting, no upgrades to the GPU, and no liquid cooling, just a mondo case and lotsa fans.