Comments
When I was 13, I ran away from 8 years of entrapment and torture and became a ward of the court, a poor foster child.
When I was 19, I had $300.00 in my bank account when I separated from my husband, and I started my own business, renting part of a shop with the first month's rent free.
When I was 21, I got an FBDB loan and built a two-story building. One day before the grand opening, it collapsed. I tried, but against all odds I was forced to go bankrupt at age 24.
There is more adversity, but who cares? The point is I did it without any inheritance, no help, and without enough money as a single mom to buy diapers, never mind lottery tickets. At one very low point in my life I scrounged ditches for beer bottles and stole a sack of potatoes because I was starving. Dreaming works. Without dreams I would not be mortgage-free, own several vehicles, have no credit card debt, and be able to feed stray dogs and assist homeless people. Dreams enable me to aim higher. Without dreams I would never have survived eight years of torture and confinement that started when I was only six. The point is you are never too young or too old to dream or achieve.
The X570 boards, as well as pretty much all of the other AM4 boards, mostly have only two PCIe x16 slots. Using both of those slots will drop the speed to x8, but as you point out, yeah, it'll be PCIe 4.0. That's not the problem.
The problem is the SLOT COUNT. Most of the Threadripper boards have four PCIe x16 slots (some will run at x8, which is almost a non-issue for rendering). That allows four graphics cards to be used on the board, if the slots are properly spaced, without any special risers or ribbon cables or anything. THAT is what I'm looking for. Otherwise, the 16 core Ryzen 3850X (whatever it ends up being designated) would be an awesome choice.
I'm looking to build a 4 GPU system, and AM4 Ryzen boards just aren't set up for that.
So what? You can't fit 4 GPUs in any case on the market. There isn't the airflow even if they fit. If you want 4 GPUs in a rig, you are going to need to mount them outside the case on risers or you'll thermal throttle.
Either you can fit 4 GPUs in a case, or photoshopping that very thing is a very, very popular pastime.
You mean besides this case..:
https://www.avadirect.com/2nd-Gen-AMD-Ryzen-Threadripper-X399-Chipset-4-way-GPU-Tower-Workstation-PC/Configure/12242324
Or these:
https://www.pugetsystems.com/recommended/Recommended-Systems-for-OTOY-OctaneRender-192
Actually, there are a number of 'full tower' cases that'll work, and a number of quad GPU solutions for Intel, as well as a few for AMD. As for throttling, well, that's what liquid cooling is for.
The hard part is finding a motherboard with properly double-spaced x16 slots for the GPUs.
Google 'quad gpu workstation'. You'll even see an 8 GPU solution on the first page of that Google search. There ARE some 8 GPU solutions out there as well, although most if not all of those are rackmount solutions. Here's a dual socket EPYC system that can accommodate up to 8 GPUs:
https://www.aspsys.com/servers/Supermicro-2U-Ultra-Server-2023US-TR4-p3196.htm
That being said, I'm good with just 4 GPUs. With liquid cooling of course, probably with separate loops for each pair of GPUs and maybe a third loop for the CPU.
eBay makes me nervous, but here's a prebuilt quad RTX 2080 Ti + Threadripper 2990WX liquid-cooled system:
https://www.ebay.com/i/312207464316?chn=ps
It's more than a bit pricey though. I'm sure you could DIY and drop that $22,999 price tag a bit...
Of course, if you buy quad RTX Titans, that would be around $10,000; then add $300 apiece for the water blocks, and a bunch more money for everything else... I think you could build a quad RTX Titan + Threadripper 2990WX water-cooled system for less than $23K though...
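For what it's worth, here's a quick back-of-the-envelope tally of that DIY estimate as a Python sketch. Only the GPU and water-block figures come from the posts above; every other line item is a placeholder guess, not a real quote:

```python
# Rough cost tally for a quad RTX Titan + Threadripper 2990WX custom-loop build.
# Only the GPU and water-block figures come from the discussion above; the rest
# are placeholder guesses for illustration.
parts = {
    "4x RTX Titan":              4 * 2500,  # ~$10,000 total, as quoted above
    "4x GPU water block":        4 * 300,   # ~$300 apiece, as quoted above
    "Threadripper 2990WX":       1800,      # placeholder
    "TR4 motherboard":           500,       # placeholder
    "128 GB RAM":                700,       # placeholder
    "Pumps/radiators/fittings":  1200,      # placeholder for the loops
    "PSU, case, storage, misc.": 1300,      # placeholder
}
total = sum(parts.values())
print(f"Estimated DIY total: ${total:,}")  # ~$16,700, well under the $22,999 prebuilt
```

Even with generous padding on the placeholder items, there looks to be a few thousand dollars of headroom under that prebuilt price.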
Yeah, that would be an awesome setup. I can only imagine the speed, and you would certainly want to be very rich to be able to pay the power bill... lol. I am wondering if NVIDIA has released or is working on the T100s...
The upside re: the power bill is that you may be able to turn your furnace off in the wintertime with that system, since the DGX system will be pumping out a bunch of heat!
Might suck in the summertime though...
...mmmmm, DNI.
Someday.
...well, an EPYC 7601 is around $3,700 USD for the single-socket version ($4,400 USD for the dual-socket model), so I would imagine the 64-core version will be about double the cost.
The only limiting factor would be the maximum number of CPU cores/threads the software will support. Carrara tops out at 100 threads; not sure where Daz Studio cuts off.
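As a minimal sketch of that limiting factor, here's how you might compare the threads your OS exposes against a renderer's cap (the 100-thread figure is Carrara's, per the post above; the Daz Studio limit is treated as unknown):

```python
import os

CARRARA_THREAD_CAP = 100  # Carrara's cap, per the post above
                          # (Daz Studio's cap is unknown)

logical_threads = os.cpu_count() or 1  # logical threads the OS reports
usable = min(logical_threads, CARRARA_THREAD_CAP)
print(f"OS reports {logical_threads} threads; "
      f"Carrara would use at most {usable} of them.")
```

On a dual 64-core EPYC box (256 threads), more than half the threads would sit idle under a 100-thread cap.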
The 7551P is also a 32 core/64 thread EPYC, and currently retails for $2400. The 7601 allows dual EPYC use, while the 'P' only allows for a single EPYC CPU to be used.
That being said, the 64 core 7nm EPYC incorporates other improvements as well, so if they do a 'P' version of the 64 core, I'm not sure where the price will land. It'll all depend on how aggressive AMD wants to be with the single socket server/workstation market. $4800 for a single socket workstation CPU sounds a bit high, but the 48 core Xeons that were teased in November are still an unknown quantity. The 28 core Xeons I found via Google are all over $10,000...
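Comparing dollars per core with the figures quoted in this thread puts that speculation in perspective; note the $4,800 entry is the guess above, not a confirmed price:

```python
# Dollars-per-core comparison using prices mentioned in this thread.
# The 64-core EPYC entry is speculative; the others are quoted street prices.
cpus = {
    "EPYC 7551P (32 cores)":       (2400, 32),
    "EPYC 7601 (32 cores)":        (3700, 32),
    "64-core EPYC 'P' (guessed)":  (4800, 64),
    "28-core Xeon (street price)": (10000, 28),
}
for name, (price, cores) in cpus.items():
    print(f"{name}: ${price / cores:,.0f}/core")
```

Even at the guessed price, the 64-core part would land around $75/core, versus roughly $357/core for the $10,000 Xeons.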
*facepalm*
You're showing off a bunch of cases with no airflow and you don't seem to understand what I wrote.
As to multiple loops in a single box? Three pumps/reservoirs, three rads, plus fans and tubing in a single box? LOL. I've seen rigs like that in custom-modded setups, but none of the artists who create them would sell one for as little as $23K. The parts plus the labor would cost nearly that much. Anyone offering to ship something like that is definitely trying to defraud you.
In regard to rack mounts with multiple GPUs, yes, there are lots of those around. But they're meant for pro cards, not consumer ones; pro cards run at much lower clock speeds and therefore produce much less heat. Also, no one cares how much noise they make: the one you use as an example has eight 80mm fans that are set up to run at full speed at all times. You would find working in the vicinity of one very distracting.
As I said, DIY. Except for laptops, I generally build my own systems. I also regularly void the warranties on my laptops by breaking the warranty seal to open the case and upgrade the RAM and storage. Although I seem to remember some lawsuit or something that found in favor of allowing self-upgrades...
Setting up a liquid cooled system isn't all that hard to do once you get the hang of it, as long as you are meticulous. And since GPUs get hammered pretty heavily when rendering, yeah keeping the temps down is a good thing to do, and water cooling does that quite nicely.
But if you are content with air cooling, more power to ya!
Well said, and congratulations on your achievements.
I've been thinking about the title of this thread.
... And well, no budget is not the same as unlimited budget.
If I had no budget, I would spend zero, as I had no budget.
However, if my budget is unlimited, then there have been some great links here; I hadn't realised there was a single-socket EPYC. I guess when I upgrade my Threadripper I'll be considering that. Plus a Titan. I'm going to guess, though, that I really won't like the price.
If my budget really was unlimited, I've no idea what I'd do.
A big solar- or wind-powered rack of 2080 Tis connected via Thunderbolt (USB-C) and a big fan would do, especially if I could reharness the heat.
Well, it is a dream.
I want the smart, energy-efficient house to go with it, too.
Setting up a water cooled system properly so it runs long term without leaking or constant fiddling is actually pretty hard. Even the people who do it a lot find it quite challenging.
Setting up 3 loops in a single box? I'd guess no one would do that except for a showpiece not meant to run for more than 3 or 4 days.
The actual reality is there is very little difference between water and air cooling, except that water cooling takes longer to reach operating temp. There have been tests that show this. Water in a loop serves to store a large amount of heat, but once that loop reaches the point where it is dissipating as much heat as it is taking in (thermal equilibrium), the cooled components aren't operating at substantially lower temps than air cooling would provide. And water coolers have the added power, and therefore heat, from the pumps and extra fans to deal with in the system.
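The "thermal mass versus equilibrium" point both sides are circling can be sketched with a crude first-order lumped model; all numbers here are illustrative assumptions, not measurements:

```python
import math

# Crude first-order thermal model: T(t) = T_amb + P*R*(1 - exp(-t / (R*C))).
# The equilibrium temperature T_amb + P*R depends only on the cooler's thermal
# resistance R; the heat capacity C (far larger for a water loop) only sets
# how long it takes to get there. A bigger radiator lowers R, which is the
# only way the equilibrium temp actually drops. All values are illustrative.
T_AMB = 25.0  # ambient air temp, deg C
P = 300.0     # heat load, watts (one GPU rendering flat out)
R = 0.15      # thermal resistance, deg C per watt (assumed equal for both)

for label, C in [("air cooler", 500.0), ("water loop", 20000.0)]:  # J per deg C
    tau = R * C                                  # time constant, seconds
    t = 600.0                                    # ten minutes into a render
    T = T_AMB + P * R * (1 - math.exp(-t / tau))
    print(f"{label}: tau = {tau:.0f} s, T after 10 min = {T:.1f} C, "
          f"equilibrium = {T_AMB + P * R:.1f} C")
```

On these made-up numbers, both coolers settle at the same temperature and the water loop just takes far longer to get there, which matches the post above; the rebuttal in the next post amounts to saying a well-built loop has a lower R.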
That's a nice theory, but in practice CPU and GPU temps often peak much lower with ADEQUATE water cooling, unless you skimp on the flow rates, radiator sizes, and such. Pump redundancy is also a good idea: if one pump in the series fails, you'll still have some cooling, and if you are using the system at the time you'll probably notice the temps slowly climbing. Good luck with your air cooling though!
I'm sure the mods are thinking it, so I'll say it: it's getting rather off-topic, isn't it?
...that's why I mentioned it. That way you could get those 128 threads now, and at a reasonable clock rate: 2.2 GHz base/3.2 GHz boost.
My "dream" system that I mentioned at the top of this thread is a dual-CPU setup to get the maximum core count I could (88, which is 12 shy of the maximum count Carrara supports) while still having appreciable base and boost clock speeds. I'm still not sure what the upper-end core support is for Daz Studio.
In my current rendering system, I have a single dedicated Titan X; there is no room for a second one, as the two PCIe x16 slots are too close together. I am running a 2 GB GPU in an x8 slot just to support the displays, and a 6 core/12 thread Westmere Xeon (yeah, old-generation tech).
People often want to cram multiple cooling loops into one case, but it's not really necessary for proper cooling, and it probably adds complexity and cost and hurts serviceability for the entire system.
In your example, you'd have to design three or more systems, each complete with its own pump, reservoir, and radiator, plus associated tubing, power, and sensor wiring going every which way. Prebuilt systems only eliminate the reservoir from that design, and anything you build from scratch will need a reservoir for each loop.
Of course, this is the "no budget" thread, so you could do all of that within the guidelines of this thread.
But all that extra equipment might be impractical or even outright impossible to fit into the case, particularly the added reservoirs and radiators.
And really, we just need to remember the #1 reason for having any cooling system. First of all, not all heat is harmful. Only excessive heat is harmful. And excessive heat is (or at least should be) defined as that heat which will cause damage to the components over an unacceptably short period of time. For me, that time is 3-5 years. For you, it might be a different value, within the parameters of reality and physics, of course.
But the upshot is that we really only want or need to transfer excessive heat to a less harmful place than the components themselves. Transferring "all heat" is neither productive nor efficient, nor will it necessarily extend component life in any significant way; therefore I submit to you that it's completely unnecessary to even attempt this unless you are conducting studies on the laws of thermodynamics or you just "want to", both of which are perfectly valid reasons in my book.
Some components can even do just fine without liquid cooling at all, and others don't even need a heatsink or fan to avoid excessive heat.
I guess I'm just saying that sometimes it's perfectly okay to stop when we've reached "good enough". We just need to be reasonable in our definitions of "good enough".
Now if you have more devices that you want to include in the water loop, then you'd probably be better off adding more radiator surface area (whether via a single bigger radiator, my preference, or more than one radiator) and, if appropriate, a bigger pump and bigger reservoir, so as to increase the overall size of the cooling "system". This would be more efficient than building separate loops.
If running a cooling loop through 4 GPUs and a CPU turns out not to be effective at removing harmful heat from the last device in the series, then I propose to you that the radiator surface area is not adequate to the total heat being generated. Furthermore, I suggest to you that all devices will eventually be impacted, even if they are not last in the series.
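To put rough numbers on "adequate to the total heat being generated" (the wattages and the radiator rule of thumb below are common ballpark assumptions, not measured values):

```python
# Rough single-loop radiator sizing for a 4-GPU + CPU system, using the common
# ballpark of roughly 100-150 W dissipated per 120 mm of radiator at sane fan
# speeds. All figures are assumptions for illustration.
gpu_watts = 4 * 300                  # four GPUs rendering flat out
cpu_watts = 250                      # HEDT CPU under full load
total_watts = gpu_watts + cpu_watts  # 1450 W into one loop

watts_per_120mm = 125                # middle of the 100-150 W rule of thumb
segments = -(-total_watts // watts_per_120mm)  # ceiling division
print(f"{total_watts} W -> about {segments} x 120 mm of radiator "
      f"(e.g., four 360 mm radiators' worth in a single loop)")
```

That's a lot of radiator, but it's still one pump, one reservoir, and one set of tubing rather than three of everything.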
Carry on.
My 'dream machine' is still probably another 5 years in the future, but I imagine it will be a 256 core AMD machine with an integrated 256 core future variant of the AMD Vega GPU, running at 10 GHz. I'll guess it'll have some sort of new-tech replacement for RAM, so maybe a 4TB SSD-RAM device.
And to be extra sweet, the machine would only draw 20 watts and stay at 55C on the cores even when utilizing all 256 CPU cores and 256 GPU cores concurrently to the max. Honestly, the first paragraph seems possible, but I doubt the second is possible in 5 years... but maybe? Anyway, it's a 'dream machine'.
I have no idea what such a machine would cost, but it's clearly more machine than any current complex SW, or combination of SW, I know of can utilize to its maximum capabilities.
Thermodynamics! The hardware produces a set amount of heat; how much cooling you apply doesn't change that.
There are two possibilities for the cooling. Either the cooling can dissipate the heat the hardware creates, and will reach thermal equilibrium after some time; or it can't, and the temperature of the cooling liquid (even air coolers use liquid in their heat pipes) will reach the same temperature as the hardware, and no more thermal transfer will occur. Generally speaking, the hardware will thermal throttle or shut down before this, but that would be the end result as far as the cooling is concerned.
How long it takes to reach thermal equilibrium may result in some minor differences in the final temperature of the hardware, but the cooler will either work or it won't. There is nothing magical about water. Custom loops in some cases work better at cooling simply because big radiators with multiple high-pressure fans can dissipate a lot of heat, but putting three such loops in one box would simply never work. I think there are a couple of cases on the market with mounts for three 360mm radiators, but you'd never fit 9 tubes (minimum), 3 pump/reservoirs, and all the associated cabling... And that completely ignores where the case would get the intake air to feed 9 high-pressure fans.
It would simply waste an enormous amount of money and power for no gain. Putting the GPUs on risers and letting them cool using their factory fans would be cheaper and many times more efficient in cost, power, and labor.
There’s a topic?
I just thought it was 'dream machine, no budget', in which case I will grab a couple of those Microsoft server farms they have under the sea.
Only $999 apparently!
https://www.masterreplicasgroup.com/products/hal-9000-with-command-console-limited-edition
Just an MP3 speaker? They could have at least added Alexa to it.
...+1.
."...open the garage door, Hal"
"I'm sorry, I can't do that KK."
So the interesting thing is, in LA there is no motherboard available that supports the i9-9980XE processor out of the box. The only way to make this happen is to buy the board, buy an old i-K chip online, update the BIOS, return the chip, and cross your fingers it worked. Dream machines don't come easy.
Why would they manufacture a CPU that motherboards don't have BIOS support for?
Newegg, which has stores in LA, has X299 MoBos for sale.
https://www.newegg.com/p/N82E16813144054?Description=x299&cm_re=x299-_-13-144-054-_-Product
So does Microcenter.
https://www.microcenter.com/product/480758/rog-strix-x299-e-gaming-lga-2066-atx-intel-motherboard
Where are you shopping?
Going with Asus Strix because it has a better warranty. Not interested in ASRock or other brands.
You can get a 5 year or 10 year extended warranty from EVGA for a nominal fee.