What would be on your wishlist for the ultimate rendering PC?

Karibou Posts: 1,325
edited December 2016 in The Commons

Been thinking about adding a second new GPU to my current rig, but I also need a bigger SSD boot drive, plus my case is getting very cramped and I'm running Windows 7.  I've been thinking that I should just save my pennies and invest in a new, "SUPER rig."  Looking for input on the "ultimate" rendering PC -- what would be on your wishlist?  Note: I really don't game (first-person games make me throw up) or do anything with my computer except render and browse the internet, lol.  My current PC is still nice, so I'm not in a rush, but I'm going to need to budget and am therefore starting the process now. 

Here are my non-negotiables:

  • Generous but not insane budget -- around $4k, and I figure half of that will be GPU.
  • Windows, not Mac
  • NOT water-cooled.  Nope. Nope. Nope. 

Wondering what people recommend for CPU, motherboard, and GPU.  I can take care of the rest. I've always gone with the 4-core i7, but I'm wondering if a Xeon would be better?  Never considered a server processor before and have no idea what is best.

Current PC's Specs, build circa 2013 with GPU upgrade last year:
Processor: Intel Core i7-3770K CPU @ 3.50 GHz (Ivy Bridge)
Motherboard: ASRock Z77 Extreme4 LGA 1155 Intel Z77
Memory: 16 GB RAM
Power Supply: Corsair, 650 Watt
Storage: 240 GB SSD boot, plus 2TB internal HDD
GPUs: EVGA GeForce GTX 970 (4GB, 1664 CUDA cores) -- Rendering only
and EVGA GeForce GTX 660 (2GB, 960 CUDA cores) -- Original card I installed before GPU rendering was a thing; just runs the monitor.

Post edited by Karibou on

Comments

  • Ivy Posts: 7,165
    edited January 2017

    This one right here is my dream server. I copied the price list for this home-built render farm server.  Drool

     

     well you did ask...lol

    (Attachments: p1.JPG, p2.JPG, p3.JPG - screenshots of the parts and price list)
    Post edited by Ivy on
  • Karibou Posts: 1,325

    LOL! Yes, I suppose I DID ask.  Um, this is perhaps a *little* (exponentially) outside my price range.  I don't think I spent that (combined!) on the two cars currently parked in my driveway!  (And I need a new one of those, too.)

    That's actually pretty close to the specs on the render server that my hubby assembled at work -- he's an electrical engineer for GE. Apparently, they use it for developing MRI image rendering software.  They, however, can afford such a beast!

  • joseft Posts: 310

    Well, the first question you need to answer is this: what hardware does your render engine use? I am going to assume you are using Iray, in which case there is no point getting a Xeon CPU. If you were using a render engine that only works on the CPU, then a Xeon would perform better than an equivalent i7.

    Out of curiosity, why don't you want water cooling? It's a very good solution, especially for rendering. Rendering pushes the hardware to its limits, and the only way it won't hit the thermal throttling limit is with very good aftermarket air coolers, a spacious case with good airflow, and clean hardware installation and cable management inside the case. Even then, you will get excessive noise levels from fans trying to keep it all cool. With water cooling, you will never hit the thermal throttling limit, which means your renders will be faster, and there will be much less noise from fans. I run 3 GPUs in my rig, and I only went with water cooling on my CPU, since I would have had to get a full custom water block system for the 3 GPUs and didn't want to spend that much extra - and I regret it. The heat generated from my computer turns the room it is in into a very noisy oven. That is the other downside of air cooling: cases made for air cooling work by venting all that heat out into the room.

    If you insist on air cooling, I would recommend Gigabyte's G1 Gaming or Xtreme Gaming series for the GPU. The air cooling they put on them is very good, but it makes the cards a bit bigger, so be mindful of spacing if you have multiple cards.
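    Since thermal throttling is the crux either way, it can help to actually watch card temperatures during a render. Below is a minimal sketch in Python; it assumes NVIDIA's nvidia-smi tool is on the PATH, and the 80 C warning level is just a placeholder, not a published throttle point:

```python
import subprocess

def parse_temps(csv_text):
    """Parse `name, temperature` CSV rows into (gpu_name, temp_c) tuples."""
    temps = []
    for line in csv_text.strip().splitlines():
        name, temp = [field.strip() for field in line.split(",")]
        temps.append((name, int(temp)))
    return temps

def read_gpu_temps():
    """Ask nvidia-smi for each GPU's core temperature; [] if unavailable."""
    try:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=name,temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    return parse_temps(out.stdout)

WARN_C = 80  # assumed warning level, a little under typical throttle points

for name, temp in read_gpu_temps():
    note = " <- running hot" if temp >= WARN_C else ""
    print(f"{name}: {temp} C{note}")
```

    Run it in a loop while a render is going; if a card sits at its limit, better airflow or aftermarket coolers are exactly the fix being argued about here.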

     

  • nicstt Posts: 11,715
    edited December 2016

    I'd start with 128GB RAM and two 22-core Xeons; however, before making any decisions, I'll be seeing what AMD's Ryzen processor can do, which is due out next year - it's looking very promising.

    Currently, I need all my organs, so those Xeons are out. :(

    Post edited by nicstt on
  • Takeo.Kensei Posts: 1,303
    edited December 2016

    Xeon may be needed for three reasons:

    1/ You need more than 64 GB of memory

    2/ You want ECC memory

    3/ You need to maximize performance out of PCIe. Processors have a maximum of 40 PCIe lanes. If you use 4 graphics cards at x16 PCIe 3.0, you need 64 lanes. To get that you need a dual Xeon (2 x 40 lanes). With only one processor you can run the cards at x8, which will make data transfers slower, but you must be careful with the processor choice.

    Some thoughts:

    - $4K is not enough for a quad-GPU render station. Count on $5-7K instead.

    - If it's only used for GPU rendering you don't need a powerful CPU, just one that can handle 40 PCIe lanes.

    - For a one-processor build, you can buy a desktop 6-core processor that is overclockable. You can get a big performance gain from water cooling if CPU performance is needed.

    - For maximum performance, always fill all memory slots.
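    The lane arithmetic in point 3 can be written out explicitly. A small sketch (the 40-lane figure and per-card widths are simply the numbers from the post above, not looked-up specs for any particular CPU):

```python
# PCIe lane budget: each card wants some number of lanes, and the CPU(s)
# provide a fixed pool. Dual CPUs pool their lanes.

def lanes_needed(num_gpus, lanes_per_gpu=16):
    """Total lanes the graphics cards would consume."""
    return num_gpus * lanes_per_gpu

def fits(cpu_lanes, num_gpus, lanes_per_gpu=16):
    """True if the CPU's lane pool covers the cards at the given width."""
    return lanes_needed(num_gpus, lanes_per_gpu) <= cpu_lanes

print(fits(40, 4, 16))      # one 40-lane CPU, 4 cards at x16 -> False (needs 64)
print(fits(40, 4, 8))       # drop the cards to x8 -> True (needs only 32)
print(fits(2 * 40, 4, 16))  # dual Xeon pools 80 lanes -> True
```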

     

    Example build below

    (Attachment: e5 dual gtx 1080.JPG - example build screenshot)
    Post edited by Takeo.Kensei on
  • Ivy Posts: 7,165
    edited December 2016
    Karibou said:

    LOL! Yes, I suppose I DID ask.  Um, this is perhaps a *little* (exponentially) outside my price range.  I don't think I spent that (combined!) on the two cars currently parked in my driveway!  (And I need a new one of those, too.)

    That's actually pretty close to the specs on the render server that my hubby assembled at work -- he's an electrical engineer for GE. Apparently, they use it for developing MRI image rendering software.  They, however, can afford such a beast!

    I have actually been researching these render servers to build one for myself. The one I have in mind is more in the $12,000 range; I'm trying to find the parts I need to get it built for under $10,000...lol. What kills me is that I have to have a network subscription service to manage all the NVIDIA 24 GB Quadros and the operating system drive, so the VRAM can run at its optimal specs and the render frames can be pointed at whichever card is needed for rendering. From my research, I can use one of these render servers with a Win 7 operating system with Daz Studio and Maya.

    This has been a lot of research, trying to get one of these together. We have already been approved for a home equity loan or a small business loan from the SBA, depending on whether I can get it for the right price. Anything more than $12,000 will make my monthly payments too much.

    Post edited by Ivy on
  • http://www.daz3d.com/forums/discussion/112451/carrara-not-using-all-of-my-render-nodes-solved

    this thread in the Carrara forum might interest you Ivy

    while D|S does not use multiple machines like Carrara you can still render multiple projects or scenes at once.

  • Ivy Posts: 7,165
    edited December 2016

    http://www.daz3d.com/forums/discussion/112451/carrara-not-using-all-of-my-render-nodes-solved

    this thread in the Carrara forum might interest you Ivy

    while D|S does not use multiple machines like Carrara you can still render multiple projects or scenes at once.

    Thanks, Wendy, for the link. From my understanding, the render farm server we want to build uses the system's resources differently than PC-mounted GPU cards do. The render farm will stack VRAM through the network management software, or you can use the GPUs separately if you want to render more than one scene, or access the server from more than one workstation (which works like a PC). So I can use Daz through the cloud beta access port in the render options settings using the server's operating system, and Maya's FTP settings panel will use an API script so it will use all available RAM on the system either way. The server does not run in SLI and there are no bridges, so depending on how you want to set up your network for each scene to be rendered, you can stack the VRAM for total GPU rendering or use the GPUs separately; the access uses networking software instead of the operating system to manage the GPUs.

    We're going to have two 1.6 TB SSDs for two operating systems that can run on the server, one Windows and the other Linux. That is why I would need a network system management service, which I never knew anything about until we talked to NVIDIA about setting up a home render farm. Then all I will be required to do is purchase the GPU licenses for each card and, of course, the Maya cloud software; if you purchase Maya by the year, it is about half of what a monthly fee is, or for a one-time fee I can buy the standalone version, which I think was about $6,000, so that's not really an option for me at this time. I like Carrara, but I'm afraid to invest anything into it because it appears support has stopped, and I want to use software that is always moving forward.

    The reason we have been researching all this stuff is that I'm going to need a new computer soon; this one I am on is 5 years old. So I can spend $6,000 for a new PC built through CyberPowerPC, which I'll have to replace in 4 or 5 years when it's outdated, or I can go all out and buy a non-integrated render farm server and set it up with a dual operating system for about $10,000 or $12,000 - a few more $$$ than I want, but hopefully it will last me the rest of my life with just upgrades.

    Just one of the NVIDIA 24 GB Quadro server cards would cost more than the 3 Pascal 1080 cards I could get into a desktop PC, and the system we want to build has 3 of those NVIDIA 24 GB Quadros in it, with 197 gigs of CPU RAM and access for 4 workstations with FTP access into the server. So you see why we are looking at going this route for computing; pound for pound, a render farm is much cheaper in the long run than a new PC build. All it takes is money.

    Post edited by Ivy on
  • I was not suggesting using Carrara, just some of the eBay hardware bargains listed in the thread.

     

  • Ivy Posts: 7,165

    I'm sorry, Wendy. I misunderstood what you meant. I'll check it out.

  • Karibou Posts: 1,325

    joseft said:

    Well, the first question you need to answer is this: what hardware does your render engine use? I am going to assume you are using Iray, in which case there is no point getting a Xeon CPU. If you were using a render engine that only works on the CPU, then a Xeon would perform better than an equivalent i7.

    Out of curiosity, why dont you want water cooling?
    <snip>

    I was thinking about a Xeon for the memory expandability, not for rendering.  Though I do use Poser and Vue occasionally, so the extra CPU power wouldn't hurt. As for water cooling... BAD past experience.  I might be convinced to try a self-contained type that you install like a high-end heat sink, but no tubes.  They also take up soooo much space -- the tube variety, at least.  Plus, I don't mind the fan noise. In my house, it's actually VERY soothing and drowns out the noise of a thirteen-year-old girl and her autistic twin brother, lol!  Idk, I guess I'd have to do more research to be convinced.

    Xeon may be needed for three reasons:

    1/ You need more than 64 GB of memory

    2/ You want ECC memory

    3/ You need to maximize performance out of PCIe. Processors have a maximum of 40 PCIe lanes. If you use 4 graphics cards at x16 PCIe 3.0, you need 64 lanes. To get that you need a dual Xeon (2 x 40 lanes). With only one processor you can run the cards at x8, which will make data transfers slower, but you must be careful with the processor choice.

    Some thoughts:

    - $4K is not enough for a quad-GPU render station. Count on $5-7K instead.

    - If it's only used for GPU rendering you don't need a powerful CPU, just one that can handle 40 PCIe lanes.

    - For a one-processor build, you can buy a desktop 6-core processor that is overclockable. You can get a big performance gain from water cooling if CPU performance is needed.

    - For maximum performance, always fill all memory slots.
    <snip>

    Thanks, this is the kind of info I was looking for -- very helpful.  And the budget can flex if needed.  I generally build my own rigs, so I shop the sales and buy components when I see a deal.  Newegg is good for this. Sometimes they'll give you a $50 gift card with the purchase of a large component, which you can then use to buy other parts.  I think it took me the better part of a year to buy all the pieces of the current machine.

    Again, thanks, all!

     

  • Well, here's my latest build plan. Currently without the pair of cards for rendering.

    https://pcpartpicker.com/list/BmPcBP

  • namffuak Posts: 4,409

    Just a few suggestions -

    An X99 chipset board; this will support up to 4 high-end GPU cards and an i7 or Xeon CPU.

    64 GB if you do the i7; only consider the Xeon if you really think you'll need to go to 128 GB or more. Note that this automatically makes the memory more expensive for the same size configuration.

    A mid-tower case with multiple fans and good airflow (I'm not into liquid cooling either) - and make sure it supports 8 slots or you won't be able to get that 4th double-slot gpu into the case. Front intake and rear/top exhaust is pretty standard; a side fan over the PCI slots is a good option when that 3rd gpu card goes in. Filters on all the intake fans.

    A good cpu cooler (I favor the Noctua line)

    A 1200 watt gold or platinum power supply

    You can economize by installing gpu cards as money becomes available - but go big up front on the case, motherboard, cpu, and memory. And don't forget a dvd writer - the cost is peanuts compared to anything else and they always seem to come in handy.

  • kyoto kid Posts: 41,859
    edited January 2017

    ...for a full workstation, a little more modest than Ivy's dream machine (while 1/10th the cost I'd still need a small windfall to build it):

    Dual 8-core Xeon E5-2680 Sandy Bridge-EP, 2.7 GHz (3.5 GHz Turbo Boost) (still need to research dual-socket LGA 2011 motherboards that support both quad-channel configuration and Win 7 Pro as the OS).

    128GB DDR3 1600 (8 x 16 GB)

    1 x 1 TB SSD (runtime/library) 1 x 500 GB SSD (boot/application)

    2 x 2 TB 7200 RPM Storage HDDs

    1 x 2 TB External Backup HDD.

    2 x Nvidia GTX 1070 (primarily for running the displays and the Daz viewport in Iray mode so I can have decent refresh rates).

    Win 7 Pro (for various reasons).

    While it does have dual 1070s it is primarily geared towards faster single frame CPU rendering. 128 GB of quad channel memory should pretty much ensure the process will not drop into swap mode.

    My question with Ivy's dream build: when dumping over $36K on a system, why not go totally SOTA with one P5000 (which is priced the same as the M5000) for the displays and 4 P6000s (which are only $300 more than the Maxwell version) for rendering?

    Post edited by kyoto kid on
  • kyoto kid Posts: 41,859

    ...a pared-down version of the above, for just a dedicated render box networked to my current workstation, would drop the two 1070s.

  • namffuak Posts: 4,409

    A couple of additional thoughts - a 512 GB SSD as the OS drive; also install the PostgreSQL database here and point the Studio temp directory here. Don't put content (from any application) on this.

    Two external USB drives for backups - alternate between the two; may want them larger than the internal drive. (one could be replaced by a cloud storage solution if you have the bandwidth - but remember, "cloud" is just a fancy way of saying "someone else's disk on somebody else's server you have no control over").
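    The alternating-drive rotation can even be automated so you never have to remember whose turn it is. A tiny sketch (the drive paths are made-up placeholders, and week-parity is just one possible rotation rule):

```python
import datetime

# Hypothetical mount points for the two external USB backup drives.
BACKUP_DRIVES = ["E:/backup-a", "F:/backup-b"]

def target_for(date):
    """Pick a drive by ISO week parity, so each drive gets every other week."""
    week = date.isocalendar()[1]
    return BACKUP_DRIVES[week % 2]

print("This week's backup target:", target_for(datetime.date.today()))
```

    Whatever copy tool you prefer (robocopy on Windows, rsync elsewhere) then writes to the path this returns.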

  • Karibou Posts: 1,325
    namffuak said:

    A couple of additional thoughts - a 512 GB SSD as the OS drive; also install the PostgreSQL database here and point the Studio temp directory here. Don't put content (from any application) on this.

    Two external USB drives for backups - alternate between the two; may want them larger than the internal drive. (one could be replaced by a cloud storage solution if you have the bandwidth - but remember, "cloud" is just a fancy way of saying "someone else's disk on somebody else's server you have no control over").

    I already back up in triplicate. :) A second HDD plus a RAID-configured pair of networked HDDs.  I've done this for years.  It's actually a running joke that, in case of a tornado, I corral the kids and the cats and grab my external library backup!

  • namffuak Posts: 4,409
    edited January 2017
    Karibou said:
    namffuak said:

    A couple of additional thoughts - a 512 GB SSD as the OS drive; also install the PostgreSQL database here and point the Studio temp directory here. Don't put content (from any application) on this.

    Two external USB drives for backups - alternate between the two; may want them larger than the internal drive. (one could be replaced by a cloud storage solution if you have the bandwidth - but remember, "cloud" is just a fancy way of saying "someone else's disk on somebody else's server you have no control over").

    I already back up in triplicate. :) A second HDD plus a RAID-configured pair of networked HDDs.  I've done this for years.  It's actually a running joke that, in case of a tornado, I corral the kids and the cats and grab my external library backup!

    Good - although I've been burned by Wintel-based RAID and mirrored systems, so I do the mirroring by hand and won't touch RAID-5 on a system I can afford. Make sure you test your backups from time to time - finding out they haven't worked correctly when you need them is a very bad thing.
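    The "test your backups" step can be partly scripted too. A sketch in Python that walks a source tree and flags any file whose backup copy is missing or has a different SHA-256 hash (the roots are whatever your library and backup paths happen to be):

```python
import hashlib
import os

def file_hash(path, chunk=1 << 20):
    """SHA-256 of a file, read in 1 MB chunks so big files don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def verify_backup(source_root, backup_root):
    """Yield relative paths whose backup copy is missing or differs."""
    for dirpath, _, files in os.walk(source_root):
        for name in files:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, source_root)
            dst = os.path.join(backup_root, rel)
            if not os.path.exists(dst) or file_hash(src) != file_hash(dst):
                yield rel
```

    Running it over even a small sample now and then is enough to catch a backup job that has silently stopped copying.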

    I - ah - spent my last 15 years of professional life doing backup/recovery, large-systems disk management, and disaster-recovery hot-site procedural documentation, written at a level that a non-tech could use. I also specced server upgrades every three years (IBM RS/6000 UNIX systems) and always used the upgrades to move the performance bottleneck to the next subsystem.

    And - to your tagline - I'm either borderline or high-functioning Asperger's syndrome myself.

    Post edited by namffuak on
  • kyoto kid Posts: 41,859

    ...also in all these builds, never forget the value of a rock solid UPS.

  • I got my eye on a 10-core Xeon workstation over at http://www.titancomputers.com/ that's $3.5K, but nothing I'm going to use it for (3Delight rendering, video editing in Premiere Elements) depends heavily on the GPU so I was planning on buying a consumer-grade desktop GPU that only costs a couple hundred bucks.

  • Ivy Posts: 7,165
    kyoto kid said:
     

    My question with Ivy's dream build: when dumping over $36K on a system, why not go totally SOTA with one P5000 (which is priced the same as the M5000) for the displays and 4 P6000s (which are only $300 more than the Maxwell version) for rendering?

    Yes, it's a dream machine; I stated that when I posted it.

     

  • namffuak Posts: 4,409
    kyoto kid said:

    ...also in all these builds, never forget the value of a rock solid UPS.

    What he said! smiley

  • kyoto kid Posts: 41,859
    edited January 2017

    I got my eye on a 10-core Xeon workstation over at http://www.titancomputers.com/ that's $3.5K, but nothing I'm going to use it for (3Delight rendering, video editing in Premiere Elements) depends heavily on the GPU so I was planning on buying a consumer-grade desktop GPU that only costs a couple hundred bucks.

    ...yeah, for the specs I am looking at, it would require the Titan X 499, which comes to about $5,300 (128 GB memory, GTX 1070, 2 TB SATA SSD [runtime/library drive], 2 x 2 TB 7,200 RPM HDDs, Cooler Master case option - I don't care much for those windowed cases, I'd rather have a fan on the left panel).  You don't save much by dropping the primary 1 TB M.2 NVMe drive, which is twice the capacity I need just for the application/boot drive (my build above calls for a 512 GB SSD).

    I can probably get by with under $3,000 to build a similar system myself, though, using the specs I outlined in my post above.

    Post edited by kyoto kid on
  • SickleYield Posts: 7,649

    Moving to Salt Lake City, learning to pickpocket, stealing and duplicating someone's keys (I haven't decided if Kevin or Tony is the better target) and sneaking into the Daz offices to do my renders at night.

    But assuming we're limited to things that are legal and probable, there's no limit to how much I could spend on graphics cards.  My ideal machine would cost more than $10k because I'd probably go for one of the server mobos with two CPUs and then load it up with as much RAM and cards as it could hold, then add an external GPU enclosure with a couple more Quadro cards and a rack of backup hard drives.  I would probably not use liquid cooling because it's hard to service and it scares me, but the roar of the fans would be a real risk to hearing, so I'd probably need to wear noise-canceling headphones to work.

  • Moving to Salt Lake City, learning to pickpocket, stealing and duplicating someone's keys (I haven't decided if Kevin or Tony is the better target) and sneaking into the Daz offices to do my renders at night.

    But assuming we're limited to things that are legal and probable, there's no limit to how much I could spend on graphics cards.  My ideal machine would cost more than $10k because I'd probably go for one of the server mobos with two CPUs and then load it up with as much RAM and cards as it could hold, then add an external GPU enclosure with a couple more Quadro cards and a rack of backup hard drives.  I would probably not use liquid cooling because it's hard to service and it scares me, but the roar of the fans would be a real risk to hearing, so I'd probably need to wear noise-canceling headphones to work.

    Why not simply get a VCA?

  • I'd definitely up the RAM to at least 32 GB, and perhaps more. Save money by using an older case, cannibalizing components from an older machine, etc. That's what my plan is, eventually. I have an IBM Aptiva case that will be built up into a modern rig, with repurposed CD/DVD drives, a floppy drive (yes, totally old school) and then modern innards. Not only will it save a couple of shekels, you'll look totally cool when your 3D art friends see your old box and wonder how in the heck you are able to work on that thing, and how does that old 166 MHz Pentium work so fast? Is it magic?

     

    Also, I'd wait for (Ry)Zen. We're starting to get some non-AMD performance results and they look promising. One preproduction Zen CPU was allegedly pushed to 5 GHz without issues. We'll know more when the OEMs get their first production Zens and start speccing out machines, but I'd wait if possible. Granted, I'm an AMD fanboy (Athlon XP, anyone?), but if the numbers we're seeing are even close to its actual performance, then it will outperform comparable i7s at a lower price point, allowing the buyer to spend that saved money on more memory, etc.

     

    Bob

  • laststand6522732 Posts: 866
    edited January 2017

    I'd definitely up the RAM to at least 32 GB, and perhaps more. Save money by using an older case, cannibalizing components from an older machine, etc. That's what my plan is, eventually. I have an IBM Aptiva case that will be built up into a modern rig, with repurposed CD/DVD drives, a floppy drive (yes, totally old school) and then modern innards. Not only will it save a couple of shekels, you'll look totally cool when your 3D art friends see your old box and wonder how in the heck you are able to work on that thing, and how does that old 166 MHz Pentium work so fast? Is it magic?

    Many will not know what a floppy drive is.  And when they realize you're heating the first floor with the thing, it will give you away anyhow.

    But as for cases, I highly recommend Lian Li boxes.  No plastic, almost no rivets, plenty of space inside, and plenty of accessory brackets available, so they are easy to modify.  My 10-year-old PC60 Plus case is perfectly serviceable and still looks sharp.  They do come up on eBay now and then, but they tend to hold their value.

    I would probably not use liquid cooling because it's hard to service and it scares me...

    You are wise. Water and electricity are not friends. Better to put in enough fans to suck the case across the room.

    Post edited by laststand6522732 on
  • SickleYield Posts: 7,649

    Moving to Salt Lake City, learning to pickpocket, stealing and duplicating someone's keys (I haven't decided if Kevin or Tony is the better target) and sneaking into the Daz offices to do my renders at night.

    But assuming we're limited to things that are legal and probable, there's no limit to how much I could spend on graphics cards.  My ideal machine would cost more than $10k because I'd probably go for one of the server mobos with two CPUs and then load it up with as much RAM and cards as it could hold, then add an external GPU enclosure with a couple more Quadro cards and a rack of backup hard drives.  I would probably not use liquid cooling because it's hard to service and it scares me, but the roar of the fans would be a real risk to hearing, so I'd probably need to wear noise-canceling headphones to work.

    Why not simply get a VCA?

    How will getting a franchised veterinarian help?
  • VCA killed my beloved black kitty, Bear the Wondercat. PLEASE do not mention their accursed name!

     

    Bob

  • Honestly, if I had the money, my current system just needs an upgrade.

     

    I would like to replace the silly little GTX 670 with 2 Maxwell Titans and get 8 more gigs of RAM; this only has 8. My CPU hasn't let me down yet...AMD 965 Black Edition.

    I just purchased a 500 gig SSD this year, so storage isn't an issue; it replaced the tiny 128 gig SSD I had, so now I have the 500 gig SSD as the boot drive, the 128 gig SSD, and a WD 1 terabyte drive. My skimpy amount of Daz content doesn't even put a dent in the WD, which I use just for storage.

    Just my upgrade idea would cost about another $3K. Maxwell Titans are hard to find, and when you do find them, they run about $1,260.
