4 GPU help

Hi, I am planning on building a computer with 4 NVIDIA GPUs and am hoping for some help. Has anyone built one? I am looking for recommendations on motherboards that can accommodate 4 GPUs. I read that it is about more than just having the slots; the spacing between the slots matters too. Any help would be appreciated!

Comments

  • ForceX Posts: 52
    edited May 2017

    Don't forget about your PSU: you will probably want 1,200 to 1,500 watts of power to feed them at full load.

    If you are going 4x you will also need to make sure the CPU you get supports the correct number of PCI Express lanes. If you run all the GPUs in x8 mode, you need 32 lanes for the GPUs plus however many more you need for other devices that require PCIe lanes, such as an M.2 drive (x4); I think the onboard networking requires x1.

    When you choose a motherboard that supports 4-way GPUs, look at how many PCIe lanes it can supply to all cards in a 4-way setup, be it x16, x8, or x4. Add this up along with the other PCIe devices and make your CPU selection based on this number.
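    That lane arithmetic is easy to get wrong once you start adding devices up, so here is a minimal sketch of the budget check. The per-device lane counts below are illustrative assumptions; check the specs of your actual parts.

```python
# Rough PCIe lane budget for a multi-GPU build (a sketch, not exact figures).
def lane_budget(gpu_count, lanes_per_gpu, other_devices, cpu_lanes):
    """Return (lanes_needed, fits) for a CPU with `cpu_lanes` PCIe lanes."""
    needed = gpu_count * lanes_per_gpu + sum(other_devices.values())
    return needed, needed <= cpu_lanes

# Example: four GPUs at x8 each, an M.2 NVMe drive (x4), onboard NIC (x1),
# on an assumed 40-lane CPU.
needed, fits = lane_budget(
    gpu_count=4,
    lanes_per_gpu=8,
    other_devices={"m2_nvme": 4, "onboard_nic": 1},
    cpu_lanes=40,
)
print(needed, fits)  # 37 True
```

    The same devices on a 28-lane CPU would come up short, which is why 40-lane parts keep coming up in threads like this.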

  • linvanchene Posts: 1,386
    edited May 2017

    Afaik there are only a few people on the DAZ3D forum who work with 4-GPU setups.

    Instead of writing a multi-page essay, I will post a link to an 84-page thread on the Otoy forum where people share their builds and discuss the latest technology.

    Best Practices For Building A Multiple GPU System

    https://render.otoy.com/forum/viewtopic.php?f=40&t=43597#p212504

    On the Otoy forum it is customary for users to include their system setup (GPU, mainboard, processor, RAM) in their forum signature.

    -> Just browse around, check signatures and you will get some ideas about popular combinations.

    - - -

    I decided to stick with 3 GPUs: one for the display and two for rendering.

    Win 10 Pro 64bit | Rendering: 2 x Asus GTX 1080 Ti FE | Display: Asus GTX 1080 STRIX A8G |Intel Core i7 5820K | ASUS X99-E WS| 64 GB RAM

     

  • fastbike1 Posts: 4,078
    edited May 2017

    Might be overkill. Based on the experience Mec4D has posted, the 4th GPU may not be adding much to the result.

    Remember 4 GPUs won't render any larger/more elaborate scenes than 1.

  • Toonces Posts: 919

    Yeh, but that's like saying a V8 engine is overkill compared to a V6, when they both have to follow the same highway speed limits. :)

    Personally, I think a V8 and a 4GPU machine are both overkill, but I imagine they'd be fun for a time.

  • Do you mean there is a limit you hit with CUDA cores? I thought the more CUDA cores you have, the better. That is why I was thinking of building a machine with up to 4 video cards. The machine I have now has two 980ti cards. When I went from one card to two, my render time dropped dramatically. It still takes a long time (relatively) to render a darker scene. And if I want to render an animation, that is quite a few renders. I figured a 4 card machine would put Iray animations within reach.

  • Do you mean there is a limit you hit with CUDA cores? I thought the more CUDA cores you have, the better. That is why I was thinking of building a machine with up to 4 video cards. The machine I have now has two 980ti cards. When I went from one card to two, my render time dropped dramatically. It still takes a long time (relatively) to render a darker scene. And if I want to render an animation, that is quite a few renders. I figured a 4 card machine would put Iray animations within reach.

    Iray will use all the CUDA cores you throw at it, as far as I know; what generally happens is you hit a limit on how much scene data can be stored in VRAM, since no matter how many video cards you install Iray only uses the VRAM on one card. If nVidia can work past that to load scene data onto multiple cards it may increase speed.
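    Whatever the exact mechanism, the practical check is whether the scene fits in a single card's VRAM, minus whatever the card driving the monitors gives up to the desktop. Here is a sketch of that check; every number is made up for illustration, and the actual display reserve varies with monitor count and resolution.

```python
# Which cards have room for the scene? The display card loses some VRAM to
# the desktop/OS. All figures are illustrative assumptions, in GB.
def cards_with_room(scene_gb, cards, display_reserve_gb=1.0):
    usable = []
    for name, vram_gb, drives_display in cards:
        free = vram_gb - (display_reserve_gb if drives_display else 0.0)
        if scene_gb <= free:
            usable.append(name)
    return usable

rig = [
    ("980Ti #1", 6.0, True),   # also drives the monitors
    ("980Ti #2", 6.0, False),
]
print(cards_with_room(5.5, rig))  # ['980Ti #2']
```

    With a big scene, the card that drives the monitors can drop out of the render while the dedicated card still fits it, which is one argument for keeping the display on its own GPU.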

  • linvanchene Posts: 1,386
    edited May 2017

    edited: removed off-topic information comparing render engine animation features, since it would just head in a different direction from what the original poster was asking.

    You can still find the most important points in the response by Toonces, the next post below this one.

    - - -

     

  • Toonces Posts: 919

    'You will lose time converting Iray to OctaneRender materials.' - Bummer. Would be interesting to see a side-by-side, however. I.e., the same figure rendered in both Iray and Octane.

    'OctaneRender will use all cuda cores of four installed GPU.' - this isn't an advantage. Daz does this too assuming scene fits in VRAM.

    'cannot animate the material values for the Iray Uber Shader' - now *this* would be cool. There are occasions where I've wanted to animate opacity.

    'no official DAZ Studio tool for batch rendering' - true, but the 'unofficial' tools for this work fine.

  • fastbike1 Posts: 4,078

    You missed my point. There is some evidence from experienced users that the additional render speed benefit from more than 2 GPUs drops off pretty quickly. It's not linear.

    Toonces said:

    Yeh, but that's like saying a V8 engine is overkill compared to a V6, when they both have to follow the same highway speed limits. :)

    Personally, I think a V8 and a 4GPU machine are both overkill, but I imagine they'd be fun for a time.

     

  • fastbike1 Posts: 4,078

    Users of multiple (>2) cards have reported that the speed increase from the third card is significantly less, and from the fourth card even more so.

    Do you mean there is a limit you hit with CUDA cores? I thought the more CUDA cores you have, the better. That is why I was thinking of building a machine with up to 4 video cards. The machine I have now has two 980ti cards. When I went from one card to two, my render time dropped dramatically. It still takes a long time (relatively) to render a darker scene. And if I want to render an animation, that is quite a few renders. I figured a 4 card machine would put Iray animations within reach.

     

  • Toonces Posts: 919
    fastbike1 said:

    You missed my point. There is some evidence from experienced users that the additional render speed benefit from more than 2 GPUs drops off pretty quickly. It's not linear.

     

    I'd be interested in viewing this evidence, even if anecdotal, if you happen to have a link. I checked out Mec4D's 4 card rig:

    It's already a year old and the viewport update is almost instantaneous. I can only imagine it would be faster (and much cheaper) with 4x 1080 Tis (over 14k total CUDA cores vs 12k).
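    For what it's worth, the diminishing-returns argument can be illustrated with a toy model in which each extra card contributes a bit less than the one before it. The 0.9 efficiency factor below is invented for illustration, not a measured Iray figure.

```python
# Toy model of sub-linear multi-GPU scaling: card k contributes 0.9**(k-1)
# of a single card's throughput. Purely illustrative numbers.
def relative_speed(n_cards, efficiency=0.9):
    """Render speed relative to a single card."""
    return sum(efficiency ** k for k in range(n_cards))

for n in range(1, 5):
    print(n, round(relative_speed(n), 2))  # 1.0, 1.9, 2.71, 3.44
```

    Even in this model the 4th card still adds roughly 27% on top of three, so whether it is "worth it" comes down to budget, heat, and power rather than raw scaling alone.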

  • linvanchene Posts: 1,386
    edited May 2017

    update / edit: I found the link again to a blog post that helped me a lot to get a better understanding of multi-GPU builds.

    This blog post is presented in the form of an entertaining discussion between Tom and Smicha. It covers many of the challenges of creating a multi-GPU build.

    http://tomglimps.com/value-based-multi-gpu-build-for-octane-render/

    It was written some time ago, but the conclusions are still valid:

    - Do not skimp on the power supply (use 1500 W)!

    - A rig with water cooling can outperform an air-cooled rig

    etc.

    - - -

     

  • I built a 4 x 980ti system (Asus Rampage V Extreme, i7-5930K) and my best advice is: take the thermals of what you're building very, very seriously.  If I could do it over, hybrid / water-cooled GPUs are the way to go.  I went air-cooled - spacious box (Cooler Master HAF X), lots of fans, played with all kinds of fan profiles and push/pull configs - and was just never comfortable with the temps in the box running all four.  I now render with three and the fourth just drives my monitors.  I've lost a DIMM and a hybrid CPU cooler, and while I can't say for sure, it wouldn't surprise me if those failures were aggravated by thermal stress.

    Outside the box, my PC is basically the equivalent of a noisy space heater when it's rendering.  You can feel the heat pouring out of the thing, and I have to turn up the volume on whatever I'm listening to because of the fan noise.  All that said, it is a faster render experience.  You can pan and zoom the viewport in Iray and it's pretty responsive, but change the scene / lighting and there's a delay reloading.  To keep the space heater from running, in practice, I still do as much as possible in texture shaded mode.

  • JCThomas Posts: 254

    I don't have any numbers to back this up, and I rarely use Iray, but I've seen little performance drop with 5 GPUs in my previous rig in Octane. I was on a Z97 Classified motherboard and a 4790K, with 1 Titan Black, 2 x Titan X Maxwell, and 1 Titan Z (which is a dual GPU). Performance in Octane scaled almost perfectly. If I remember correctly, my Octane bench score with this system was mid-500. Unfortunately, that system only supported 32 GB of RAM, so I moved up to X99.

    @AGnawKneeMoose is right though, you need to do a lot of research into your cooling. The easiest thing to do is to go all hybrid coolers. Find a case that supports three 120 mm radiators up front and one 120 mm radiator in the rear. You could get away with your bottom card being air-cooled.

     

  • What about an open case like a Thermaltake P5 core case?
  • PA_ThePhilosopher Posts: 1,039
    edited May 2017

    Ok, this is going to be a long post. But I feel it is an important topic, especially for those wanting to get into animation.

    I have been using a quad 780Ti setup for over a year now. And trust me when I say, it is worth it, especially if you want to animate (Imagine rendering each frame in 20-30 seconds. That is my life). Most importantly, viewport feedback is instantaneous. And with the latest Iray updates, it is actually real-time (in other words, you don't see the grey shaded image while orbiting/panning in the viewport. You actually see real-time render when you move the camera, even in large scenes). Of course, you need water.

    As for gains... I ran tests when I installed each new card, and while the gains weren't proportional (they gradually decrease with each additional card), they were still massive (note, they ARE proportional in Octane, but not quite proportional in Iray). As long as you keep things cool---namely, use water with a good pump and a large radiator---then the gains will be huge. But again, water is a must. A quad system on air is a waste of resources, so don't even try it. Some things to remember:

    1. CPU Lanes: The CPU must have 40 lanes (a Core i7 is ideal... I have the older Haswell line, but there may be newer CPUs with 40 lanes).
    2. CPU cores/overclock: The more cores on the CPU the better... While the GPUs do all the work, the CPU enables them to operate optimally. A good CPU is like a shot of adrenaline to the GPUs, so overclocking helps big time (my CPU is overclocked to 4.5 GHz). Don't overclock your GPUs though.
    3. Motherboard: The motherboard must support 4 PCIe slots at x8 each minimum (don't skimp on the motherboard; buy the best you can. I recommend the Asus Rampage series... although I have the EVGA X99). It should be Extended ATX.
    4. Power Supply: The PSU should be 1500 watts (EVGA and Corsair are the best). Although newer cards don't require as much power, you still need to factor in a large buffer for power spikes, which can wear out a lower-end PSU over time. Plus, it has to have enough PCIe power connections for all four cards (remembering that some GPUs require two 8-pin connectors, potentially requiring up to eight 8-pin connectors total). Typically, only 1500 W PSUs have enough with room to spare.
    5. RAM: A minimum of 32 GB of RAM is a good rule of thumb.
    6. SSD: An SSD is recommended.
    7. Case: The case needs to support Extended ATX motherboards.
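    Point 4 is really just back-of-envelope arithmetic. Here is a sketch with illustrative wattages; none of these are measured figures, so substitute your own cards' TDP and whatever spike headroom you are comfortable with.

```python
# Back-of-envelope PSU sizing: full-load draw plus a buffer for power spikes.
# All wattages are illustrative assumptions.
def psu_watts(gpu_tdp, gpu_count, cpu_tdp, base_system=150, headroom_pct=15):
    load = gpu_tdp * gpu_count + cpu_tdp + base_system
    return load + load * headroom_pct // 100

# e.g. four ~250 W cards plus an overclocked ~140 W CPU and ~150 W for
# drives, fans, pump, and board:
print(psu_watts(250, 4, 140))  # 1483 -> a 1500 W unit is the sensible class
```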

    The biggest learning curve for me was water cooling, as the tutorials online are not very clear. Let me know if you have any questions about that, such as radiator size, tubing, etc. For my setup, I went with a custom loop for my GPUs and an all-in-one prefab for my CPU (the Corsair H110). Here are a few things to remember:

    1. PUMP: The D5 pump is pretty much industry standard. You don't need more than one, since it is so powerful. (A variant of the D5 is the MCP655. This is the one I have; it has a variable speed adjuster on it, and I have it turned down to below 50% power.) Note, I use a separate water cooler for my CPU (the Corsair H110, which is an "all-in-one" unit).
    2. RADIATOR: A large radiator for your GPUs is mandatory. Mine holds three 140 mm fans. Here is the link. Use a separate radiator for your CPU, so you can tune your fans separately (I use the NZXT Grid+ so I can customize my fan profiles, but some people don't worry about that and just run their fans at max always).
    3. TUBING: Tubing size doesn't really matter. I prefer 3/8" but some use 1/2".
    4. WATERBLOCKS: Stick to EK waterblocks. They are well-established and reputable.
    5. FITTINGS: Fittings are a matter of preference. I use the cheaper barb fittings rather than compression.
    6. With this setup, my GPU temps are 27 degrees at idle, and at load they hardly move into the 50s.
    What about an open case like a Thermaltake P5 core case?

    The case needs to support Extended ATX motherboards, plus a pump, radiators, and a reservoir. This is the case I use (see image below). It's cheap, big, and easy to work on, since the board lies flat... plenty of space inside.

    https://www.newegg.com/Product/Product.aspx?Item=N82E16811133275

  • Toonces Posts: 919

    I have a quad 780Ti setup. And trust me, it is worth it. I ran tests when I installed each new card, and while the gains weren't proportional, they were still significant (I could never go back to anything less, as it frees up my workflow 10-fold). As long as you keep things cool---namely, use water not air---then the gains will still be significant. Note, a quad system on air is a waste of resources, so don't even try it.

    Yeh, I didn't think it would be linear, but as long as there's a 20% boost from adding a 4th GPU, it seems worthwhile, assuming it's affordable (and it gets more affordable with these new, less expensive monster cards like the 1080ti).

    I'm not sure about your comment regarding 'air'. I have 2x GTX 1080s and they rarely go over 75 Celsius. I can't imagine adding 2 more cards would kick it over 90 degrees. I think the older cards (e.g., 780Ti) simply ran hotter, at least from what I've read, necessitating water or hybrid cooling.

    I'd be tempted to stick with 2 slots and buy Nvidia's answer to the Radeon Pro Duo (2 GPUs on 1 card), assuming they create something cheaper than the Titan Z, or just wait till next year when they come out with new lines of inexpensive 5k-CUDA cards.

  • PA_ThePhilosopher Posts: 1,039
    edited May 2017

    I figured a 4 card machine would put Iray animations within reach.

    Yes. Yes. And yes. Being able to design and sell animation products is made possible by my quad system. Without it, I could never release the products I am making, because each product requires me to compose a promo reel, which can be up to 2-3 minutes of animation (somewhere in the neighborhood of ~3,000 frames). With a quad system, I am able to render each frame in maybe 15-30 seconds, depending on the product. So I can sleep while it renders and it will be done the next day.
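    The arithmetic behind that overnight turnaround, as a quick sketch:

```python
# Total render time for an animation batch; numbers from the post above
# (~3,000 frames at 15-30 seconds per frame).
def batch_hours(frames, secs_per_frame):
    """Wall-clock hours to render `frames` frames sequentially."""
    return frames * secs_per_frame / 3600

for secs in (15, 30):
    print(secs, "s/frame ->", batch_hours(3000, secs), "hours")
# 15 s/frame -> 12.5 hours; 30 s/frame -> 25 hours: an overnight-to-a-day job.
```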

    Keep in mind though, I'm using older 780 Ti cards, which, performance-wise, fall somewhere between a 1070 and a 1080. A quad 1080Ti setup will probably be at least 50% faster than mine. Maybe even more. (The overclock on the 10xx series is substantial.)

    -P

  • Thanks for the great information. I am running dual monitors now. If I go with a quad GPU build, how much of a performance hit do I take running the monitors from one of the GPUs? Would it make sense to use a lower-end card just to drive my monitors and use 3 GPUs for rendering?
  • PA_ThePhilosopher Posts: 1,039
    edited May 2017

    You should be fine as long as they are not 4K monitors; those tend to bog down the GPU a little too much. I had dual 1440p monitors for a while and the slowdown in rendering times was only slight.

     

  • Just for a visual reference, here is a little video I just put together showing the viewport performance on my quad setup:

     

  • linvanchene Posts: 1,386
    edited May 2017

    Off-topic enquiry:

    Just for a visual reference, here is a little video I just put together showing the viewport performance on my quad setup:

     

    I am trying to gather some feedback on which Nvidia Iray viewport drawing settings people use.

    https://www.daz3d.com/forums/discussion/166886/nvidia-drawing-style-which-settings-for-real-time-viewport#latest

    Do you have time to share the drawing settings you used when creating that video?

     

  • That was impressive. Going back to the dual monitors, should I run each monitor from a separate card, or both from the same? Does it matter? I started looking at components for a 4 card build, and appreciate the info you provided, especially about the CPU. I didn't realize the lanes of the CPU were that important. I am looking at a minimum of $600 for a 6850 CPU. Glad I found that out now before I bought a lesser CPU.
  • Any recommendations on motherboards that have the necessary lanes and slots spaced out for four GPUs?
  • linvanchene Posts: 1,386
    edited May 2017

    Going back to the dual monitors, should I run each monitor from a separate card, or both from the same? Does it matter?

    For working:

    @ Connect all monitors to one card

    It matters insofar as when you connect a monitor to a GPU, part of the VRAM is reserved for driving the monitor and other software.

    If all monitors are handled by the same card, the other GPUs can be used for rendering without interference.

    - - -

    @ manually set which software is using the display card

    Other software that uses GPU support, like Photoshop, may not always select the display GPU or the rendering GPUs the way you want.

    -> Use the Nvidia control panel to manually set which software should use which GPU

    Example: you want to render with 3 GPUs in Iray and edit images with the display GPU in Photoshop...

    In Windows 10 you can find the Nvidia Control Panel

    - in the category Control Panel\Hardware and Sound

    - or by right-clicking on the desktop

    - then under Manage 3D Settings / Program Settings

    - - -

  • PA_ThePhilosopher Posts: 1,039
    edited May 2017
    Any recommendations on motherboards that have the necessary lanes and slots spaced out for four GPUs?

    The motherboard will depend on the socket type of your CPU. As far as I know (someone please correct me if I am wrong), there are very few socket types that support 40-lane CPUs. The only ones I am aware of are LGA 2011 and LGA 2011-v3 (Haswell and Broadwell).

    So for example, if you go with an i7-6850K (Broadwell-E 6-Core 3.6 GHz), which is an LGA 2011-V3 socket type, then you would need to buy a motherboard that is made for the LGA 2011-V3 and is Extended ATX (see list here).

    As for motherboard brands, many of these should suffice (just be sure to check the specifications for something like "x16/x8/x8/x8 mode with 40-LANE CPU"). I own the EVGA X99 FTW, which has served me fine. However, if I had to do it over again, I would probably go with the Asus Rampage Extreme line instead, as they sell a much higher volume of those motherboards and thus they are more established and tested. So for example, something like this.

    should I run each monitor from a separate card, or both from the same? Does it matter? 

    When I ran my dual monitor setup, I plugged each monitor into a separate card. But I'm not sure it really matters all that much. 

     

    I am trying to gather some feedback which Nvidia Iray viewport drawing settings people use.

    https://www.daz3d.com/forums/discussion/166886/nvidia-drawing-style-which-settings-for-real-time-viewport#latest

    Do you have time to share the drawing settings you used when creating that video?

     

    Done. 

  • Not sure if others have hit this problem (Rampage V Extreme, but likely similar for other MBs), but another issue I ran into was the cables/headers on the edge of the motherboard physically bumping into the heatsink of the 4th GPU.  Eventually added a PCIe riser card for clearance to get it to seat.  Another thing to plan for with the Rampage V Extreme is the number of internal USB ports you'll need.  I had to add a hub to connect to the CPU cooler, power supply, front panel temperature display gizmo, etc.  The small things you don't think of that put your build on hold...

  • PA_ThePhilosopher Posts: 1,039
    edited May 2017

     

    Not sure if others have hit this problem (Rampage V Extreme, but likely similar for other MBs), but another issue I ran into was the cables/headers on the edge of the motherboard physically bumping into the heatsink of the 4th GPU.  Eventually added a PCIe riser card for clearance to get it to seat.  Another thing to plan for with the Rampage V Extreme is the number of internal USB ports you'll need.  I had to add a hub to connect to the CPU cooler, power supply, front panel temperature display gizmo, etc.  The small things you don't think of that put your build on hold...

    This shouldn't be a problem with EK waterblocks, since their profile is so slim... certainly much slimmer than the stock air cooler (which should never be used in a quad setup).

  • For the Asus Rampage MB, is there plenty of clearance with the PCI slots to fit four GPUs with space between them?
  • PA_ThePhilosopher Posts: 1,039
    edited May 2017
    For the Asus Rampage MB, is there plenty of clearance with the PCI slots to fit four GPUs with space between them?

    Yes. But as with all motherboards that support 4 GPUs, the fit is tight with the stock air coolers. There are only a few millimeters of clearance between the cards in their stock form, which is why you are forced to go with water. The heat generated by 4 GPUs is so tremendous that air cooling will not be enough to dissipate it from 4 cards stacked together like sardines.

    Here is a photo of a quad setup on air. A big no-no;

    And here is one on water... same spacing between cards;

    -P
