Second GPU

Comments

  • Gator Posts: 1,319
    ebergerly said:

    Darn. $770 for the highly rated MSI GTX 1080ti on newegg. 

    So outrider, did you mean the 1070ti is almost a 1080, or almost a 1080ti? 'Cuz I don't think a 1080 is really that much faster than a 1070, is it? I don't recall any benchmarks with the 1080 on Sickleyield's scene.

    From what I read, almost a 1080.  Eh, seems like they are starting to get too many SKUs.

  • outrider42 Posts: 3,679

    Yes, the 1070ti will be nearly a 1080. The 1080 is faster than a regular 1070; it has a solid CUDA core advantage. You can overclock a 1070 and potentially come very close to, or even match, some 1080 marks, but as I said before, overclocking for something as demanding as Iray is not recommended (at least not too much). The 1070ti exists thanks to AMD's Vega 56. Vega 56 is beating the 1070 in most gaming scores, so Nvidia is countering with the 1070ti. They are locking its clock speed so as to keep the 1070ti from cannibalizing the 1080, since the locked clock will prevent most cards from beating 1080 performance. So we can thank AMD for the 1070ti.

    Here is a big table, if it will work in the forum. Let's see.

    | Graphics Card Name | GTX 1050 | GTX 1050 Ti | GTX 1060 3 GB (GP106) | GTX 1060 3 GB (GP104) | GTX 1060 6 GB | GTX 1070 | GTX 1070 Ti | GTX 1080 | Titan X | GTX 1080 Ti | Titan Xp |
    | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
    | Graphics Core | GP107 | GP107 | GP106 | GP104 | GP106 | GP104 | GP104 | GP104 | GP102 | GP102 | GP102 |
    | Process Node | 14nm FinFET | 14nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET | 16nm FinFET |
    | Die Size | 132mm² | 132mm² | 200mm² | 314mm² | 200mm² | 314mm² | 314mm² | 314mm² | 471mm² | 471mm² | 471mm² |
    | Transistors | 3.3 Billion | 3.3 Billion | 4.4 Billion | 7.2 Billion | 4.4 Billion | 7.2 Billion | 7.2 Billion | 7.2 Billion | 12 Billion | 12 Billion | 12 Billion |
    | CUDA Cores | 640 | 768 | 1152 | 1152 | 1280 | 1920 | 2432 | 2560 | 3584 | 3584 | 3840 |
    | Base Clock | 1354 MHz | 1290 MHz | 1506 MHz | 1506 MHz | 1506 MHz | 1506 MHz | 1607 MHz | 1607 MHz | 1417 MHz | 1480 MHz | 1480 MHz |
    | Boost Clock | 1455 MHz | 1392 MHz | 1708 MHz | 1708 MHz | 1708 MHz | 1683 MHz | 1683 MHz | 1733 MHz | 1530 MHz | 1583 MHz | 1582 MHz |
    | FP32 Compute | 1.8 TFLOPs | 2.1 TFLOPs | 4.0 TFLOPs | 4.0 TFLOPs | 4.4 TFLOPs | 6.5 TFLOPs | 8.1 TFLOPs | 9.0 TFLOPs | 11 TFLOPs | 11.5 TFLOPs | 12.5 TFLOPs |
    | VRAM | 2 GB GDDR5 | 4 GB GDDR5 | 3 GB GDDR5 | 3 GB GDDR5 | 6 GB GDDR5 | 8 GB GDDR5 | 8 GB GDDR5 | 8 GB GDDR5X | 12 GB GDDR5X | 11 GB GDDR5X | 12 GB GDDR5X |
    | Memory Speed | 7 Gbps | 7 Gbps | 8 Gbps | 8 Gbps | 9 Gbps | 8 Gbps | 8 Gbps | 11 Gbps | 10 Gbps | 11 Gbps | 11.4 Gbps |
    | Memory Bandwidth | 112 GB/s | 112 GB/s | 192 GB/s | 192 GB/s | 224 GB/s | 256 GB/s | 256 GB/s | 352 GB/s | 480 GB/s | 484 GB/s | 547 GB/s |
    | Bus Interface | 128-bit | 128-bit | 192-bit | 192-bit | 192-bit | 256-bit | 256-bit | 256-bit | 384-bit | 352-bit | 384-bit |
    | Power Connector | None | None | 1x 6-pin | 1x 6-pin | 1x 6-pin | 1x 8-pin | 1x 8-pin | 1x 8-pin | 8+6 pin | 8+6 pin | 8+6 pin |
    | TDP | 75W | 75W | 120W | 120W | 120W | 150W | 180W | 180W | 250W | 250W | 250W |
    | Display Outputs | 1x DP 1.4, 1x HDMI 2.0b, 1x DVI | 1x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b, 1x DVI | 3x DP 1.4, 1x HDMI 2.0b | 3x DP 1.4, 1x HDMI 2.0b |
    | Launch Date | October 2016 | October 2016 | September 2016 | TBD | 13th July 2016 | 10th June 2016 | 26th October 2017 | 27th May 2016 | 2nd August 2016 | 10th March 2017 | 6th April 2017 |
    | Launch Price | $109 US | $139 US | $199 US | $199 US | $249 US | $349 US | ~$429 US | $499 US | $1200 US | $699 US | $1200 US |
  • JamesJAB Posts: 1,766

    If you are rendering scenes that are coming close to filling up your 8GB card....   Yes, get the 1080ti.

    Here's an observation I've made on my system after adding a GTX 1080 Ti. According to MSI Afterburner, here is my GPU memory usage right now with lots of things open:
    GTX 1080 Ti hooked up to one screen @ 3840x2160: 1092MB used
    GTX 1060 6GB, no connected screens, set as the dedicated PhysX card in the Nvidia Control Panel: 80MB used

    The 1060 stays at 80MB at all times unless it's being used for Iray rendering or as a PhysX processor in a game.
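
    If you want to log those numbers without Afterburner, nvidia-smi reports the same per-GPU memory figures. Below is a minimal C# sketch that shells out to it; the query flags are standard nvidia-smi options, but treat the exact output format (and therefore the parsing) as an assumption that may vary with driver version.

    ```csharp
    using System;
    using System.Diagnostics;

    class GpuMemoryCheck
    {
        static void Main()
        {
            // Ask nvidia-smi for each GPU's name and memory usage as plain CSV (values in MiB).
            var psi = new ProcessStartInfo
            {
                FileName = "nvidia-smi",
                Arguments = "--query-gpu=name,memory.used,memory.total --format=csv,noheader,nounits",
                RedirectStandardOutput = true,
                UseShellExecute = false
            };

            using (var proc = Process.Start(psi))
            {
                string output = proc.StandardOutput.ReadToEnd();
                proc.WaitForExit();

                // One line per installed GPU, e.g. "GeForce GTX 1080 Ti, 1092, 11264"
                foreach (var line in output.Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries))
                    Console.WriteLine(line.Trim());
            }
        }
    }
    ```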

  • Takeo.Kensei Posts: 1,303

    If you don't need it, then don't buy it and send me the money :) I'll make good use of it.

    More seriously, a graphics card isn't only for playing games or rendering with DS. There are other fields, like AI, that could push you to buy one. You just have to find what would push you.

    DAZ just introduced dForce, which could also benefit from the GPU, so big simulations are another possibility.

    The fact that you don't think you need it also shows that you may not be rendering big enough scenes. So make bigger scenes and render more.

    I'm thinking of buying one too, but I've mostly been using 3Delight so I don't really feel the need for it. And I've been too busy to find time to dabble in AI, but I know I eventually will, and then I'll buy one (or a Volta).

    One other thing that will push me to upgrade hardware is Blender Eevee when it's out. I've tested a few scenes, and having big, complicated scenes in real time will certainly be a pleasure.

    Otherwise, if you don't really have an interest in upgrading, spend the money elsewhere on something that will really serve you (ZBrush, Photoshop, a Wacom tablet, or whatever).

     

  • ebergerly Posts: 3,255

    I can't believe you guys are forcing me to spend almost 800 bucks on a card I don't really need, against my will!!!!! :)

    This is so bad...

    But I do'ed it. I just bought an MSI 1080ti from Newegg. And go figure... I got a $60 discount for some reason. Cool. Total damage just under $720. Should have it by Friday.

    But I was very strict with myself. As takeo.kensei suggested, I told myself: if you're going to waste your money on something you don't need, then make sure you need it. And I promised myself I'll start learning how to develop C# apps that use the GPU, instead of the multithreaded-CPU stuff I've done in the past.

    You guys are so bad :)

  • ebergerly said:
    I just bought an MSI 1080ti from newegg. And go figure....I got a $60 discount for some reason. Cool. Total damage just under $720. Should have it by Friday.

    Pretty decent deal in this market. Hope you have fun with it, I sure am loving mine.

  • ebergerly Posts: 3,255

    Oh, I see why the $60 discount. They give you a "free" game worth $60, but that $60 is included in the price of the card on the website. And then when you check out they deduct the $60 to make you feel like you got a deal.

    It worked. I feel like I got a deal. :)

  • ebergerly Posts: 3,255

    Oh wait, they charged me $60 for the game, but it's a separate charge on the credit card. But I don't want the game. But it's a package, I guess. Damn.

    I wonder if I can get the $60 back. Anyone know how newegg works on stuff like this?

  • I would talk to their customer service. https://help.newegg.com/contactus

    But I've never had to return or cancel anything through them.

  • ebergerly Posts: 3,255

    Yeah, I emailed them with all the info. We'll see what they say. Thanks.

  • ebergerly Posts: 3,255

    Okay well my new GTX 1080ti will arrive tomorrow. And since I told myself I MUST find a use for it, I started researching the available libraries for Visual Studio so I can write some C# software that uses the GPU. 

    And I found a free package called Alea GPU which seems to be just what I need. It takes away the device-level programming and sounds like it makes life a lot easier. In fact they have a Parallel.For-style method that runs directly on the GPU, just like the regular C# Parallel.For uses the CPU threads.

    I'm curious, are there any software developers out there who know if this is the best bet for CUDA development with C#/Visual Studio? I saw some others, but this seems like the one most tailored to VS and C#. (A rough sketch of its parallel-for style is below.)

    Thanks. 
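
    For anyone curious, the GPU parallel-for that Alea advertises looks roughly like the sketch below. This is based on Alea GPU's published samples rather than something I've verified against the current package, so treat the namespaces, the [GpuManaged] attribute, and the Gpu.Default.For signature as assumptions taken from their documentation.

    ```csharp
    using System;
    using System.Linq;
    using Alea;           // Gpu, GpuManaged
    using Alea.Parallel;  // the For() extension method on Gpu

    class AleaParallelForSketch
    {
        // [GpuManaged] asks Alea to handle host <-> device memory transfers automatically.
        [GpuManaged]
        static void Main()
        {
            const int n = 1000000;
            var a = Enumerable.Range(0, n).Select(i => (float)i).ToArray();
            var b = Enumerable.Range(0, n).Select(i => (float)(2 * i)).ToArray();
            var result = new float[n];

            // GPU-side parallel-for: the lambda is compiled to a CUDA kernel,
            // with one index handled per GPU thread.
            Gpu.Default.For(0, n, i => result[i] = a[i] + b[i]);

            Console.WriteLine(result[10]); // expect 30
        }
    }
    ```

    The shape deliberately mirrors the CPU-side System.Threading.Tasks.Parallel.For, which is presumably the appeal here.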

  • Gator Posts: 1,319

    No idea, but if you're looking for uses for it, it is a kick-azz gaming card.

    I was surprised. The new Battlefront 2 beta was a week or two ago, and it didn't support SLI. That was OK, as it ran 60 FPS all the time at 4K on a single 1080 Ti (EVGA Hybrid).

  • ebergerly Posts: 3,255
    edited October 2017

    Okay, so I finally installed the beast. The monster. So now I have a GTX 1070 in parallel with a GTX 1080ti. 

    Surprisingly, it went without a hitch. Luckily my power supply came with a second set of cables (2 x 8-pin) for the cards, so I just plugged it in and fired it up. The GTX 1070 uses only one 8-pin power connector, and the 1080ti uses two, BTW. It was immediately recognized, probably because I already have a GTX 1070 installed and it uses the same drivers.

    GPU-Z showed the second card, and I loaded up Studio; it automatically recognized the second card and enabled it, so I rendered a scene. Maximum power draw from the wall for my entire computer (Ryzen 7 1700, 3 x 27" monitors, 64GB RAM) while rendering with both cards at 99% utilization was around 380 watts. My PSU is 750 watts, so those who have expressed concern over needing more power might want to rethink their position (rough numbers below).

    I haven't done any benchmark testing yet, but that will come next. My big hope is that it will make a huge improvement in 3D viewport Iray responsiveness.

    BTW, I did notice that with both cards rendering, the noise level is as low as with only the GTX 1070. So that's nice.
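
    For what it's worth, a back-of-the-envelope check against the TDPs in the table earlier in the thread points the same way (the 65 W figure is the Ryzen 7 1700's stock TDP; the ~50 W for motherboard, RAM, and drives is a rough assumption):

        250 W (GTX 1080 Ti) + 150 W (GTX 1070) + 65 W (CPU) + ~50 W (rest of system) ≈ 515 W worst case

    That's comfortably inside a 750 W supply, even before allowing for the fact that rendering doesn't necessarily hold both cards at their full TDP, which is consistent with the ~380 W measured at the wall.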

    Post edited by ebergerly on
  • ebergerly Posts: 3,255

    Wow. I did a quick render, and with only the GTX 1070 I got 5 minutes, and with both the 1070 and 1080ti I got only 2 minutes. That's a 60% improvement. Nice. 

    So that's $760 for a 60% improvement, or $12 per % improvement. And I was hoping to stay under $10 or so. Whatever. 

  • bluejaunte Posts: 1,990
    edited October 2017

    Sell the 1070 and that per % improvement will skyrocket.

    Post edited by bluejaunte on
  • ebergerly Posts: 3,255

    Sell the 1070 and that per % improvement will skyrocket.

    I'm trying to figure what you mean...

    The $12 per % improvement is relative to just a 1070. Adding a 1080ti gives $12 per % from just the 1070.

    So if I sell the 1070 I'm just back to a $760 1080ti. So there's no % improvement relative to anything. Yeah, I gain $400 or whatever, but I lose the contribution of the 1070. So the only comparison is to using the CPU or something? 

  • Nice!
  • ebergerly said:

    Sell the 1070 and that per % improvement will skyrocket.

    I'm trying to figure what you mean...

    The $12 per % improvement is relative to just a 1070. Adding a 1080ti gives $12 per % from just the 1070.

    So if I sell the 1070 I'm just back to a $760 1080ti. So there's no % improvement relative to anything. Yeah, I gain $400 or whatever, but I lose the contribution of the 1070. So the only comparison is to using the CPU or something? 

    I think the suggestion is that you try rendering with just the 1080Ti - some people have reported that rendering with a slower-clocked card drops the clock speed of the faster card, though as I recall others have reported not seeing this.

  • bluejaunte Posts: 1,990
    ebergerly said:

    Sell the 1070 and that per % improvement will skyrocket.

    I'm trying to figure what you mean...

    The $12 per % improvement is relative to just a 1070. Adding a 1080ti gives $12 per % from just the 1070.

    So if I sell the 1070 I'm just back to a $760 1080ti. So there's no % improvement relative to anything. Yeah, I gain $400 or whatever, but I lose the contribution of the 1070. So the only comparison is to using the CPU or something? 

    Haha, guess I misunderstood. The % improvement would obviously be the 1080 Ti vs the 1070. With the high prices 1070s are fetching, I'm pretty sure the performance gained from switching to a 1080 Ti, relative to the net money spent (after selling the 1070), would cost fewer dollars per % improvement than you have now.

  • Gator Posts: 1,319

    Basically what bluejaunte said.

    Thanks to mining, mid-range cards are expensive. GTX 1070s are selling for a good price used. You can probably get what you paid for it, or more. Then all you're in for on the 1080 Ti is its purchase price minus what you got for the 1070.

  • ebergerly Posts: 3,255

    Some more render results...

    I did a longer render, and with just the 1070 it took 38 minutes, 45 seconds. 

    With both the 1070 and 1080ti it took 14 minutes.

    That's an improvement of about 64% (the arithmetic is spelled out below), which is almost identical to the 66% improvement reported by others for the same pair of cards in the Sickleyield benchmark scene.

    So for those who have doubts about the validity of the Sickleyield benchmark, I think you can rest easy. Using a small scene doesn't mean it fails to reflect what's going on in larger scenes, and I think that's because we're not talking about absolute times, we're talking about relative times between two GPU configurations.

    Now if someone has data that contradicts all that, and shows that Sickleyield's scene isn't representative, then please feel free to post what you have.
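
    (For clarity, the 64% figure is the reduction in wall-clock render time: (38.75 min − 14 min) / 38.75 min ≈ 0.64, i.e. the two-card setup is roughly 2.8x faster than the 1070 alone.)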

  • ebergerly Posts: 3,255

    By the way, I'm not convinced that installed GPU power has a very significant effect on the responsiveness of the 3D View in Iray mode. Yes, the GPUs render the Iray 3D view, but as you manipulate the view I see the utilization of the GPUs drop and the CPU utilization increase. Not by a huge amount, but it's noticeable. For example, when I start to manipulate the view my CPU utilization jumps from maybe 10-15% to 20-25%, and utilization for both GPUs drops from 100% to maybe 70-80%. Maybe there's a big transfer of data back and forth to DAZ Studio's UI as the GPUs calculate, and that's what uses the CPU power, but I really don't notice much (if any) improvement in responsiveness from adding the 1080ti to the existing 1070.

    I think the other settings that have been mentioned do FAR more to improve responsiveness in Iray mode. But with or without the addition of the 1080ti my responsiveness is virtually immediate as long as the settings are optimized. 

  • bluejaunte Posts: 1,990

    You are using the 1080 Ti as the main card, though? The normal viewport isn't going to use both cards.

  • ebergerly Posts: 3,255

    Not sure what you mean by the "main". I have both cards enabled to do Iray Photoreal, no CPU, and both cards are at 99% when the 3D View is in Iray mode. And like I said, both drop to 70-80% as I manipulate the 3D View.

    I do have my 3 monitors connected to the 1070 if that's what you mean. 

  • bluejaunte Posts: 1,990

    Yeah so pretty sure the viewport still runs on your 1070 only. You need to swap.

  • bluejaunte Posts: 1,990

    I mean, don't get me wrong, I don't think you'll see massive gains. That viewport frankly never felt very optimized to me. It seems to slow down not only with the number of polys but also with the number of items in the scene, or something. Still, I'd swap; there's no reason for your 1070 to be the main card. Any realtime OpenGL/DirectX viewport or game is going to run on whatever card the monitors are connected to.

  • ebergerly Posts: 3,255

    Below is a screenshot of both GPU-Z windows showing my 1070 and 1080ti while Studio is just sitting there with the viewport in Iray mode. As you can see, they are both running flat out at 99% as they both render the viewport. I'm not sure why you're saying that only one card is rendering the viewport. And when I start manipulating the view they both drop in unison to 70-80%, but they're both still rendering the viewport together.

    [Attachment: GPUs.PNG]
  • bluejaunte Posts: 1,990
    edited October 2017

    I thought we were talking about the non-Iray, raw viewport performance.

    Ah, my bad, you said '3D View in Iray mode', so... well, maybe try connecting the monitors to the 1080 Ti and test what that does? What happens to VRAM, and is Iray Preview just as slow? It should at least be interesting to have some first-hand experience.

    Post edited by bluejaunte on
  • ebergerly Posts: 3,255

    Clearly, with both cards enabled, the actual "render time" of the 3D View is much shorter than with only the 1070, which means both are contributing to the viewport render.

    My guess is that when you manipulate the view, things change. Now you have a lot of dynamics to update in the Studio user interface in real time, whereas before you had a static image and were just updating it. As you rotate and move your view there's a lot more changing, so maybe there's a lot more communication between the GPUs and the CPU, and maybe main RAM. And maybe THAT is the limiting factor in how responsive the viewport is as you rotate and such. Now you have to worry about the PCIe bus, the CPU, the memory bus, and so on. Maybe all of that is the bottleneck for viewport manipulation responsiveness.

    Heck, I don't know.....

  • bluejaunte Posts: 1,990

    Yeah, I think the responsiveness is definitely linked to the raw viewport performance. If you have a complex scene and try to move during Iray Preview, it's basically near impossible. Meanwhile, with a near-empty scene it's almost realtime. So if you can make the viewport faster with the 1080 Ti, then moving in Iray Preview may also be faster. All I can say is, with my 1080 Ti it's still often near impossible to move around during Iray Preview; there's definitely a lot of slowdown that seems to have nothing to do with the video card and more to do with just the UI, CPU, or such. It's a bit like how the FPS drops dramatically when you link the eyes to the camera.
