Power consumption of RTX 3060 and 3080 - Which to buy for Iray? 3070?


Comments

  • Ghosty12 Posts: 2,080
    edited March 2021

    ebergerly said:

     Looks like the RTX 3080 and 3080 Ti are rated at around 320 watts, the 3070 around 220 watts, and the 3060 around 170 watts. When running Iray I'm guessing the 3080 and 3070 would draw less than 300 and 200 watts respectively, and that varies as your render proceeds; it bounces all over the place. If you add up all of your computer components' power draws, and allow for the fact that your CPU (the second largest consumer of power) probably isn't running anywhere near its maximum at the same time your GPU is rendering, you'll probably be hard pressed to get your entire computer drawing over 400 watts with either of those GPUs.

    Personally, I think an NVIDIA recommendation of 750 watts is somewhat overkill for this. There's virtually zero noticeable benefit to running your power supply at 50%, since the difference in efficiency (wasted power in the power supply) is negligible. It might save you a few $$ in power costs a year. You'd be better off turning off unused light bulbs.

    I have a GTX 1080 Ti (rated 250 watts) running alongside an RTX 2070 Super (215 watts), and during Iray renders the entire computer takes less than 450 watts from the wall outlet. That includes both GPUs, the CPU, 3 fans, the power loss in the 750 watt power supply, and everything else. So the total rating of both GPUs is 465 watts, but the entire computer draws less than 450 peak during an Iray render. No matter how hard I tried I could never get anywhere near the power supply's rating of 750 watts. Something over just 500 watts would be fine.

    Now of course you can justify buying as big a power supply as you want based on possible future upgrades. But as others have said, I'd worry far less about getting a big power supply and more about getting more VRAM, especially if you're like many and enjoy building big scenes with a lot of components. Also, keep in mind you'll want system RAM of at least 2 or 3 times your GPU's VRAM, since scenes use a ton of system RAM and are then optimized down to a much smaller set of data that is sent to the GPU to do the rendering. But it looks like you're covered with the planned 64GB.
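    As a rough illustration of that back-of-the-envelope math, here is a minimal Python sketch; every wattage figure in it is an illustrative guess rather than a measured value, and the 2-3x RAM figure is just the rule of thumb quoted above.

    ```python
    # Back-of-the-envelope power budget, roughly mirroring the math above.
    # Every number here is an illustrative guess, not a measurement.
    components = {
        "GPU (RTX 3080 class, ~320 W rated)": 320,
        "CPU (rarely maxed while the GPU renders)": 100,
        "Motherboard, RAM, SSDs, fans": 60,
    }

    load_watts = sum(components.values())
    psu_rating = 750  # the commonly recommended PSU size for a 3080

    print(f"Estimated peak draw while rendering: ~{load_watts} W")
    print(f"Load on a {psu_rating} W supply: ~{100 * load_watts / psu_rating:.0f}%")

    # Rule of thumb above: system RAM of 2-3x the GPU's VRAM.
    vram_gb = 12
    print(f"Suggested system RAM for a {vram_gb} GB card: {2 * vram_gb}-{3 * vram_gb} GB")
    ```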

    You never ever want to skimp out on the power supply; there is a reason Intel, Nvidia and AMD publish suggested wattage ratings for their hardware. The rule with these things is that it is always better to over-spec the power supply than to under-spec it.

    It is something of an insurance policy to have a PSU that can handle the expected load with some breathing space. Also, not all power supplies are good; some are great while others are cheap garbage.

    kyoto kid said:

    PerttiA said:

    Matt_Castle said:

    ebergerly said:

    So the total rating of both GPUs is 465 watts, but the entire computer draws less than 450 peak during an Iray render. No matter how hard I tried I could never get anywhere near the power supply's rating of 750 watts. Something over just 500 watts would be fine.

    I've just lost quite a long post here because the forum failed to save a draft, but please don't give this as advice.

    Power supplies degrade from their nominal spec over time, there *are* things that can use more power than rendering (including exceeding the nominal TDP of the components), and brownouts can cause all kinds of problems for a system. There's good reason that Nvidia's power supply recommendations include a healthy overhead. The PSU is the backbone of your system, and it's not somewhere you want to fall short.

    +1

    There are also recent examples here on the forum of cases where an inadequate PSU was the problem in DS, even when games and some testing programs ran without problems.

    ...+2

    I had a 750W PSU in the system I built 9 years ago that rarely, if ever, got pushed beyond half its capacity. When I installed a Titan X and upgraded the memory to 24 GB, that maybe raised its peak output to around two-thirds (I routinely monitored performance). A few months ago, while I was doing routine stuff (not rendering, just watching a short video on YouTube), it went *poof*, basically from old age. It also experienced a few "unplanned shutdowns" due to power outages during its lifetime, the last of them this past summer (when the power goes out I always turn everything off and unplug it to avoid spikes when the power comes back, even though I have protection there as well). Fortunately it didn't take any other components with it, which can happen.

    I've always been a firm believer in overbuilding systems to have that overhead instead of skating on thin ice to save a few zlotys. I'm currently running an 850W unit, which just gives me more "breathing space".

    Another good thing to have is a UPS; it is great for those times when the power does go out, and it saves not only your stress but your computer as well.

    kyoto kid said:

    melissastjames said:

    I'd go with the 12GB 3060 over the 10GB 3080. In fact, I'd keep my 11GB 1080TI over the 10GB 3080 just for that 1GB of additional VRAM. The entire concept of them DECREASING the amount of VRAM on the 3XXX line just boggles my gourd. 

    That being said...good luck finding one. I've been watching and waiting for a 3090 for a while now and the only options I've come across for purchase are (no lie) double MSRP and I'm sorry but I'm not paying $3500US for a graphics card. 

    ....yeah, it sort of gets me that the base 3060 has 12 GB but the 3060 Ti has only 8. True, the Ti has about 1,300 more CUDA cores, 40 more Tensor and 10 more RT cores, a slightly faster base clock and a wider memory interface (256 bit). So yeah, better on the speed factor, but the trade-off is a lower limit on scene size, which translates to the process dropping to the much slower CPU mode (particularly if you don't have an HCC Xeon or Threadripper).

    Yeah, it has me wondering too. I do find it strange that the 3060 has more VRAM than the 3070, 3070 Ti, 3080 and 3080 Ti. Seems like a weird thing to do, but a case of who knows.

    Post edited by Ghosty12 on
  • Matt_Castle Posts: 3,010

    Ghosty12 said:

    Yeah, it has me wondering too. I do find it strange that the 3060 has more VRAM than the 3070, 3070 Ti, 3080 and 3080 Ti. Seems like a weird thing to do, but a case of who knows.

    If I had to guess? My theory is that the RTX 3060 was originally intended as a 6 GB card.

    However, Nvidia announced the GeForce 30 series at the start of September 2020 and shipped the first cards later that month. At the end of October, AMD announced the RX 6000 series - where *none* of the GPUs announced had less than 16 GB, and their performance was giving the GeForce cards serious competition.

    At that point, changing direction for the 3060 Ti probably wasn't viable - it came out only slightly over a month later, in early December, so production would already have been in full swing. The 3060 was the first card Nvidia could actually respond with, and in terms of design, upgrading the board from 1 GB GDDR6 chips to 2 GB GDDR6 chips is fairly simple. (A lot of modders have actually done the same with other cards - it usually works on a hardware level but then runs into a load of driver and firmware issues, which Nvidia can obviously solve for its own cards.)
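    For what it's worth, a minimal sketch of the arithmetic behind that chip swap, assuming the standard layout of one GDDR6 chip per 32-bit memory channel: the 3060's 192-bit bus means six chips, so 1 GB chips give 6 GB and 2 GB chips give 12 GB.

    ```python
    # Capacity follows directly from bus width and chip density:
    # one GDDR6 chip per 32-bit channel, so capacity = (bus / 32) * GB per chip.
    def vram_gb(bus_width_bits, gb_per_chip):
        chips = bus_width_bits // 32  # one GDDR6 chip per 32-bit channel
        return chips * gb_per_chip

    print(vram_gb(192, 1))  # 3060's 192-bit bus with 1 GB chips ->  6 GB
    print(vram_gb(192, 2))  # 3060's 192-bit bus with 2 GB chips -> 12 GB
    print(vram_gb(256, 1))  # 3060 Ti's 256-bit bus with 1 GB chips -> 8 GB
    ```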

    I'm not entirely sure why, though, the current rumours indicate they've abandoned the earlier concepts of a 16 GB 3070 and a 20 GB 3080, although it may simply be that they don't think they need to. The -80 cards are their flagships and they're selling as fast as they can make them even without the extra VRAM, whereas the -60 cards are their bread and butter, where they really didn't want to lose ground to AMD.

    However, I'm not an industry insider, so that is rampant speculation.

     

  • rst Posts: 6
    edited April 2021

    Hi!

    I finally was able to order a GPU and it is a... ASUS TUF RTX3060 OC 12GB
    The price was reasonable for these times and I'm really happy. Fingers crossed that it will arrive soon.

     

    Post edited by rst on
  • Matt_Castle Posts: 3,010

    Which means you've actually ended up with the exact same model as I have, the TUF OC version. It's a big card, but if you've got the space for it, the large heatsink gives excellent full load temperatures; it's not summer here yet, but I'm expecting it to do well even on hot days.

  • outrider42 Posts: 3,679

    How much VRAM you need is entirely up to what you make. Since you are just starting out, it is extremely hard to predict what you will likely use until you just sit down and do it.

    Since you have a 3060 now, you could possibly sell your 5700 and get that money back, if you haven't already. You could then in turn invest in another GPU. One nice thing about Iray is that you can throw all the GPUs you can at it, as long as each GPU can fit the scene. Your X570 board can hold 2 GPUs. You just need to make sure they will physically fit (it sure would suck if they didn't) and that your power supply can run them. In this situation, you would almost certainly need a bigger power supply to run multiple cards.

    This is also where the power supply talk comes into play. With a 3060, you don't need much. But having a larger power supply gives you more options down the road. With a bigger power supply you could add a second card to your system and vastly increase your render speed. It doesn't even have to be a 3060, either. You could put a 3080 in there with your 3060 if you wanted to. In this example, the way it works is: if the scene you create uses less than 10GB on the 3080, it will run alongside your 3060. If by chance your scene is larger than 10GB, but still less than 12GB, then the 3080 will not run while the 3060 does. If you exceed 12GB of VRAM then neither card will run. That is basically how VRAM works with Iray in a nutshell (see the sketch after this post).

    Windows uses a little VRAM itself, so the trick here would be to have the 3060 running the monitor and the 3080 as the secondary card. This would give the 3080 as much VRAM as possible so it can fit the scenes. 
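    A minimal Python sketch of that fit-or-drop-out behaviour, with hypothetical scene sizes and a made-up function name; in practice Windows also eats a little VRAM on whichever card drives the monitor, which is why the 3060-on-the-monitor trick above helps.

    ```python
    # Each GPU joins the render only if the whole scene fits in its VRAM;
    # if no GPU fits, Iray falls back to the (much slower) CPU.
    def pick_render_devices(scene_gb, gpus):
        """Return the GPUs whose VRAM can hold the scene, else CPU fallback."""
        fits = [name for name, vram_gb in gpus.items() if scene_gb <= vram_gb]
        return fits if fits else ["CPU fallback (much slower)"]

    gpus = {"RTX 3060 (12 GB)": 12, "RTX 3080 (10 GB)": 10}

    print(pick_render_devices(9, gpus))   # fits both  -> both cards render
    print(pick_render_devices(11, gpus))  # only 3060  -> 3080 drops out
    print(pick_render_devices(13, gpus))  # fits none  -> CPU fallback
    ```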

  • nonesuch00 Posts: 18,729

    Well, it is already April 2021 and the launch of the 3000 series was back in September 2020. Looking at the current prices and lack of supply, you have to ask yourself if you aren't better off blowing off the 3000 series until they do another die shrink and up the RAM with their next set of GPUs, because the truth is those cryptocurrencies are going to get squeezed sooner or later. They can't maintain what they're doing.
