I don't usually ask for hardware advice, but...

Comments

  • RayDAnt Posts: 1,156
    edited March 2019
    ebergerly said:
    I'm not sure how you can equate current monitor with power monitor

    What can I say - I make typos sometimes.

  • ebergerly Posts: 3,255

    Out of curiosity I decided to do some testing to see how much of an effect having a second GPU (1070) in an adjacent slot has on my 1080ti. So I rendered the same scene with and without the 1070 rendering alongside the 1080ti, and plotted the 1080ti GPU temps for both cases. Now keep in mind that any temperature tests like this can vary based on stuff like ambient room temperature variations at the time of test, "latent" heat inside the computer based on how long it has been on and/or rendering, complex GPU driver/BIOS stuff that controls cooling and frequencies, and so on. So I generally assume any temperature test data will only give a general idea.

    But anyway, what I found (and is shown in the GPU-Z chart below) is that there's something like a 5C difference in max 1080ti temps with and without a 150-watt GTX-1070 sitting beside it and rendering (i.e., 79C vs. 74C). In other words, the 1080ti ran about 5C hotter with the other card next to it.

    I rendered the scene for about 16 minutes in both cases. In both cases the 1080ti frequency was pretty flat at 1.9GHz throughout, and power consumption was almost identical the entire time (around 180 watts), as was GPU load (around 99% the entire time). Without the 1070, the 1080ti's fan speed peaked about 15% lower than it did with the 1070 rendering next to it.

    This suggests that even with a 100+ watt GPU in an adjacent slot, the 1080ti's performance isn't affected substantially. So for those of us who aren't interested in taking special measures like undervolting and custom cooling to shave off a few degrees C, I suppose that's good news.
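    In case anyone wants to reproduce this kind of test, here's a minimal sketch of the logging side, assuming NVIDIA's nvidia-smi tool is on your PATH (the gpu_log.csv filename is just a placeholder; plot the CSV however you like):

    ```python
    # Poll nvidia-smi once per second and log each GPU's temperature,
    # power draw, and load to a CSV for later plotting.
    import csv
    import subprocess
    import time

    QUERY = "index,temperature.gpu,power.draw,utilization.gpu"

    def sample():
        # One CSV row per GPU, e.g. "0, 74, 180.50, 99"
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
            text=True,
        )
        return [row.split(", ") for row in out.strip().splitlines()]

    with open("gpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "gpu", "temp_c", "power_w", "load_pct"])
        start = time.time()
        while True:  # Ctrl+C to stop once the render finishes
            for row in sample():
                writer.writerow([round(time.time() - start, 1), *row])
            time.sleep(1)
    ```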

    GPU.JPG (1911 x 856): GPU-Z chart of 1080ti temps with and without the 1070 rendering
  • jmtbank Posts: 187

    My 1070 only draws 75-90w in Daz.  

  • ebergerly Posts: 3,255
    edited March 2019

    As another indication of whether two adjacent GPUs affect each other's thermals and performance: if you look at the attached summary of the render times posted by the community over the last 4 years for the Sickleyield benchmark, the following seems clear:

    • The generally agreed-upon render time for a GTX-1070 rendering on its own, without the effects of any other GPUs, is about 3 minutes. So if it renders the entire scene in 3 minutes, it renders 1/3 of the scene in 1 minute, or 0.33 scene/minute.
    • Likewise, the render time for a GTX-1080ti without the effects of any other GPUs is about 2 minutes. So it renders 1/2 of the scene in 1 minute, or 0.5 scene/minute.
    • Therefore, ideally, if they're both working together and not affecting each other at all, you'd expect them to render at 0.33 + 0.5 = 0.83 scene/minute. At 0.83 of the scene per minute, they'd render the entire scene in 1/0.83, or about 1.2 minutes (1 minute 12 seconds). In fact, the render time I get with both cards rendering in adjacent slots and thermally affecting each other is very close to that: 1.3 minutes (1 minute 18 seconds). (A quick check of this arithmetic is sketched below.)
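    That arithmetic as a quick sanity check, with just the benchmark numbers above plugged in:

    ```python
    # Combined render time if two GPUs scale perfectly: add their
    # throughputs (scenes per minute), then invert.
    t_1070, t_1080ti = 3.0, 2.0        # solo render times, minutes
    rate = 1 / t_1070 + 1 / t_1080ti   # 0.33 + 0.5 = 0.83 scene/minute
    ideal = 1 / rate                   # 1.2 minutes = 1 min 12 s
    measured = 1.3                     # 1 min 18 s observed
    print(f"ideal: {ideal:.2f} min, overhead: {measured / ideal - 1:.0%}")
    # -> ideal: 1.20 min, overhead: 8%
    ```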

    Again, we can't get down to the second with any of this, since we're not certain of everyone's setup or how they ran the test. But in general, the averaged data across users seems to support the idea that two GPUs running side by side isn't nearly as bad as the prevalent paranoia might suggest. As does the data I posted above showing performance along with the temperature chart.

    Now if others have actual data to support another point of view, I'd love to see it. But I hope we can bypass the unsupported, intuitive statements and fears and deal with facts and data instead. 

    BenchmarkNewestRTXCost.jpg (540 x 576): summary of community-posted benchmark render times
  • ebergerly Posts: 3,255
    edited March 2019
    jmtbank said:

    My 1070 only draws 75-90w in Daz.  

    Yeah, I think "in DAZ" is a bit misleading. My 1070, on this particular scene I'm working on, is bouncing around between 85-105 watts (see chart below). But I think it depends on a lot of things: the particular scene and how much work it gives the GPU, how your GPU's drivers and BIOS are configured by the manufacturer (and you), and so on. With a different workload it might draw much more; I recall the max is around 150 watts.

    Again, with most of this stuff I don't think you can get down to +/- 1 degree C or +/- 1 watt or anything like that. There's a lot of thermodynamics, software, and electrical behavior going on simultaneously, and as much as people like to simplify it, it often can't be simplified.
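    If you do log samples, it's more honest to report the spread than a single number. A sketch, assuming a log like the one sketched earlier in the thread (the gpu_log.csv name and power_w column are placeholders):

    ```python
    # Summarize the spread of logged power samples rather than
    # quoting one number.
    import csv
    import statistics

    with open("gpu_log.csv") as f:
        watts = [float(row["power_w"]) for row in csv.DictReader(f)]

    print(f"min {min(watts):.0f} W, max {max(watts):.0f} W, "
          f"mean {statistics.mean(watts):.0f} W, "
          f"stdev {statistics.stdev(watts):.0f} W")
    ```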

    1070 Power.JPG (1557 x 768): GPU-Z power-draw chart for the 1070
  • kenshaw011267 Posts: 3,805
    ebergerly said:
    I'm not sure how you can equate current monitor with power monitor (amps and watts are different things), or say that an outlet consumes or dissipates power (it's just a plug with wires), but hey whatever works.

    Because with wall voltage fixed, all you need to measure to know power consumption is current. The only time you need to directly measure both voltage and current (there's no other way to measure watts consumed at that point) is when you're past the power supply and trying to measure the power consumption of individual components.

    So if you're measuring at the wall, amps gives you watts, since P = V × I and V is a fixed number.
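    As a worked example (a sketch; 120 V is the nominal US wall voltage, and the current reading is hypothetical):

    ```python
    # At a (nominally) fixed wall voltage, power follows directly
    # from measured current: P = V * I.
    V_WALL = 120.0         # nominal US outlet voltage, volts
    amps = 3.5             # hypothetical clamp-on ammeter reading
    watts = V_WALL * amps  # -> 420 W drawn at the wall
    print(f"{watts:.0f} W")
    ```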

  • ebergerly Posts: 3,255
    edited March 2019
    kenshaw011267 said:
    Because with wall voltage fixed, all you need to measure to know power consumption is current. The only time you need to directly measure both voltage and current (there's no other way to measure watts consumed at that point) is when you're past the power supply and trying to measure the power consumption of individual components. So if you're measuring at the wall, amps gives you watts, since P = V × I and V is a fixed number.

    Ummm, no. Again, basic electricity comes into play here. When you draw current from the wall outlet, it causes a voltage drop in the wires in your house. That lowers the voltage at the outlet. The more current your computer draws, the bigger the voltage drop in the house wiring (and ultimately in the wires on the utility system). So current and voltage are related, and you need both to determine power (watts).

    And that's why the electrical meter attached to your house is a "watt-hour" meter, not a current meter (an ammeter). It measures current AND voltage, which gives the rate of energy usage (watts), and it tracks that usage continually so the utility can determine how many watt-hours (total energy) you used that month and send you a bill. That's why your bill shows how many "kilowatt-hours" (thousands of watts × hours) of energy you used that month, and they charge you something like 12 cents per kilowatt-hour. It doesn't say how much current you used, because that's irrelevant.
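    To put rough numbers on both points (the wire resistance and load current here are illustrative assumptions, not measurements):

    ```python
    # Voltage drop in house wiring, and what the meter bills for.
    R_WIRING = 0.1     # ohms, assumed round-trip resistance of branch wiring
    I_LOAD = 5.0       # amps drawn by the computer (assumed)
    V_NOMINAL = 120.0  # volts

    v_drop = I_LOAD * R_WIRING     # V = I * R -> 0.5 V sag at the outlet
    v_outlet = V_NOMINAL - v_drop  # 119.5 V under load
    p_actual = v_outlet * I_LOAD   # 597.5 W, vs 600 W if you assume 120 V

    # Billing: energy (kWh) = power (kW) * time (h), charged per kWh
    hours, rate = 30 * 4, 0.12               # 4 hours/day for a month, $/kWh
    bill = (p_actual / 1000) * hours * rate  # -> about $8.60
    print(f"drop {v_drop:.1f} V, power {p_actual:.1f} W, bill ${bill:.2f}")
    ```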


  • kenshaw011267 Posts: 3,805
    ebergerly said:
    Ummm, no. Again, basic electricity comes into play here. When you draw current from the wall outlet, it causes a voltage drop in the wires in your house. That lowers the voltage at the outlet. The more current your computer draws, the bigger the voltage drop in the house wiring (and ultimately in the wires on the utility system). So current and voltage are related, and you need both to determine power (watts).

    And that's why the electrical meter attached to your house is a "watt-hour" meter, not a current meter (an ammeter). It measures current AND voltage, which gives the rate of energy usage (watts), and it tracks that usage continually so the utility can determine how many watt-hours (total energy) you used that month and send you a bill. That's why your bill shows how many "kilowatt-hours" (thousands of watts × hours) of energy you used that month, and they charge you something like 12 cents per kilowatt-hour. It doesn't say how much current you used, because that's irrelevant.


    Nonsense. The amount of current it takes to cause a noticeable drop in household voltage is enough to blow the circuit breaker. The step-down transformer that lowers the voltage to household level is built to maintain a stable voltage; that's what step-down transformers do. The line out at the street is carrying kilovolts, and you drawing a few amps is simply too small to have any noticeable impact on that either. If things worked the way you think, there'd be a tremendous amount of ripple on wall power, far beyond the ability of most PSUs to filter out, since, according to you, any device elsewhere in the house being turned on or off would cause a noticeable change in wall voltage. You also seem to think the tiny drop that occurs when a device is turned on is persistent. Again, I have to wonder what you think transformers do, and whether you understand the voltage of the outside lines and the amount of current they carry.

    Further, you are aware, I hope, that there are devices in your home that run directly off AC power with no PSU at all; the voltage drops you are claiming would have a noticeable impact on those, light bulbs being the most common. Back in the days of incandescent bulbs, did the lights in your home dim every time something was turned on and brighten when things were turned off?

    If you think there is a huge difference between 120 V and 119.95 V, then I cannot begin to help you.

  • Mistara Posts: 38,675

    waiting for intel i9s to come down from astronomical

    glittering prices and endless compromises shatter the illusion of integrity

  • ebergerly Posts: 3,255
    Kenshaw, you're free to do whatever you want, but why not just use a watt meter and get the right answer like the rest of the world?
  • Richard Haseltine Posts: 108,327

    Locked since it is descending into acrimony.

This discussion has been closed.