I don't usually ask for hardware advice, but...
What can I say - I make typos sometimes.
Out of curiosity I decided to do some testing to see how much of an effect having a second GPU (1070) in an adjacent slot has on my 1080ti. So I rendered the same scene with and without the 1070 rendering alongside the 1080ti, and plotted the 1080ti GPU temps for both cases. Keep in mind that any temperature test like this can vary based on things like ambient room temperature at the time of the test, "latent" heat inside the computer depending on how long it has been on and/or rendering, complex GPU driver/BIOS logic that controls cooling and frequencies, and so on. So I generally assume any temperature test data only gives a general idea.
But anyway, what I found (and what's shown in the GPU-Z chart below) is that there's something like a 5C difference in max 1080ti temps with and without a 150 watt GTX 1070 sitting beside it and rendering (i.e., 79C vs 74C). In other words, the 1080ti ran about 5C hotter with the other card next to it.
I rendered the scene for about 16 minutes in both cases. In both cases the 1080ti frequency was pretty flat at 1.9GHz throughout, power consumption was almost identical the entire time (around 180 watts), and so was GPU load (around 99% the whole time). Without the 1070, the fan speed maxed out about 15% lower than with the 1070 installed.
This suggests that even with a 100+ watt GPU rendering in an adjacent slot, the 1080ti's performance isn't affected substantially. So for those of us who aren't interested in taking special measures like undervolting and extra cooling to shave off a few degrees C, I suppose that's good news.
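For anyone who wants to run the same comparison on their own machine, here's a minimal Python sketch that pulls the max and average temperature out of a GPU-Z sensor log. The file names are placeholders, and the exact column header varies with GPU-Z version, so adjust it to match your own log rather than treating this as the exact procedure I used.

```python
import csv

def max_and_avg_temp(log_path):
    """Pull GPU temperature readings out of a GPU-Z sensor log.

    Assumes a comma-separated log whose header row contains a column
    with 'Temperature' in its name -- adjust if your GPU-Z version
    labels it differently.
    """
    temps = []
    with open(log_path, encoding="utf-8", errors="ignore") as f:
        rows = csv.reader(f)
        header = next(rows)
        col = next(i for i, name in enumerate(header) if "Temperature" in name)
        for row in rows:
            try:
                temps.append(float(row[col]))
            except (IndexError, ValueError):
                continue  # skip blank or partial lines
    return max(temps), sum(temps) / len(temps)

# Placeholder file names -- point these at your own exported logs.
for label, path in [("1080ti alone", "log_1080ti_solo.txt"),
                    ("1080ti + 1070", "log_1080ti_with_1070.txt")]:
    hot, avg = max_and_avg_temp(path)
    print(f"{label}: max {hot:.0f} C, avg {avg:.1f} C")
```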
My 1070 only draws 75-90w in Daz.
As another indication of the effects (or not) of two adjacent GPUs on each other's thermals and performance: if you look at the attached summary of render times posted by the community over the last 4 years for the Sickleyield benchmark, the following seems clear:
Again, we can't get down to the second with all of this since we're not certain of everyone's setup and how they ran the test, but in general the average data across users seems to support the idea that two GPUs running side-by-side isn't nearly as bad as the prevalent paranoia might suggest. As does the data I posted above showing performance along with the temperature chart.
Now if others have actual data to support another point of view, I'd love to see it. But I hope we can bypass the unsupported, intuitive statements and fears and deal with facts and data instead.
Yeah, I think "in DAZ" is a bit misleading. My 1070, on this particular scene I'm working on, is bouncing around between 85 and 105 watts (see chart below). But I think it depends on a lot of things: the particular scene and how much work the GPU has to do, how your particular GPU drivers are configured by the manufacturer (and you), and so on. With a different workload it might draw much more; I recall the max is around 150 watts.
Again, with most of this stuff I don't think you can get down to +/- 1 degree C or +/- 1 watt or anything like that. There's a lot of thermodynamics, software, and electrical behavior going on simultaneously that, while many like to simplify it, often can't be reduced to a single number.
Because with wall voltage fixed, all you need to measure to know power consumption is current. The time you need to directly measure both voltage and current (there is no other way to get watts consumed) is when you're past the power supply and trying to measure the power consumption of individual components.
So if you're measuring at the wall, amps gives you watts, since W = V × I and V is a fixed number.
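As a concrete example (the amp reading here is just an illustrative number, not a measurement):

```python
# Watts from a wall-plug meter reading, taking the nominal 120 V as fixed.
volts = 120.0   # assumed nominal US outlet voltage
amps = 1.5      # illustrative reading from a plug-in current meter
print(f"{volts * amps:.0f} W")   # -> 180 W
```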
Ummm, no. Again, basic electricity comes into play here. When you draw current from the wall outlet, it causes a voltage drop in the wires in your house. That lowers the voltage at the outlet. The more current your computer draws, the more voltage drop in the wires in the house (and ultimately in the wires on the utility system). So current and voltage are related, and you need both to determine power (watts).
And that's why if you go to the electrical meter attached to your house it's a "watt-hour" meter, not a current meter (aka, "ammeter"). It measures current AND voltage, which tells you the rate of energy usage (aka, watts), and it tracks that usage continually so they can determine how many watt-hours (total energy) you used that month so they can send you a bill. And that's why your bill tells you how many "kilowatt hours" (thousands of watts X hours) of energy you used that month. And they charge you something like 12 cents for each kilowatt hour. It doesn't say how much current you used because that's irrelevant.
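To put some rough numbers on the billing side (the wattage, render time, and the 12 cents/kWh rate are just illustrative figures):

```python
# Energy cost: kWh = kW x hours, cost = kWh x rate.
power_watts = 300        # assumed whole-system draw while rendering
hours = 8                # assumed length of the render session
rate_per_kwh = 0.12      # example rate: 12 cents per kilowatt-hour

energy_kwh = power_watts / 1000 * hours
print(f"{energy_kwh:.2f} kWh -> ${energy_kwh * rate_per_kwh:.2f}")  # 2.40 kWh -> $0.29
```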
Nonsense. The amount of current it takes to cause a noticeable drop in household voltage is enough to blow the circuit breaker. The step-down transformer that lowers the voltage to household level is built to maintain a stable voltage; that's what step-down transformers do. The line out at the street is carrying kilovolts, and you drawing a few amps is simply too small to have any noticeable impact on that either. If things worked the way you think, there'd be a tremendous amount of ripple on wall power, far beyond the ability of most PSUs to filter out, since, according to you, any device elsewhere in the house being turned on or off would cause a noticeable change in wall voltage. You also seem to think the tiny drop that occurs when a device is turned on is persistent. Again, I have to wonder what you think transformers do and whether you understand the voltage of the outside lines and the amount of current they carry.
Further, you are aware, I hope, that there are devices in your home that run directly off AC power with no PSU at all. The voltage drops you are claiming happen would have a noticeable impact on those, light bulbs being the most common. Did the lights in your home, back in the days of incandescent bulbs, dim every time something was turned on and brighten when things were turned off?
If you think there is a huge difference between 120 and 119.95V then I cannot begin to help you.
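For what it's worth, a back-of-envelope estimate of how big the drop along a typical branch circuit actually is (wire gauge, run length, and current are assumed values, not measurements):

```python
# Voltage drop along household wiring: V = I x R (out and back).
ohms_per_ft = 2.5 / 1000   # roughly 14 AWG copper (~2.5 ohms per 1000 ft)
run_ft = 50                # assumed one-way length of the branch circuit
amps = 3.0                 # roughly a 360 W computer on a 120 V outlet

drop = amps * ohms_per_ft * (2 * run_ft)
print(f"drop of about {drop:.2f} V out of 120 V ({drop / 120:.2%})")  # ~0.75 V, under 1%
```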
waiting for Intel i9s to come down from astronomical
glittering prices and endless compromises shatter the illusion of integrity
Locked since it is descending into acrimony.