Comments
Yeah, maybe, but I think most circuits in the US at least are probably 15 amps, right? So at 120 volts that makes a total of 1,800 watts. You'd need a whole lot of stuff to draw that much power.
Here are the specs from the NVIDIA website for my GTX 1070:
Thermal and Power Specs
Maximum GPU Temperature (in C): 94
Maximum Graphics Card Power (W): 150
Minimum System Power Requirement (W): 500
So if you start adding it up, with maybe 200 watts for each video card, and then some more for the computer itself, and whatever else you have connected to the outlet, you may not even be close to 1,800 watts. Unless you're using the same outlet as a microwave oven or something :)
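For anyone who wants to sanity-check that arithmetic, here's a rough Python sketch; the component wattages are illustrative guesses, not measurements:

```python
# Rough sanity check: how close does a render rig get to a
# 15 A / 120 V household circuit? All wattages below are guesses
# for illustration, not measured values.
CIRCUIT_AMPS = 15
LINE_VOLTS = 120
circuit_watts = CIRCUIT_AMPS * LINE_VOLTS  # 1,800 W capacity

loads_w = {
    "video card (worst case)": 200,
    "CPU, motherboard, drives": 250,
    "monitor": 30,
    "speakers, router, misc": 50,
}

total = sum(loads_w.values())
print(f"Circuit capacity: {circuit_watts} W")
print(f"Estimated draw:   {total} W ({total / circuit_watts:.0%} of capacity)")
```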
BTW, you can buy a cheap power meter which will tell you the answer real quick. I think it's only about $30.
And by the way, if you're interested in monitoring your CPU and GPU for temperatures and speeds and stuff, you can download some cool, free software apps that tell you a whole lot. I use CPUID HWMonitor. Tells me the temps of my CPU cores and GPU and fan speeds and % utilization and all kinds of stuff.
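HWMonitor is a GUI app, but if you'd rather log readings yourself, here's a minimal Python sketch that polls the card via nvidia-smi (which ships with the NVIDIA driver); it assumes an NVIDIA GPU and nvidia-smi on your PATH:

```python
# Minimal sketch: poll GPU temperature, power draw, and utilization
# by shelling out to nvidia-smi (installed with the NVIDIA driver).
import subprocess
import time

QUERY = "temperature.gpu,power.draw,utilization.gpu"

for _ in range(12):  # sample once every 5 seconds for a minute
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
    for line in out.strip().splitlines():  # one line per GPU
        temp_c, power, util = [f.strip() for f in line.split(",")]
        print(f"GPU: {temp_c} C, {power}, {util} utilization")
    time.sleep(5)
```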
Well, yes - but. My house was built 30 years ago, and the wall outlets in the second (guest) bedroom and the den are on the same circuit. And I have two 1300 VA UPSes and an 1100 VA UPS on that circuit (three tower systems, a laptop, and all related devices) and a laser printer. As near as I can tell, I'm hitting between 10 and 11 amps on that 15 amp circuit. And I have seriously considered bringing in an electrician.
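One wrinkle worth flagging: UPS ratings are in volt-amperes (apparent power), and real watts are roughly VA times power factor, so nameplate VA overstates actual draw. Here's a rough sketch of the arithmetic; the 0.9 power factor is an assumed typical value for PC loads, not a measurement:

```python
# UPS nameplate ratings are in VA (apparent power); real watts are
# roughly VA * power factor. The 0.9 power factor is an assumption.
LINE_VOLTS = 120
POWER_FACTOR = 0.9

ups_va = [1300, 1300, 1100]  # the three UPS nameplate ratings
watts_if_maxed = sum(ups_va) * POWER_FACTOR   # ~3,330 W if fully loaded
amps_if_maxed = watts_if_maxed / LINE_VOLTS   # ~27.8 A -- would trip a 15 A breaker

measured_amps = 10.5                          # mid-point of the 10-11 A estimate
actual_watts = measured_amps * LINE_VOLTS     # ~1,260 W actually being drawn

# NEC guidance caps continuous loads at 80% of the breaker rating,
# i.e. 12 A on a 15 A circuit -- 10-11 A is close to that line.
print(f"{actual_watts:.0f} W actual vs {watts_if_maxed:.0f} W if every UPS were maxed out")
```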
I upgraded to a 1500 watt power supply, so yes, it needs a dedicated AC circuit. It functions as a room heater too.
If I were you I'd get a power meter and end all doubt. Belden makes a nice one, $30 on Amazon. It's a lot cheaper than an electrician if you end up not needing the extra circuit.
BTW, just because you have a 1500 watt power supply doesn't mean you're actually drawing 1500 watts right?
FWIW, for comparison purposes, my system has 3 27-inch monitors (30 watts each) and an i7-6700 desktop computer with 48GB RAM and a GTX 1070 GPU (maybe 200W for the card), and on a good day it's tough to get above a total load of about 600 watts. And I have another desktop with 2 monitors running on the same circuit, and I'd be hard pressed to get up to 900-1,000 watts total.
Just sayin'...I wouldn't want anyone to go to the expense of a new circuit if you don't really need it.
Of course not, it just means 1500 watts is the maximum amount of power it could supply. There is plenty of 'head room' so the voltages don't sag under load. I was running a 1000 watt PSU prior and that just wasn't enough for 3 GTX 580s and the CPU. I'll bet it gets up to 750 watts based on the amount of heat it produces when rendering hard. I do have an AC power meter but never put it on the 'beast' to see what it was actually drawing. I'll do that and see how far it goes up during a render once the 1080 Ti arrives.
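That 'head room' point can be turned into a quick sanity check. A common rule of thumb (not any official spec) is to keep estimated peak draw under about 80% of the PSU rating; here's a sketch using the GTX 580's 244W rated TDP and a guessed 250W for the rest of the system:

```python
# Quick PSU headroom check. The 80% ceiling is a common rule of
# thumb, not an official spec; the 250 W figure for CPU/board/drives
# is a guess for illustration.
def headroom_check(psu_watts: int, estimated_peak_w: int) -> str:
    load = estimated_peak_w / psu_watts
    verdict = "comfortable" if load <= 0.80 else "too tight"
    return f"{estimated_peak_w} W on a {psu_watts} W PSU = {load:.0%} load ({verdict})"

GTX_580_TDP_W = 244          # NVIDIA's rated TDP for the GTX 580
est_peak = 3 * GTX_580_TDP_W + 250

print(headroom_check(1000, est_peak))  # the old 1000 W unit: ~98% load
print(headroom_check(1500, est_peak))  # the 1500 W upgrade: ~65% load
```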
Personally, I just figure that waiting for renders makes me a more zen person.
This is the one thing I think a lot of people forget about 3DL: you don't need to spend hundreds if not thousands of dollars over and above what would give you a relatively good computer to get decent-looking renders.
That being said, I plan on getting a new system. What CPU to get depends on whether Intel does something about their new CPUs' heat issues, and if they don't I'll more than likely go AMD.
As for GPUs, I plan on only getting one, as the 1080 Tis are rather expensive and I'd need a good excuse to get more than one.
We were discussing power requirements and graphics cards, and I forgot to post my actual power measurements, in case anyone's interested.
My system is an i7-6700, 48 GB RAM, with a GTX 1070, single hard drive, 3 27-inch monitors, and that's about it.
Under normal running just browsing the internet and stuff, the power requirement for the entire system is pretty flat at 50 watts.
With D|S rendering a heavy scene, the GPU going to 100% utilization, the power requirement is pretty flat at 170 watts. And that's with the 8 cores averaging about 30%.
Now clearly if you start adding peripherals (hard drives, fans, etc.), it can go up substantially. But you can see that even with a high powered GPU, depending on your system config, you might be very hard pressed to get up near, say, 500 watts. It's easy to assume worst case, but until you actually measure with a power meter you never really know.
The GTX 1080 Ti arrived and I installed it last night. But first I made a power measurement rendering with Iray using all 3 of my GTX 580s to get a baseline. Just running Windows the power reads about 200 watts. Opening DAZ Studio made it jump to around 250, then doing a render made it go up to around 750 watts. I have a huge water-cooled system with two radiators and you could feel some heat. The temp readings on the boards went up to around 40C, well in the safe zone. After a while the power went back to around 250 and I noticed the system wasn't using the GPU at all anymore and had fallen back to CPU, even though I didn't have CPU checked for render. I guess when it gets into trouble on the GPU it falls back to CPU. In checking my log I could see a bunch of errors cropped up after a short time; I don't know why exactly. My scene may be buggy or exceeding some limit.
I planned to keep one of the 580s but that wasn't possible because the plumbing on the water blocks doesn't line up, so I removed all three 580s and put in the 1080 Ti. Installation went smoothly. After restarting I was able to turn up the resolution on my display to full 4K, but only at 30Hz. I was surprised by the low refresh rate, but it looks awesome and I didn't really notice any flicker.
I ran the same scene as before, and after the initial processing the power went up to around 350 watts and stayed there for the duration of the render. The GPU temp didn't get over 35C. The render was noticeably faster and stayed on the GPU the whole time.
I didn't get a lot of time to experiment with overclocking, but did turn the GPU clock way up to see what would happen. With a 100MHz boost and running at well over 2000 MHz, the temp went up about 1 degree to 36C. I guess my water cooling is doing the job with a mere 350 watts to dissipate compared to the 750 watts with 3 cards.
Looks like I am saving for another 1080 Ti and will be selling 3 EVGA 580 Hydro Copper Classifieds. PM me if interested.
Wow. 200 watts just sitting there without rendering anything? Are you sure you don't have an air conditioner plugged into the same outlet?
I'll post a picture when I go home. We call it the beast... 2 radiators, 6 fans on the GPU radiator and 3 on the CPU radiator, plus at least three more case fans, plus the water pump, LED lighting and so on. The GPU radiator is mounted outside the case. I know, overkill, but it does stay cool.
For your 4K @ 30Hz issue, there are two possible causes:
Your screen only does 4K @ 30Hz (early 4K TVs only did 4K at 30Hz).
Your HDMI cable does not have enough bandwidth (running an HDMI adapter from the DVI port on your card can cause this issue).
Any GTX card at or above the 1060 supports 4K @ 60Hz full-range RGB.
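The cable explanation is really just bandwidth arithmetic: HDMI 1.4 tops out at 10.2 Gbps of TMDS throughput, while 4K @ 60Hz needs about 17.8 Gbps (hence HDMI 2.0). Here's a quick sketch using the standard CEA pixel clocks:

```python
# HDMI carries 10 bits per 8-bit symbol (TMDS encoding) on 3 data
# channels, so required bandwidth = pixel clock * 3 * 10. Pixel
# clocks below are the standard CEA timings for 4K.
def tmds_gbps(pixel_clock_mhz: float) -> float:
    return pixel_clock_mhz * 3 * 10 / 1000

HDMI_1_4_GBPS = 10.2  # max TMDS throughput, HDMI 1.4
HDMI_2_0_GBPS = 18.0  # max TMDS throughput, HDMI 2.0

for label, clock_mhz in [("4K @ 30Hz", 297), ("4K @ 60Hz", 594)]:
    need = tmds_gbps(clock_mhz)
    verdict = "fits HDMI 1.4" if need <= HDMI_1_4_GBPS else "needs HDMI 2.0"
    print(f"{label}: needs {need:.1f} Gbps -> {verdict}")
```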
I have two 1080 Ti FEs now on a 750W PSU... not having any power issues yet. (EVGA SuperNOVA G2)
Other components:
I have a wifi card, two SSDs, and an M.2 drive also plugged in... 5-6 case fans (120s + 2 x 140s)... a Bluetooth dongle for wireless mouse and keyboard... 3 external HDs, though 2 have their own power... and 3 monitors (2 4K + 1 1080p).
For those curious... with the fan setup, my cards are peaking at 77C during a render batch.