Comments
BTW, considering the present super-inflated prices of graphics cards, I think it's pretty much a moot point. No way I'm paying $700-800 for a graphics card, no matter how awesome, especially since I paid only $400 a couple of months ago for the exact same card.
Guess I'll wait to see what shakes out.
As long as you bought a good name brand, 750W is plenty for two GTX 1070 cards.
If you were running a pair of 1080 or 1080 Ti cards, I would suggest a 1000W.
Though with a single card being a tight fit, I would worry about keeping two cards from overheating when running together.
Yeah, I think I'll wait a few months and see what shakes out for prices, then get a new full-tower box with better cooling and a new motherboard, and transfer my graphics card, 48GB of memory, and PS over.
Gains from adding an extra card vary, in part due to the system that supports them.
The improvement can be significant, but it can also be as low as stated.
I have a 980ti and a 970.
The 970 drives 3 monitors (2560x1440). If I add it to a render I do get a reasonable increase (especially considering what else it is doing), but the extra noise as the two cards fight each other over cooling prevents me from doing it very often.
One figure, clothes, hair; the scene was pre-loaded into GPU memory, so loading did not affect the times.
980ti only:
2017-06-28 15:35:11.092 Finished Rendering
2017-06-28 15:35:11.136 Total Rendering Time: 2 minutes 37.32 seconds
980ti and 970:
2017-06-28 15:38:00.136 Finished Rendering
2017-06-28 15:38:00.173 Total Rendering Time: 1 minutes 41.31 seconds
980ti only - but with OptiX prime acceleration on:
2017-06-28 15:44:44.067 Finished Rendering
2017-06-28 15:44:44.112 Total Rendering Time: 1 minutes 34.97 seconds
980ti and 970 with OptiX prime acceleration on:
2017-06-28 15:41:22.036 Finished Rendering
2017-06-28 15:41:22.076 Total Rendering Time: 59.98 seconds
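For a rough sense of scale, here's a quick Python sketch that just converts the logged times above to seconds and turns them into speedup factors:

# Render times from the log above, converted to seconds.
times = {
    "980ti only": 157.32,           # 2 min 37.32 s
    "980ti + 970": 101.31,          # 1 min 41.31 s
    "980ti + OptiX": 94.97,         # 1 min 34.97 s
    "980ti + 970 + OptiX": 59.98,
}
baseline = times["980ti only"]
for config, secs in times.items():
    print(f"{config}: {secs:.2f}s ({baseline / secs:.2f}x vs 980ti only)")

So here the second card alone is worth about 1.55x, OptiX alone about 1.66x, and both together about 2.6x.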
OptiX acceleration can make renders faster, and it seems to be better in more recent builds from Nvidia.
I use Windows 10 Pro, Creators Update; Nvidia driver 382.33; Daz Studio 4.9.4.117.
When you're looking at ways to increase performance, make sure your settings give you the best deal. Just because someone else's settings work for them doesn't always mean they will be best for you; they should be, but sadly it doesn't always work like that.
There's no guarantee that @swordkensia's power supply failed because of the two 1070s. The Pascal cards' power requirements are quite a bit less than the Maxwell cards'. 750 watts should be sufficient for two 1070s. Of course, it also depends on what else is in your build.
Just a quick note. While on paper x16 slots should be significantly faster than x8 slots, in practice you are only chasing a few (single-digit) percentage points of performance.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/
https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/
The second link includes some Octane Render benchmarks. While by all means you should take advantage of x16 if you have it, as long as your slots can accommodate the graphics card, you'll do just fine at x8, according to these benches. Sure, renders may be slightly slower, but if something takes 60 seconds instead of 57 seconds (dual x8 vs dual x16), a 5% difference in performance isn't that much. And in some of the benchmarks the difference is even less than 5%.
The bump you'll get from the second card will be significant, but x16 isn't that big of a deal vs x8, so don't feel bad if your GTX 1070 is configured at only x8 in the slot. The second link's Octane Render benches show roughly 2x scaling (actually just a smidge higher) with two Titan Xs, with x8/x8, x16/x8, and x16/x16 all within the statistical margin of error (only a couple of percentage points of difference).
NOTE: if you are upgrading from Gen 2 PCIe slots to Gen 3 PCIe slots, that can make a bigger difference. Still, you may only be looking at a 10% or less performance boost between Gen 2 and Gen 3 in the real world. But other factors that come with a newer-generation motherboard (faster/newer processor, faster memory slots, etc.) will likely come into play with that upgrade.
From what I've read in a few places now, you'll do just fine at a 'mere' x8.
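Just to show where that 5% comes from, here's the arithmetic on the hypothetical 57s vs 60s example above, in Python:

# Hypothetical dual-card render times from the example above, in seconds.
dual_x16_s = 57.0
dual_x8_s = 60.0
penalty = (dual_x8_s - dual_x16_s) / dual_x16_s
print(f"x8/x8 is {penalty:.1%} slower than x16/x16")  # ~5.3%

Measured against the 60-second run instead, it's an even 5%; either way, single digits.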
So, replacing my two Titan X (Maxwell) cards with two GTX 1080 Ti Hybrids, is it a good idea? Yes, I could go with the Titan Xp, but seriously, all I need is a big jump from my old Titan Xs. And I don't want to spend 50% more for something that is only slightly better (I mean the 1080 Ti compared to the Titan Xp). Besides, the 1080 Ti I want is also water cooled. That alone is important, to me at least.
True,
Looks to be that way. Gotta decide if the extra 8% is worth $500 (according to this benchmark):
http://gpu.userbenchmark.com/Compare/Nvidia-Titan-Xp-vs-Nvidia-GTX-1080-Ti/m265423vs3918
I've been opening up a number of my scenes and monitoring the memory usage to see if the 1080 Ti will fill my needs. So far, only a couple of scenes I did with 6+ Genesis 3 figures went over 11,000 MB.
I'm thinking of building a beastly(?) machine with 3 or 4 1080 Tis.
https://www.daz3d.com/forums/discussion/179126/lots-of-exciting-hardware-coming-very-soon-building-a-beastly-daz-studio-machine
I had the same problem when I moved my two Maxwell Titan Xs to the second box. It's a Corsair RM750 Gold. The calculator said 700 watts was needed, but the system would just up and go *boom*, off. I replaced the supply with a 1000 watt unit, and no more problems. The Kill-A-Watt indicates I'm only drawing around 450-500 watts during rendering, too. From now on, I'm going at least 100 watts above what the calculators say; or, with two high-perf video cards, just going to 1000 watts is probably a good rule of thumb.
Scott, no kidding! I have exactly the same power supply. It's surprising that it couldn't handle its rated load, and it wasn't even drawing close to that. But your 500 watts for two cards lines up pretty well with the other video I saw, with one GTX 1070 system drawing a total of about 230 watts.
Maybe 250 watts per video card is a good ballpark estimate. I wonder if the Kill-A-Watt measures an average over a period of time rather than a short-term peak. Maybe the cards briefly jump to 400 watts each or something, which trips the breaker in the power supply.
Strange....
Also I'm curious about the PS failures...don't they have circuit breakers inside that prevent damage from overload? I'd sure hope so.
Awesome! Looks like I'm going to replace my Titan Xs. Although I'm thinking of using both alongside the 1080 Tis. Just need a MB that can handle 4 GPUs. But right now there's no such MB for the newest 2066 socket, at least not where I usually buy my hardware. Oh, and there's the problem of not having enough space for the water cooling for all the cards, even though I have a massive case.
I never render very large scenes myself, mostly just a single character or two, though so far I have actually never used more than one. And I very seldom use scenery like houses and the like; it's simple scenes with a small set or HDRI. Despite the fact that I have bought quite a few items like Stonemason's city scenery, they all pile up on my SSD collecting dust.
Power supplies can fail for a number of reasons. Constant loading at near max rated levels can shorten their lives. The quality and variability of your local power grid can also take a toll, depending on how 'inconsistent' delivery is: how frequently spikes hit your power supply, whether delivery levels vary significantly, brownouts, how often power is interrupted and then resumed (and how heavily it spikes when it resumes), high-energy appliances kicking on and off frequently and disrupting the smooth delivery of power, etc. If you've seen your lights dim every time the furnace kicks on, that's one example of what I'm talking about.
Yeah, good power supplies are built to withstand some of this, but inconsistency still 'wears down' these components over time. And then there's always the dreaded lightning strike, which hopefully your surge protector takes care of, although it might not catch all of it.
Anyway, a good rule of thumb is to leave yourself at least 20% overhead (ideally 50%), so that you aren't 'max loading' your PS all the time. Around 50% seems to be the magic number where many power supplies are at their most efficient (a few percent more efficient), so don't feel bad if you are using only 'half' of your rated power; your PS is often converting power more efficiently at that level. This can vary from power supply to power supply; see the efficiency ratings and efficiency curves for your particular PS. The base efficiency rating is how much power is converted vs. how much is lost in the conversion process (mainly as heat), and it varies depending on what percentage of the power supply's total rating you are currently using.
http://www.tomshardware.com/forum/328652-28-true-psus-efficiency-load
A couple of percentage points of additional efficiency at a partial load vs. a full load may not sound like much, but it'll help you shave some pennies off each power bill. And running at a partial load can help extend the life of your power supply: power supplies at partial loads run significantly cooler than at their designed max heat load/rating, and heat is the bane of electronic equipment.
http://www.tomshardware.com/forum/331972-28-load-life-span-psus
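To put a number on those 'pennies', here's a small sketch (Python; the load, efficiency, price, and hours figures are all made-up examples, not measurements):

# Hypothetical: the system's components pull 400 W DC while rendering.
dc_load_w = 400
rate_per_kwh = 0.12       # example electricity price, $/kWh
hours_per_month = 100     # example rendering hours per month

for eff in (0.85, 0.88):  # e.g. near-full-load vs ~50%-load efficiency
    wall_w = dc_load_w / eff
    cost = wall_w / 1000 * hours_per_month * rate_per_kwh
    print(f"{eff:.0%} efficient: {wall_w:.0f} W at the wall, ${cost:.2f}/month")

With these made-up numbers the gap is only about 20 cents a month, so the cooler running temperature is really the bigger win.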
Also, clean out your power supply regularly. All that lint accumulating can take a toll over time.
Do keep in mind that max ratings for CPUs, graphics cards, etc. are exactly that. Your system will typically run at less than those ratings, but since we are using our systems for rendering, we probably push our GPUs and CPUs much harder than the typical user does. The only way to know for sure how many watts your system is drawing from the grid is to monitor the wattage entering the power supply, but let's face it, most people aren't going to have that sort of equipment. The discussion above suggests that your absolute max load on your PS should be about 80% to allow for load spikes, but should average out to around 50% under normal usage. Of course, our 'normal usage' is probably a bit more intense when rendering, although some games can load systems pretty heavily too.
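As a sketch of that sizing rule in code (Python; the component wattages below are illustrative guesses, not anyone's measured system):

# Hypothetical estimated peak draws, in watts (check your own parts' specs).
components = {
    "CPU": 140,
    "GPU 1 (GTX 1070)": 150,
    "GPU 2 (GTX 1070)": 150,
    "motherboard, RAM, drives, fans": 75,
}
peak_draw = sum(components.values())

# Rule of thumb from above: peaks at <= 80% of the PSU rating,
# typical load near 50% for best efficiency.
min_psu = peak_draw / 0.80
comfy_psu = peak_draw / 0.50
print(f"Estimated peak draw: {peak_draw} W")
print(f"Minimum PSU (80% rule): {min_psu:.0f} W")
print(f"Comfortable PSU (50% rule): {comfy_psu:.0f} W")

With these example numbers you land at roughly 644 W minimum and about 1030 W for comfort, which lines up with the '1000 watts for two high-perf cards' rule of thumb earlier in the thread.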
Water cooling is awesome when done right, with the water-cooled components running at significantly cooler average temps than when air cooled. Seriously consider it; your system will thank you.
This, incidentally, is why you may want to consider NOT overclocking your graphics card or system (or only doing 'mild' overclocks) for rendering. For most users, everyday use isn't going to stress that overclock much, but our systems go 'full out' for stretches when rendering, so things can wear out more quickly. Again, heat is the bane of electronics... Of course, if you don't mind replacing stuff more often, then maybe it'll be worth it to you, but for those trying to squeeze the life out of their systems over several years... food for thought.
Oh, one thing: my power supply is probably not dead. I replaced it because the power was suddenly getting cut and I didn't want something more expensive dying.
Going by tj_1ca9500b's post, maybe it lost efficiency. I don't think I was rendering with two cards in it for too long, perhaps a year or so; I used it a year or two longer than that at a light load.
There are a few programs that will monitor GPU memory usage. I use Nvidia Inspector. You can load it up and monitor the usage with one of your Stonemason scenes. I have many myself; I'll try to load one up. I have a few of the bigger ones.
With one or two characters and a simple backdrop, you'll be safe with the 11GB on the 1080 Ti.
http://www.guru3d.com/files-details/nvidia-profile-inspector-download.html
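If you'd rather script the monitoring than watch a GUI, here's a minimal Python sketch using Nvidia's nvidia-smi command-line tool (it ships with the driver; the one-second interval is an arbitrary choice):

import subprocess, time

# Poll VRAM usage once per second; run this while your scene renders.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out.strip())  # one line per GPU, e.g. "GeForce GTX 1080 Ti, 4212 MiB, 11264 MiB"
    time.sleep(1)

Watch the peak memory.used while the render runs; if it stays under the card's 11,264 MiB total, you're fine on a 1080 Ti.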
Never thought I'd need 64GB of RAM, but boy is it coming in handy now. I just had two PCs custom built, both with Windows 10 Pro. One has two Titan Xs (Pascal) and 64GB of RAM, and the other a 1080 Ti with 32GB.
The one with the two Titans has been replaced twice in three months: once for a power issue, and the second time (one month after the second build) for a faulty CPU. Two days later, the 1080 Ti machine messed up my new two-week-old 3TB external hard drive, which appears to be a typical problem with Windows 10 and WD drives. I tried everything to get the data back, including DOS commands like chkdsk /f, but the 1080 Ti machine could not do squat. The Titan machine was able to slowly but surely discover folders and files. I put the drive on ice packs 24/7, and that Titan has managed to recoup 310 of 510 GB (198,366 files so far) of design/graphics files and all of my Daz/Rendo downloads, freebies, etc. Unfortunately, I didn't get a chance to back up the WD external drive because it was so new, and I shop too much to ever have it up to date. When I get through this mess, I will test the Titans vs. the 1080 Ti, render something identical, and share some stats. One thing a friend, an architect using AutoCAD, shared was that the GPU VRAM and bus were supposed to be identical if one had multiple cards, but I didn't ask why.
Edited to say: the 1080 Ti system for some reason is not nearly as powerful as the Titan X Pascal system.
In my short time here, I've seen a couple of posts now where people mention baking their GPUs. This may be one of those times when a regular consumer-grade GPU might be inferior to, say, the Quadro parts as far as the components used. One would hope that the vastly increased cost of the 'professional'-grade parts translates into a longer life expectancy; at least I'd hope so. Nothing would suck more than paying $6,000 for a GPU, only to have it fry itself after six months to a year of use... hopefully there's a warranty for that!
I concur with your assessment that your PSU has probably degraded over the last several years. And you are right: once it starts giving you issues, that's a great reason to swap it out for a newer one. You might be able to repurpose the current power supply in a system that requires a lighter load or is less critical (say, a media center), but that's one area of the system where it pays to just upgrade if you are beginning to have issues, as so many components rely on the PSU, as you correctly pointed out.
...Indeed, you have to look at the MB expansion slot specs closely. More often than not, a board that advertises, say, three PCIe x16 slots will actually list "x16/x16/x8" when all three slots are used.
...GPU rendering also drives all GPUs in a system (as well as other components) at constant peak output and power demand until the process is complete.
This is why I wish there were more benchmark tests for CG software, and not just the latest popular games.
...another way to deal with power fluctuations, especially if you live in a house or flat with older wiring: get a good UPS, as it will regulate and balance the power going to the system.
Another suggestion: don't scrimp on case size. A larger case may be a pain to move, but it will give the components inside more "breathing room" and allow improved access for maintenance and upgrading. Also, the more fans for airflow, the better.
As to overclocking, I believe Lao Tsu put it best: "The flame that burns twice as bright burns half as long". I'd rather have a render process take an extra hour than risk burning out the CPU, GPU, or PSU.
Lawd, I think I've forgotten more than I know with PC tech. You do decrease the lifespan of components with overclocking, and from what I read, the decrease was steep too... but I don't remember all the details. There are many variables, so it's hard to make a simple one-size-fits-all statement. If you care about longevity, do your homework and overclock correctly.
Personally, I don't have the time to fiddle anymore. I run all my stuff at stock speed. It's engineered for that, so even though the cards run what I consider warm, all my cases are very well ventilated large cases in air-conditioned rooms. For the cards, I'm sure I'm within the environment they're designed for; I just run them at 100% for longer.