iRay rendering on consumer cards.....
I ran a utility to track the load on my new video cards whilst doing different activities. Learned a lesson.
Gaming on a high-end consumer card - even at ultra settings - seldom stresses the GPU to 100%. There are moments, when you move quickly or some such, when the card hits 100% utilization - but there are almost always enough times at lower load that the GPU can back down to 30% - 40%. Thus the fan is set to respond to temperature conservatively, since gamers like their computers to be quiet.
An iRay render, or even setting the preview window to iRay mode, immediately slams the GPU load to 100% and keeps it there. Nvidia does not compromise. They push the card as hard as they possibly can. That causes the GPU to heat up rapidly. The thing I learned is that it may be wise to set a custom GPU cooling fan profile while running Daz that ramps the GPU fans up to 100% and keeps them there.
By the by, this tells me right away why Quadro cards are so expensive. Seems to me Nvidia is not kidding when they say they hand select chips for Quadro cards to stand up to continuous maximum load.
Oh, one other comment. Nvidia cards come out of the box configured to throttle down the GPU load once the card hits 83C - because the card could fail if it hits 90C. This throttling seems to work very well. I would NOT suggest bypassing it by overclocking your card. Doing so could make you very sad.
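If you want to watch how close you're getting to that ceiling during a render instead of trusting the default fan curve, the `nvidia-smi` tool that ships with the Nvidia drivers can report temperature and load. Here's a minimal Python sketch; the function names and the 80C warning threshold are my own choices, not anything Nvidia or Daz provides:

```python
import subprocess

def parse_gpu_stats(csv_text):
    """Turn nvidia-smi CSV rows of 'temperature.gpu, utilization.gpu'
    (queried with --format=csv,noheader,nounits) into (temp_C, load_pct) pairs."""
    stats = []
    for line in csv_text.strip().splitlines():
        temp, load = (int(field) for field in line.split(","))
        stats.append((temp, load))
    return stats

def query_gpu_stats():
    """Ask nvidia-smi for the current temperature and load of every GPU."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return parse_gpu_stats(out)

def warn_if_hot(stats, limit_c=80):
    """Return indices of GPUs running close to the 83C throttle point."""
    return [i for i, (temp, _load) in enumerate(stats) if temp >= limit_c]
```

Call `query_gpu_stats()` in a loop while a render is going; if `warn_if_hot()` ever comes back non-empty, your fan profile isn't keeping up with the load.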
Makes me wonder if I would've been better off water cooling my GPUs? I'd heard water cooling isn't worth the money. That's probably true for a gamer - but for doing Daz renders it might be much better than air cooling. Unfortunately, I can't test that. Anyone else have experience with water cooled cards running iRay?
Oh, and one more comment, you probably don't want to leave your viewport in iRay preview mode all the time. No need to put your GPUs under continuous maximum load all day long.

Comments
GPU wear concerns me greatly too. I don't run Quadro cards, but I do run very expensive cards nonetheless. I set custom fan profiles for them, and I do wince when my poor GPUs sound as if they are about to launch out of the case. This is probably an ill-founded fear, however, since mechanical degradation tends to only affect moving parts. There was a great Linus Tech Tips video on this subject here. He pits an old test-bench 480 with plenty of air time against a brand new one, and there is barely any difference.
Funny enough, I have found that Iray is much more stressful on my processor(s) than on my gfx card ;). The 980Ti seems to hold its own, while the procs do get a little bit hot. I don't leave a render running all night long anymore for that reason, and I keep an eye on the temps when the render is running.
Laurie
My CPU holds up pretty well, but I don't use it to support iRay. I have heard the CPU fan ramp up during 3Delight renders - but those tend to be much shorter than iRay renders. I also put a pretty well rated aftermarket cooler on my CPU. It only cost $25, but it's pretty effective.
Great link, thanks. Makes me feel a lot better about those GPU loads.
...however, if your scene exceeds GPU memory, it dumps to the CPU until it finishes. Avoiding this requires getting a card with enough memory to render, say, 90% - 95% of your scenes. For myself that means at least a Titan-X (12 GB) if not a Quadro P5000 (16 GB).
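For back-of-the-envelope planning, the big VRAM consumers are usually textures, and you can estimate their footprint before buying a card. A rough sketch follows; the formula and function names are my own, not anything Iray publishes, and Iray's actual usage also covers geometry, the frame buffer, and working space:

```python
def texture_vram_bytes(width, height, channels=4, mip_chain=True):
    """Rough uncompressed footprint of one texture on the GPU.
    Assumes 1 byte per channel; a full mip chain adds about 1/3 on top."""
    base = width * height * channels
    return base * 4 // 3 if mip_chain else base

def scene_texture_gb(texture_sizes):
    """Sum the estimate over a list of (width, height) texture sizes, in GB."""
    total = sum(texture_vram_bytes(w, h) for w, h in texture_sizes)
    return total / 1024**3
```

A single 4096 x 4096 RGBA map is already ~64 MB before mipmaps, so a few fully textured figures plus an environment can blow past a 6 GB card surprisingly fast.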
Yeah, I have 6 gigs on my card but I very frequently make scenes that go well over that, so I push those Xeons to the max. And boy, do they get hot ;)
Laurie
GPU makers need to consider switching their silicon to Gallium Nitride (GaN), which is an absolute beast at handling high temperatures. I've personally seen it used for output amplifiers in extremely high-powered RF communications circuits with continuous-wave (i.e. always blasting full power) applications that would fry anything else in seconds, and it shrugs it off like a bad case of the sniffles and keeps trucking. To test it we actively tried to burn it out and couldn't; everything else on the boards started popping, but the GaN amplifiers were fine. It still needs to be cooled, of course, but it tolerates a lot more constant use and will last a lot longer, I believe.
Still a ways off though, it's sort of niche stuff for the time being. But this stuff moves quickly once it begins getting some momentum.
Quite true, but this is more of a glass-half-full/glass-half-empty argument in my opinion. I spent years working with a GTX 750M - which has only 2GB of VRAM. I learned how to be VERY efficient at creating my renders to keep them under 2GB most of the time. Now that I have a 6GB card I'm already getting "lazy" and letting my renders bloat up a bit. I can see the day, not too far off, when everything I do will be right on the edge of 6GB. If you buy a card with 12GB, or 16GB, or 32GB, you will just get used to working within the limitation you have. It's like that old saying about the household budget: "expenses always rise to consume all available income." So... it just comes down to how much you want to spend.
...for my scenes, a Quadro P5000 would probably do, just that I do not have 2,500$ burning a hole in my pocket. I was hoping that the Pascal Titan-X would have been upgraded to 16 GB especially given the price increase. That would give me enough overhead for the rendering process as well as the base filesize. Keep in mind that the Daz programme's UI also takes up VRAM.
And then there's Windows eating some of it I've heard, though I'm not sure if it's only Windows 10.
That's where having a card to manage the display functions and do no rendering helps. Yes, Windows will still eat up some on the non-display card, but that's all that will.
...as I understand it, it is pretty much W10 that reserves an increasingly larger percentage of VRAM the more your card has. W7 will use up more if you use the Aero UI (which I have had turned off since day one). GPU-Z indicates that in idle mode, 63 MB of the full 1024 available on my old GTX 460 is being used for running the dual displays.
The Daz programme takes up about 258 MB.
Use an aggressive fan profile. I'm just on air cooling and never, ever breach 61°C, even with long runs. Keep your case clean, and you can buy a 10-year warranty with EVGA's cards, which would likely be longer than you'll own the card. That warranty can be transferred, too, raising the card's resale value for you later on. All problems solved!
I've been playing with Superfly, aka the Blender Cycles render engine, for Poser Pro 11. I just upgraded - the upgrade price was right, so I pulled the trigger. I really like this render engine. It is really fast with my new Iray video cards, but I just got a 2.9 GHz 8-core Xeon that works great with Cycles all by itself. If you don't want to spend a lot of money on video cards and still want to render iray-quality photos, you can get some nice Xeon processors for a fraction of the cost of high-end Nvidia video cards, and you'll have a really nice system to do your Poser renders on. On a side note, I wish we had Blender Cycles for Daz Studio. It is a really nice render engine, and it is open source and FREE. Hint hint.
...is that the Xeon E5-2690 Sandy Bridge?
I have two 2.6 GHz 8-core Xeons and 64 gigs of RAM, and Iray doesn't run too badly on it either ;). Haven't tried Blender Cycles yet. Never bought Poser 11.
Laurie
..so are you running the E5-2640-V3 Haswells? That's like 1,900$ just for the CPUs.
I'm looking at the 2.4 GHz E5-2630-V3 which are currently 699$ ea because I plan to use Quad Channel DDR4 memory.
Thanks for the advice. Will try it.
@kyoto kid - Mine are Sandy Bridge with DDR3 ram if it's me you're asking ;). I got the computer factory refurbished at a really good price.
Laurie
...ah, so those are Xeon E5-2690s then. Not bad price-wise, only about 450$ ea at Newegg.
Yeah, that would bring the price down by 600$ over the E5-2630-v3; however, in building a new system I'd like to move forward in technology, which is why I am looking at quad channel DDR4. I do notice, though, that the E5-2690 and the LGA 2011 socket do support quad channel DDR3, and some boards up to 512 GB. What I need to find out now is what the performance differential between DDR3 and DDR4 is. I am currently looking for a 4 x 32 GB kit and do not see any in DDR3, as all the 128 GB kits are 8 x 16 GB.
Are you using server memory?
(Oops, this was supposed to be here, but I accidentally posted it over in another related thread. Anyway, I also copied it here.)
A lot of people seem to overestimate the demand on GPU memory because they look at how much RAM a scene uses in DS. That is not a useful measure. Here is an example of a rather complex setup that I just finished. It contains the following elements:
4 Genesis 3 figures with clothing and full textures (plus 2 instances of one)
Urban Future 4 set
7 different vehicles (4 cars, 1 trike, 1 motorcycle, 1 flying drone)
1 Robot
1 cat
1 rat
many different props (vending machine, advertisement screens, rain system, puddles)
All of these are currently at standard resolution textures. I can render this scene on my GTX 1080 in ~2.5h (in 2400*1096 pixel) and it uses 7.2 GB of VRAM. The GPU sits at about 60C the whole time (I have a fairly nice box with 8 fans and a watercooled CPU). The displays are driven by a second card (GTX 980), which I also use for rendering if the scene is below 4 GB.
I could easily add more to the scene by reducing some textures and subD levels. Anyway, as one can see, a monster GPU with 12 GB or more of VRAM is not always needed. Oh, and this scene uses between 8 and 24.5 GB of RAM when working with the scene or rendering via CPU.
Ciao
TD
...my railway station scene with the Daz programme open is 8.9 GB. While rendering, it drops into swap mode, as I only have 10.7 GB of free memory available after Windows and system utilities.
The test I ran with a single figure in the "T" pose with basic clothing, short hair, a simple ambient environment, a single photometric spotlight, and a plain backdrop at 600 x 600 took a total of 924 MB of my 1 GB GTX 460 (384 CUDA cores). All 3DL materials were manually converted to Iray ones before rendering. What I found interesting is that the CPU test took about two minutes less than the GPU one (I made sure only one mode was checked in the advanced settings and didn't use OptiX acceleration).
At 900 x 900 resolution it exceeded the card's memory in just under two minutes.
@thd777 "I can render this scene on my GTX 1080 in ~2.5h (in 240*1096 pixel) and it uses 7.2Gb of vram"
If you didn't mis-type the render dimensions, that's nothing to brag about, especially with a 1080.
Haha. That would also be a very odd aspect ratio. It is 2400*1096 (fixed it). The reason it is taking that long is the large number of emissive surfaces. If I remove those and just use the HDRI it is done in about 15 min. But it doesn't look as cool.
TD
I live in the Caribbean and have no AC. I have a ZOTAC GTX 960; the fans don't go above 55% of maximum speed, and the card's temp has never gone above 65C.
I will be adding a ZOTAC GTX 1080 Founders Edition. Hopefully these temps will remain the same, as the case has excellent airflow.
Intel Xeon E5-2690 SR0L0, 2.90 GHz, 20 MB cache, 8 GT/s, LGA 2011, eight-core CPU - $169 from eBay. I got two. These are DDR3 processors, but I don't care. I upgraded two systems to 8 cores, and they have Cooler Master Hyper 212 coolers with Arctic Silver 5, so these processors don't get hot. Really great deal.
Nice render, thd777. I would like Daz to look into ways to automatically lower the quality of the textures of background items so they fit better in our video cards. If something is in the distance, why do I need HD textures for it? And why, if the camera is not even on something, do all its textures have to be loaded into video memory?
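For what it's worth, the logic being asked for here is basically automatic mip selection. A toy sketch of the idea (my own heuristic, not anything Daz Studio actually implements): pick the smallest power-of-two texture size that still supplies roughly one texel per screen pixel for how large the object appears.

```python
import math

def target_texture_size(full_res, object_px_on_screen):
    """Smallest power-of-two texture edge that still gives at least one
    texel per screen pixel for an object covering object_px_on_screen pixels.
    Anything larger is VRAM spent on detail the camera can't resolve."""
    needed = min(full_res, max(1, object_px_on_screen))
    size = 2 ** math.ceil(math.log2(needed))
    return min(size, full_res)
```

A background prop covering ~200 pixels only needs a 256-pixel texture even if it ships with a 4096 one - that's (4096/256)**2 = 256 times less texture memory for that prop.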
Windows 8.0 does not consume any.
As I understand it, it does, and so does Windows 7, but a much smaller amount.
Can someone tell me the time for the DAZ3D benchmark render file on a single Xeon E5-2690 system and a dual Xeon E5-2690 system, please?
(render scene & thread here : http://www.daz3d.com/forums/discussion/53771/iray-starter-scene-post-your-benchmarks/p1 )