Show Us Your Iray Renders
This discussion has been closed.
Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2026 Daz Productions Inc. All Rights Reserved.
Comments
I have no idea. Every time I set Render SubD to 3, I had to power-cycle my computer. :/
Have you tried it since turning OptiX off? (I have no idea if that could cause the issue.) Maybe try setting Render SubD to level 2, or one above the viewport SubD level? Does this happen when you set it or when you start the render?
To me, this is somewhat like HD for 3DL. It leaves the scene in the viewport at the default SubD, so OpenGL doesn't bog down, while the higher SubD is only sent to the render engine.
Anyway, I'm trying to learn here. Trying to figure out if there is a bug or if my plan to use this in some of my products is not a good idea. Thanks!
I'm pretty sure it happened after I shut off OptiX.
The thing is, the outpost is one of those 'big collections all baked together.' Normally I'd only make stuff 'in frame' high resolution, but I couldn't, alas!
Thanks Timmins! Gotcha! Yeah, if you're having to do the whole scene, that could run it up in a hurry! I appreciate the feedback. It helps my decision process.
Glad to kibitz! We all have a lot to learn.
SO EXCITED! I just picked up the Nvidia GTX 970 video card, and while it is installed, I am having to wait for the video drivers to download. Nvidia's servers must be slow, because it is taking almost two hours for a 290MB file - sheesh! I download that much, or more, in product from DAZ in just a minute or two. I am really excited to take DAZ + Iray for a spin with it, though! I'm upgrading from a GTX 680, which was actually pretty respectable, so I'm hoping the 970 will justify the $380 cost - ouch!
WOOHOO!!! The download just finished. MEEP-MEEP!
Very nice results on the new render. :)
Hmm... well. THAT was unexpected. Rendering the EXACT same scene as before, the GTX 970 seems SLOWER than the GTX 680 I was using. The final picture is exactly the same, but, and I would have to check it, it seemed like the GTX 680 took around 30 minutes to render the "Chickens at the Monastery" scene, versus just over an hour with the 970. I did a test with GPU only and then CPU+GPU, and I must say there is no noticeable difference. At 15 minutes in, it is at 1182 Iray iterations and 15% finished. I was hoping for $380 I could say "WOW!" instead of "Meh". I'll do some more tests and officially keep track of specs, but right this moment I'm thinking it is a good thing I kept the receipt.
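A quick back-of-the-envelope projection from those progress figures (1182 iterations and 15% complete at the 15-minute mark). This assumes progress stays roughly linear, which a path tracer doesn't guarantee, so treat it as a rough estimate only:

```python
# Figures reported in the post above.
minutes_elapsed = 15
percent_done = 15
iterations_so_far = 1182

# Linear extrapolation of total render time and throughput.
estimated_total_minutes = minutes_elapsed / (percent_done / 100)
iterations_per_minute = iterations_so_far / minutes_elapsed

print(f"Estimated total render time: {estimated_total_minutes:.0f} minutes")  # 100 minutes
print(f"Throughput: {iterations_per_minute:.1f} iterations/minute")           # 78.8
```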
A GTX 680 has 1536 CUDA cores, while a 970 has 1664 cores, so I don't think it would be significantly faster (but it shouldn't be twice as slow either....!) Where you win with a 970 is having twice as much memory to use for scene textures and geometry.
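The core counts above make the point concrete. A rough ratio sketch, using the specs given in the post (the VRAM figures are the 2GB and 4GB variants discussed in this thread):

```python
# Card specs as quoted in the post above.
gtx_680 = {"cuda_cores": 1536, "vram_gb": 2}
gtx_970 = {"cuda_cores": 1664, "vram_gb": 4}

core_ratio = gtx_970["cuda_cores"] / gtx_680["cuda_cores"]
vram_ratio = gtx_970["vram_gb"] / gtx_680["vram_gb"]

print(f"Core-count ratio: {core_ratio:.2f}x")  # ~1.08x, so little raw speed gain
print(f"VRAM ratio: {vram_ratio:.1f}x")        # 2.0x, the real win for big scenes
```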
Hmm.. Thanks for the info - admittedly I didn't do my proper research, I got all excited by Iray and jumped. I just thought, newer technology, more everything, less energy consumption. I might consider investing in another GTX680 for a SLI configuration but am not sure if it would really buy me anything since the card is still pretty expensive at around $200.
Aaaaand I just Googled and could have saved myself a trip into town and the hassle of a return. Seems like comparing the 680 to the 970 reveals very, very little bang for the buck.
So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop, so I know it can handle them, but I am wondering if anyone knows whether pairing two different manufacturers' 680 cards will cause me grief - I think I read that it WILL. I have a straight Nvidia GTX 680 that came with my system but can't seem to find the same one online for sale.
You will need the 4GB for Studio Iray; you can easily run out of VRAM with 2GB. Iray cannot use SLI, and it has to be able to store the entire scene on each card. It does not share RAM between cards: it adds up CUDA cores, but does not add up VRAM.
If you can put the 970 in with your 680, you will get about a doubling of performance, as long as the scene can fit onto the 2GB card. If your scene goes past that, the 680 should drop out and the 970 should continue.
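The rule described above (every participating card must hold the entire scene, and any card that can't drops out) can be sketched as a small model. This is purely illustrative; `cards_that_fit` is a hypothetical helper, not an Iray API:

```python
def cards_that_fit(scene_size_gb, cards):
    """Model of the Iray multi-GPU rule described above: the full scene
    is replicated on every participating GPU, so any card with less
    VRAM than the scene needs drops out of the render."""
    return [name for name, vram_gb in cards.items() if vram_gb >= scene_size_gb]

cards = {"GTX 680": 2, "GTX 970": 4}  # VRAM in GB, per the posts above

print(cards_that_fit(1.5, cards))  # both cards render, cores add up
print(cards_that_fit(3.0, cards))  # the 2GB card drops out, the 4GB card continues
```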
My latest skin shader developments!
No, it won't, or shouldn't, according to Nvidia:
http://www.geforce.com/hardware/technology/sli/faq#c17
As I'm running only a 4-core AMD A10, my Iray CPU render times are in hours. Just as well I'm used to long LuxRender render times.
I ran SickleYield's benchmark scene, which timed out at 2 hours and 95% complete.
This scene, CPU only, cooked for 4 hours with 4000 iterations.
It's nice to have another render option. Thanks Daz.
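For a sense of scale, here is the throughput of that CPU-only render next to the GPU figures reported earlier in the thread. The two scenes are different, so the ratio is only illustrative, not a proper benchmark:

```python
# CPU-only render above: 4000 iterations in 4 hours.
cpu_iter_per_min = 4000 / (4 * 60)
# GPU render reported earlier: 1182 iterations in 15 minutes.
gpu_iter_per_min = 1182 / 15

print(f"CPU: {cpu_iter_per_min:.1f} iterations/min")  # 16.7
print(f"GPU: {gpu_iter_per_min:.1f} iterations/min")  # 78.8
# Different scenes, so only a rough indication of relative speed.
print(f"GPU roughly {gpu_iter_per_min / cpu_iter_per_min:.0f}x faster here")
```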
Didn't Mr. Spooky mention that bridged cards are actually not optimal for Iray?
Very nice render! :) Yes, I love LuxRender and have been doing hours-long renders for a very long time. CPU-only with Iray is not a lot different. I'm just happy that we have a free addition which can use the GPU and get output equal to the CPU. When I tried GPU with Lux, the quality was obviously different.
Again, good work!
Yes, in fact Nvidia states not to run in SLI mode when doing Iray renders, as there can be conflicts. So, it is a bit worse than just not compatible. :)
Nice progress there!
This is beautifully composed, great work.
Thanks for all the info! I know 4GB of VRAM is way better than 2, but I run Vue Complete 2014, Carrara Pro 8.5, Poser Pro 2012/2014 (often with Photoshop at the same time!), and DAZ with Iray, and so far I do not have any issues. I just can't justify FOUR-HUNDRED dollars for no noticeable gain in performance. I guess if I hit that 2GB wall I will reconsider, and the prices will only come down and down as time whistles by.
I appreciate all the opinions and answers though!
https://devtalk.nvidia.com/default/topic/493847/iray-needs-neither-sli-nor-cuda-/
Sometimes you meet the strangest people on the stairs!
Using Design Anvil's 3-point lighting
Thanks for this link. Good info there.
iRay Fighter, Postworked and raw renders
Trying Darius in Iray...
This is really impressive and far better than I have managed to achieve using the same lighting set.
The lighting on mine looks very harsh, the skin looks rather too glossy, and there are some weird whitish highlights on the lips. Any thoughts on how I can improve it?
Cheers,
Alex.
Just messing around with bloom and camera settings. Scene is lit entirely by environment map. No post work.
Fixed. I used most of the values suggested by 8eos8 in this thread:
http://www.daz3d.com/forums/discussion/54239/
The only ones I changed were the glossy reflectivity, which I dialed all the way down to 0.5, and the translucency HSR values, which I ended up resetting to the original values, since 8eos8's suggestion is only appropriate for white Anglo-Saxon skin tones and my lady has much darker skin.
Cheers,
Alex.