Show Us Your Iray Renders

Comments

  • Oso3D Posts: 15,095

    I have no idea. Every time I set Render SubD to 3, I had to power-cycle my computer. :/

  • Dumor3D Posts: 1,316

    Oso3D said:
    I have no idea. Every time I set Render SubD to 3, I had to power-cycle my computer. :/

    Have you tried it since turning OptiX off? (I have no idea if that could cause the issue.) Maybe try setting Render SubD to level 2, or one above the SubD level? Does this happen when you set it, or when you start the render?

    To me, this is somewhat like HD for 3DL. It leaves the scene in the viewport at the default SubD, so OpenGL doesn't bog down, while the higher SubD is only sent to the render engine.

    Anyway, I'm trying to learn here. Trying to figure out if there is a bug or if my plan to use this in some of my products is not a good idea. Thanks!

  • Oso3D Posts: 15,095

    I'm pretty sure it happened after I shut off OptiX.

    The thing is, the outpost is one of those 'big collections all baked together.' Normally I'd only make stuff 'in frame' high resolution, but I couldn't, alas!

  • Dumor3D Posts: 1,316

    Oso3D said:
    I'm pretty sure it happened after I shut off OptiX.

    The thing is, the outpost is one of those 'big collections all baked together.' Normally I'd only make stuff 'in frame' high resolution, but I couldn't, alas!


    Thanks Timmins! Gotcha! Yeah, if you're having to do the whole scene, that could run it up in a hurry! I appreciate the feedback. It helps my decision process.
  • Oso3D Posts: 15,095

    Glad to kibitz! We all have a lot to learn.

  • Gogger Posts: 2,508
    edited March 2015

    SO EXCITED! I just picked up the Nvidia GTX 970 video card and while it is installed, I am having to wait for the video drivers to download. It must be that Nvidia's servers are slow because it is taking almost two hours for a 290MB file - sheesh! I download that much, or more, product from DAZ in just a minute or two. I am really excited to take DAZ + Iray for a spin with it though! I'm upgrading from a GTX680 which was actually pretty respectable so am hoping the 970 will justify the $380 cost - ouch!

    WOOHOO!!! The download just finished. MEEP-MEEP!

  • TJohn Posts: 11,352

    Oso3D said:
    Woof. So, the advice was very helpful and I learned a lot from this one effort --

    A problem with OptiX acceleration confused me for a long time, and I had thought that transparency on hair was bugged. Learned that THAT wasn't true -- the hair isn't perfect, but it looks waaaay stonking better properly set up.

    Lighting! While bluish institutional lighting might make thematic sense, it looks like crap, so ... art wins! Also added a soft box low down to add crucial lighting elements.

    Some of the comments about eyes made me fix a few problems (I hadn't noticed the gray base color on corneas, and I upped the reflective elements on cornea/sclera). Tweaked skin a little -- I hadn't realized I had a proper translucence map, so I bumped translucence from .5 to 1 (which AVERAGES to about .5, but more interestingly).

    Finally, I kept running into problems with the outpost model -- it's Base resolution, and I was getting shadow checkerboards in the background. Grrr.

    SO then I thought... well, duh, blur it with Depth of Field! And not only did that fix the problem, it ALSO made for a much better-looking image.

    Weirdly, the render hung at about 2400 iterations/4 hours... but it looks darn good at that point, so I'll take it.

    The blue image is the 'before.' I went with a dark cream color on the walls to help with contrast.


    Very nice results on the new render. :)
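    The translucency tweak described above (bumping the weight from .5 to 1 with a proper translucency map, which still "averages to about .5") can be sketched in a few lines of Python. The sample values below are made up for illustration, not taken from any actual map.

    ```python
    # Illustrative sketch: a grayscale translucency map at weight 1.0 can
    # average out near 0.5 while still varying per pixel, unlike a flat
    # 0.5 value everywhere. Sample values are hypothetical.
    map_values = [0.2, 0.4, 0.6, 0.8]              # made-up map samples
    weight = 1.0
    effective = [v * weight for v in map_values]   # per-pixel translucency
    print(round(sum(effective) / len(effective), 3))  # mean ~0.5, but varied
    ```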
  • Gogger Posts: 2,508

    Gogger said:
    SO EXCITED! I just picked up the Nvidia GTX 970 video card and while it is installed, I am having to wait for the video drivers to download. It must be that Nvidia's servers are slow because it is taking almost two hours for a 290MB file - sheesh! I download that much, or more, product from DAZ in just a minute or two. I am really excited to take DAZ + Iray for a spin with it though! I'm upgrading from a GTX680 which was actually pretty respectable so am hoping the 970 will justify the $380 cost - ouch!

    WOOHOO!!! The download just finished. MEEP-MEEP!

    Hmm... well. THAT was unexpected. Rendering the EXACT same scene as before, the GTX970 seems SLOWER than the GTX680 I was using before. The picture is exactly the same in the end, but, and I would have to check it, seemed like on the GTX680 it took around 30 minutes to render the "Chickens at the Monastery" scene and just over an hour with the 970. I did a test with only GPU and now CPU+GPU and I must say there is no noticeable difference. At 15 minutes in it is at 1182 Iray iterations and 15% finished. I was hoping for $380 I could say "WOW!" instead of "Meh". I'll do some more tests and officially keep track of specs, but right this moment I'm thinking it is a good thing I kept the receipt.
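    A rough ETA can be worked out from the figures in the post above (15 minutes elapsed at 15% complete). This is a simple linear extrapolation, so it will understate the real time if convergence slows later in the render, as it typically does.

    ```python
    # Rough render ETA from progress so far (illustrative only; numbers
    # come from the post above, not from any Iray API).
    def estimate_total_minutes(elapsed_min: float, fraction_done: float) -> float:
        """Linear extrapolation: total time = elapsed / fraction complete."""
        if not 0 < fraction_done <= 1:
            raise ValueError("fraction_done must be in (0, 1]")
        return elapsed_min / fraction_done

    # 15 minutes elapsed at 15% complete:
    eta = estimate_total_minutes(15, 0.15)
    print(f"Estimated total render time: {eta:.0f} minutes")  # 100 minutes
    ```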

  • 8eos8 Posts: 170
    edited March 2015

    Gogger said:
    Hmm... well. THAT was unexpected. Rendering the EXACT same scene as before, the GTX970 seems SLOWER than the GTX680 I was using before. The picture is exactly the same in the end, but, and I would have to check it, seemed like on the GTX680 it took around 30 minutes to render the "Chickens at the Monastery" scene and just over an hour with the 970. I did a test with only GPU and now CPU+GPU and I must say there is no noticeable difference. At 15 minutes in it is at 1182 Iray iterations and 15% finished. I was hoping for $380 I could say "WOW!" instead of "Meh". I'll do some more tests and officially keep track of specs, but right this moment I'm thinking it is a good thing I kept the receipt.

    A GTX 680 has 1536 CUDA cores, while a 970 has 1664 cores, so I don't think it would be significantly faster (but it shouldn't be twice as slow either....!) Where you win with a 970 is having twice as much memory to use for scene textures and geometry.
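    8eos8's core-count comparison works out like this. It is a back-of-envelope ratio only: real Iray throughput also depends on clock speed, architecture, and drivers.

    ```python
    # Crude CUDA-core comparison from the post above; core count alone
    # is only a rough proxy for Iray rendering speed.
    cards = {"GTX 680": 1536, "GTX 970": 1664}
    ratio = cards["GTX 970"] / cards["GTX 680"]
    print(f"Core-count ratio 970/680: {ratio:.2f}x")  # ~1.08x
    ```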

  • Gogger Posts: 2,508

    8eos8 said:
    Gogger said:
    Hmm... well. THAT was unexpected. Rendering the EXACT same scene as before, the GTX970 seems SLOWER than the GTX680 I was using before. The picture is exactly the same in the end, but, and I would have to check it, seemed like on the GTX680 it took around 30 minutes to render the "Chickens at the Monastery" scene and just over an hour with the 970. I did a test with only GPU and now CPU+GPU and I must say there is no noticeable difference. At 15 minutes in it is at 1182 Iray iterations and 15% finished. I was hoping for $380 I could say "WOW!" instead of "Meh". I'll do some more tests and officially keep track of specs, but right this moment I'm thinking it is a good thing I kept the receipt.

    A GTX 680 has 1536 CUDA cores, while a 970 has 1664 cores, so I don't think it would be significantly faster (but it shouldn't be twice as slow either....!) Where you win with a 970 is having twice as much memory to use for scene textures and geometry.

    Hmm... Thanks for the info - admittedly I didn't do my proper research; I got all excited by Iray and jumped. I just thought: newer technology, more everything, less energy consumption. I might consider investing in another GTX 680 for an SLI configuration, but I'm not sure it would really buy me anything since the card is still pretty expensive at around $200.

    Aaaaand I just Googled and could have saved myself a trip into town and the hassle of a return. Seems like comparing the 680 to the 970 reveals very, very little bang for the buck.

  • Gogger Posts: 2,508

    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop, so I know it can handle them, but am wondering if anyone knows whether pairing two different manufacturers' 680 cards will cause me grief - I think I read that it WILL. I have a straight Nvidia GTX 680 that came with my system but can't seem to find the same one online for sale.

  • Dumor3D Posts: 1,316

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    You will need the 4GB card for Studio Iray; you can easily run out of VRAM with 2GB. Iray cannot use SLI, and Iray has to be able to store the entire scene on 'each' card. It does not share RAM between cards. It adds up CUDA cores, but does not add up VRAM.

    If you can put the 740 in with your 680, you will get about a doubling of performance, as long as the scene can fit onto the 2GB card. If your scene goes past that, the 680 should drop out and the 740 should continue.
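    The multi-GPU behavior Dumor3D describes (the whole scene duplicated on every card, cards that can't fit it dropping out, cores adding up while VRAM does not) can be sketched as a toy model. The `Gpu` class and the numbers here are illustrative, not any actual Iray API.

    ```python
    # Toy model of the multi-GPU rule described above: a card only
    # participates if the entire scene fits in its own VRAM; CUDA cores
    # of participating cards add up, VRAM does not pool.
    from dataclasses import dataclass

    @dataclass
    class Gpu:
        name: str
        vram_gb: float
        cuda_cores: int

    def participating(gpus, scene_gb):
        """Cards that can hold the whole scene in their own VRAM."""
        return [g for g in gpus if g.vram_gb >= scene_gb]

    gpus = [Gpu("GTX 680", 4.0, 1536), Gpu("GT 740", 2.0, 384)]

    small = participating(gpus, 1.5)   # fits on both cards -> cores add up
    large = participating(gpus, 3.0)   # the 2GB card drops out

    print([g.name for g in small], sum(g.cuda_cores for g in small))
    print([g.name for g in large], sum(g.cuda_cores for g in large))
    ```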

  • Toyen Posts: 2,048

    My latest skin shader developments!

    skin.png
    917 x 1125 - 1M
  • Robert Freise Posts: 4,623

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    No it won't, or shouldn't, according to Nvidia:

    http://www.geforce.com/hardware/technology/sli/faq#c17

  • none01ohone Posts: 862

    As I'm running an AMD A10 4-core only, my Iray CPU render times are in hours. Just as well I'm used to long Luxrender render times.
    I ran SickleYield's benchmark scene, which timed out at 2 hrs and 95% complete.

    This scene, CPU only, cooked for 4 hours with 4000 iterations.
    It's nice to have another render option. Thanks Daz.

    harli-pool-final-4000IT-out.png
    800 x 1173 - 1M
  • RAMWolff Posts: 10,369

    Didn't Mr. Spooky mention that bridged cards are actually not optimal for Iray?

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    No it wont or shouldn't according to Nvidia

    http://www.geforce.com/hardware/technology/sli/faq#c17

  • Dumor3D Posts: 1,316

    none01ohone said:
    As I'm running an AMD A10 4core only, my Iray CPU render times are in hours. Just as well I'm used to long Luxrender render times.
    I ran SickleYield's benchmark scene which timed out at 2hrs and 95% complete.

    This scene CPU only, cooked for 4hours with 4000 iterations.
    It's nice to have another render option. Thanks Daz.

    Very nice render! :) Yes, I love LuxRender and have been doing hours long renders for a very long time. CPU only with Iray is not a lot different. I'm just happy that we have a free addition which can use GPU and get equal output to CPU. When I tried GPU with Lux, the quality was obviously different.

    Again, good work!

  • Dumor3D Posts: 1,316

    RAMWolff said:
    Didn't Mr. Spooky mention that bridged cards are actually not optimal for iRay??

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    No it wont or shouldn't according to Nvidia

    http://www.geforce.com/hardware/technology/sli/faq#c17

    Yes, in fact Nvidia says not to run in SLI mode when doing Iray renders, as there can be conflicts. So, it is a bit worse than just not compatible. :)

  • SickleYield Posts: 7,649

    Toyen said:
    My latest skin shader developments!

    Nice progress there!

  • NoName99 Posts: 322

    Kamion99 said:
    Finished this one. I love how the dress fabric turned out.

    Render time wasn't even bad, under 2 hours and frankly it was pretty acceptable after 20 minutes. I just let it go because it was time for dinner.

    Some minor color correction and the signature, but other than that no postwork.

    Up in my gallery at a higher resolution.

    This is beautifully composed, great work.

  • Gogger Posts: 2,508

    Dumor3D said:
    RAMWolff said:
    Didn't Mr. Spooky mention that bridged cards are actually not optimal for iRay??

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    No it wont or shouldn't according to Nvidia

    http://www.geforce.com/hardware/technology/sli/faq#c17

    Yes, in fact Nvidia states to not run in SLI mode when doing Iray renders as there can be conflicts. So, it is a bit worse than just not compatible. :)

    Thanks for all the info! I know 4GB of VRAM is way better than 2, but I run Vue Complete 2014, Carrara Pro 8.5, Poser Pro 2012/2014 (often with Photoshop at the same time!), and DAZ with Iray, and so far do not have any issues. I just can't justify FOUR-HUNDRED dollars for no noticeable gain in performance. I guess if I hit that 2GB wall I will reconsider, and the prices will only come down and down as time whistles by.

    I appreciate all the opinions and answers though!

  • Robert Freise Posts: 4,623

    Gogger said:
    Dumor3D said:
    RAMWolff said:
    Didn't Mr. Spooky mention that bridged cards are actually not optimal for iRay??

    Gogger said:
    So I am returning the GTX 970 and thinking of investing half that cash in another GTX 680 for SLI. I'm running an Alienware R4 desktop so know it can handle them but am wondering if anyone knows if trying to pair two different manufacturer's 680 cards will cause me grief - I think I read that it WILL. I have a straight NVidia GTX680 that came with my system but can't seem to find the same online for sale.

    No it wont or shouldn't according to Nvidia

    http://www.geforce.com/hardware/technology/sli/faq#c17

    Yes, in fact Nvidia states to not run in SLI mode when doing Iray renders as there can be conflicts. So, it is a bit worse than just not compatible. :)

    Thanks for all the info! I know 4GB of vram is way better than 2, but I run Vue Complete 2014, Carrara Pro 8.5, Poser Pro 2012/2014 (often with Photoshop at the same time!), and DAZ with Iray and so far do not have any issues. I just can't justify FOUR-HUNDRED dollars for no noticeable gain in performance. I guess if I hit that 2GB wall I will reconsider, and the prices will only come down and down as time whistles by.

    I appreciate all the opinions and answers though!

    https://devtalk.nvidia.com/default/topic/493847/iray-needs-neither-sli-nor-cuda-/

  • Stalest Posts: 880

    Sometimes you meet the strangest people on the stairs!

    aiko_n_snake_g.png
    1600 x 1314 - 2M
  • TJohn Posts: 11,352

    Using Design Anvil's 3-point lighting

    Bad_Reputation.jpg
    1300 x 1500 - 485K
  • Zilvergrafix Posts: 1,385

    iRay Fighter, Postworked and raw renders

    fight1.png
    1920 x 1080 - 4M
    fightIray1.png
    1920 x 1080 - 5M
  • Jabba Posts: 1,461

    Trying Darius in Iray...

    Darius_Iray_1.jpg
    1400 x 2000 - 1M
  • alexhcowley Posts: 2,406

    tjohn said:
    Using Design Anvil's 3-point lighting

    This is really impressive and far better than I have managed to achieve using the same lighting set.

    The lighting on mine looks very harsh, the skin looks rather too glossy and there's some weird whitish highlights on the lips. Any thoughts on how I can improve it?

    Cheers,

    Alex.

    Claustrum_Test_A5_Detail.jpg
    850 x 573 - 63K
    Claustrum_Test_A5.jpg
    2000 x 1333 - 455K
  • 3dTox Posts: 82

    Just messing around with bloom and camera settings. Scene is lit entirely by environment map. No post work.

    Terminated.jpg
    1680 x 1050 - 710K
  • alexhcowley Posts: 2,406

    tjohn said:
    Using Design Anvil's 3-point lighting

    This is really impressive and far better than I have managed to achieve using the same lighting set.

    The lighting on mine looks very harsh, the skin looks rather too glossy and there's some weird whitish highlights on the lips. Any thoughts on how I can improve it?

    Cheers,

    Alex.

    Fixed. I used most of the values suggested by 8eos8 in this thread:

    http://www.daz3d.com/forums/discussion/54239/

    The only ones I changed were the glossy reflectivity, which I dialed all the way down to 0.5, and the translucency HSR values, which I ended up resetting to the original values, since 8eos8's suggestion is only appropriate for white Anglo-Saxon skin and my lady has much darker skin.

    Cheers,

    Alex.

This discussion has been closed.