Unbiased Render Thread - Post Your Renders!! (Reality/Lux, Luxus/Lux, Octane Render, and others?)


Comments

  • erik_nl Posts: 6
    edited March 2015

    dustrider said:

    Your skin shaders are outstanding!!!

    Thanks!
    So far I've never seen the need for complicated piles of layers and effects.
    As far as I can tell, nearly all of it depends on the quality of the original textures; all I do is set a few surface parameters to my liking.

    Cheers!

    Erik

    (oops, wrong account..... oh whatever)

    MRL_Gabi_on_Genesis_2_-_test_10.jpg
    1200 x 1200 - 319K
    Post edited by erik_nl on
  • erik leeman Posts: 262
    edited December 1969

    These are the first renders I've done with MindVision's Actual Hair 2, and I'm impressed with how nice it looks!
    Rendered using the Octane plugin (v1.2 as before) for DAZ Studio.

    Cheers!

    Erik

    MRL_Gabi_on_Genesis_2_-_test_11.jpg
    1920 x 1200 - 255K
  • DAZ_Spooky Posts: 3,100
    edited December 1969

    DAZ Studio Default (unbiased) render.

    http://www.daz3d.com/forums/discussion/53671/

    LYPresley_Lighttest.png
    1000 x 1300 - 2M
  • Rashad Carter Posts: 1,830
    edited December 1969

    DAZ Studio Default (unbiased) render.

    http://www.daz3d.com/forums/discussion/53671/

    Is this to mean you guys have written a native unbiased render engine for DazStudio that doesn't require an external plug-in? Am I reading your post correctly?

  • Katherine Posts: 331
    edited March 2015

    Rendered in DS - No postwork. :) Unbiased. (White grape juice - NOT wine!) :)

    morningswim2.jpg
    1300 x 1000 - 701K
    Post edited by Katherine on
  • DAZ_Spooky Posts: 3,100
    edited December 1969

    DAZ Studio Default (unbiased) render.

    http://www.daz3d.com/forums/discussion/53671/

    Is this to mean you guys have written a native unbiased render engine for DazStudio that doesn't require an external plug-in? Am I reading your post correctly?

    Not exactly. We didn't write it, but it does mean no plug-in is required, and it is the default render engine. See the thread once it has more information; all will be explained.

  • kyoto kid Posts: 42,161
    edited March 2015

    ...apparently it looks like some of the more advanced features of the Renderman 3DL engine have been opened up.

    So what does this mean for plugins like Reality and Luxus as well as Luxrender? and how CPU intensive is it?

    Also what does it do to AoA's SSS shaders, lights and cameras?

    Post edited by kyoto kid on
  • DAZ_Spooky Posts: 3,100
    edited December 1969

    And just for fun, one more:

    Vanguard2.png
    1920 x 1080 - 1M
  • Katherine Posts: 331
    edited December 1969

    Kyoto Kid said:
    ...apparently it looks like some of the more advanced features of the Renderman 3DL engine have been opened up.

    ......

    Not exactly. :)

  • DAZ_Spooky Posts: 3,100
    edited March 2015

    Kyoto Kid said:
    ...apparently some of the other features of the full Renderman 3DL engine have been opened up.

    So what does this mean for plugins like Reality and Luxus as well as Luxrender? and how CPU intensive is it?

    All rendering is computer resource intensive. And no this is not 3Delight. (Though you can still use 3delight.)

    It doesn't change any of the plug-ins. They still work.

    Post edited by DAZ_Spooky on
  • kyoto kid Posts: 42,161
    edited December 1969

    ...direct pipeline to Lux?

  • DAZ_Spooky Posts: 3,100
    edited December 1969

    Kyoto Kid said:
    ...direct pipeline to Lux?
    Go to the other thread, and be patient, or open Install Manager, and find out.
  • UHF Posts: 518
    edited December 1969

    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.

    I think you worry too much. The pros all use GeForce cards with 6 GB. Is your computer awesomely over-specced and horribly expensive?

    With Reality you have similar solutions to different problems. With Lux, you merge Bump and Displacement into a single Normal map to save memory and speed up the render. Alternatively, you could crank up SubD to handle displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.

    I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.
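    The bump-to-normal-map trick described above can be sketched in a few lines. This is an illustrative height-to-normal conversion in Python; the function name, `strength` parameter, and gradient approach are my own assumptions for the sketch, not Reality's or Lux's actual conversion code:

    ```python
    import numpy as np

    def height_to_normal_map(height, strength=1.0):
        """Convert a grayscale height map (H x W, values 0-1) to an
        8-bit tangent-space normal map (H x W x 3, values 0-255)."""
        # Finite-difference gradients of the (scaled) height field
        dy, dx = np.gradient(height * strength)
        # The normal of the surface z = h(x, y) is proportional
        # to (-dh/dx, -dh/dy, 1); normalize it to unit length
        n = np.stack([-dx, -dy, np.ones_like(height)], axis=-1)
        n /= np.linalg.norm(n, axis=-1, keepdims=True)
        # Remap components from [-1, 1] to 8-bit [0, 255]
        return np.round((n * 0.5 + 0.5) * 255).astype(np.uint8)
    ```

    As a sanity check, a perfectly flat height map comes out as the familiar uniform blue (128, 128, 255) normal map.
    
    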

  • Zev0 Posts: 7,123
    edited December 1969
  • Katherine Posts: 331
    edited March 2015

    Kyoto Kid said:
    ...direct pipeline to Lux?
    Go to the other thread, and be patient, or open Install Manager, and find out.

    Referring to the Public Beta thread. :) http://www.daz3d.com/forums/discussion/53671/

    Post edited by Katherine on
  • DoctorJellybean Posts: 9,826
    edited December 1969

    Rendered in DS - No postwork. :) Unbiased. (White grape juice - NOT wine!) :)

    It looks like apple juice to me ;)

  • UHF Posts: 518
    edited December 1969

    Daz_Spooky: I did a Vanguard render as well... I love that ship. I just wish it had doors between different parts of the ship.

    Vanguard_SFX_Done.jpg
    1920 x 1080 - 648K
  • Dumor3D Posts: 1,316
    edited December 1969

    Rendered in DS - No postwork. :) Unbiased. (White grape juice - NOT wine!) :)

    HAH!!!! I see Wine Me works with Iray! Nice :)

  • Peter Fulford Posts: 1,325
    edited December 1969

    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU so even though it advertises 12GB, for rendering purposes only 6 GB applies.

    Perhaps you meant to say the Titan Z?

    The Titan X has only just been announced, with very little information. We know it has 8 billion transistors - so it's a single GPU. And we know it comes with 12GB of vram...

    It'll probably be released within a fortnight, so (assuming it's good) keep an eye out for original 6GB Titans coming onto the second hand market as the enthusiasts adopt the new device.

  • Katherine Posts: 331
    edited December 1969

    Dumor3D said:
    Rendered in DS - No postwork. :) Unbiased. (White grape juice - NOT wine!) :)

    HAH!!!! I see Wine Me works with Iray! Nice :)

    Yes it does. Beautifully. :)

  • DAZ_Spooky Posts: 3,100
    edited December 1969

    Oh and one more. Doing a quick test for comparison purposes and I came up with this in 61 seconds.

    Darius-Iray-61secs.png
    1000 x 1300 - 1M
  • Totte Posts: 14,941
    edited December 1969

    Rendered in DS, No postwork.

    image-09.png
    1200 x 849 - 2M
  • Dumor3D Posts: 1,316
    edited December 1969

    This is awesome! Really super fast, too!

    CandyDishCausticsOn.jpg
    1300 x 1300 - 839K
  • DAZ_Spooky Posts: 3,100
    edited December 1969

    Totte said:
    Rendered in DS, No postwork.
    There we go.

    Go download it like Totte and have some fun.

  • kyoto kid Posts: 42,161
    edited December 1969

    UHF said:
    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.

    I think you worry too much. The pros all use GeForce cards with 6 GB. Is your computer awesomely over-specced and horribly expensive?

    With Reality you have similar solutions to different problems. With Lux, you merge Bump and Displacement into a single Normal map to save memory and speed up the render. Alternatively, you could crank up SubD to handle displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.

    I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.
    ...cannot afford a Titan Black (almost $1,000) or even a 4 GB 980 (about $600) at the moment. Currently I only have 12 GB of memory with only a 1 GB Nvidia GPU. Looking to scrape up the funds to upgrade the memory to 24 GB (the most the board will support).

    LuxRender still has issues with its pure GPU mode, which supposedly are to be ironed out in ver. 2.0. Not sure how fast Reality will be updated to accommodate it, as I understand there are currently issues between it and the Lux 1.4 beta.

    As has been described in earlier posts here, Octane's out-of-core rendering uses the GPU primarily for the geometry and the CPU for processing the textures. In speed tests it still seems to be much faster than Lux's pure CPU mode. This is why, even considering the expense, Octane with out-of-core rendering has become more intriguing to me.
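    For context, the sort of back-of-the-envelope VRAM arithmetic behind these "will it fit" concerns can be sketched like this. The map count and sizes below are illustrative assumptions, not measured figures for any particular product:

    ```python
    def texture_vram_mb(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
        """Rough VRAM footprint of one uncompressed texture, in MB.

        Assumes the texture is stored uncompressed (RGBA, 8 bits per
        channel by default); a full mip chain adds roughly one third.
        """
        base = width * height * channels * bytes_per_channel
        total = base * 4 / 3 if mipmaps else base
        return total / (1024 ** 2)

    # Hypothetical figure carrying six 4096 x 4096 maps
    # (diffuse, bump, specular, and so on):
    per_map = texture_vram_mb(4096, 4096)   # roughly 85 MB each
    figure_total = 6 * per_map              # roughly 512 MB per figure
    ```

    By that rough math, a handful of fully textured characters can consume a couple of gigabytes before any geometry is counted, which is exactly the regime where a 3-4 GB card needs out-of-core rendering.
    
    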

  • Rashad Carter Posts: 1,830
    edited December 1969

    Kyoto Kid said:
    UHF said:
    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.

    I think you worry too much. The pros all use GeForce cards with 6 GB. Is your computer awesomely over-specced and horribly expensive?

    With Reality you have similar solutions to different problems. With Lux, you merge Bump and Displacement into a single Normal map to save memory and speed up the render. Alternatively, you could crank up SubD to handle displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.

    I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.


    ...cannot afford a Titan Black (almost $1,000) or even a 4 GB 980 (about $600) at the moment. Currently I only have 12 GB of memory with only a 1 GB Nvidia GPU. Looking to scrape up the funds to upgrade the memory to 24 GB (the most the board will support).

    LuxRender still has issues with its pure GPU mode, which supposedly are to be ironed out in ver. 2.0. Not sure how fast Reality will be updated to accommodate it, as I understand there are currently issues between it and the Lux 1.4 beta.

    As has been described in earlier posts here, Octane's out-of-core rendering uses the GPU primarily for the geometry and the CPU for processing the textures. In speed tests it still seems to be much faster than Lux's pure CPU mode. This is why, even considering the expense, Octane with out-of-core rendering has become more intriguing to me.

    But now DS has its own native unbiased rendering option that will likely perform faster than anything Octane or LuxRender could ever hope to match. If you ask me, the game has totally changed for DS users. I cannot see any reason why they would not eventually adopt this for Carrara, and hopefully for Bryce some day as well.

  • Mattymanx Posts: 7,000
    edited December 1969

    Kyoto Kid said:
    UHF said:
    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.

    I think you worry too much. The pros all use GeForce cards with 6 GB. Is your computer awesomely over-specced and horribly expensive?

    With Reality you have similar solutions to different problems. With Lux, you merge Bump and Displacement into a single Normal map to save memory and speed up the render. Alternatively, you could crank up SubD to handle displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.

    I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.


    ...cannot afford a Titan Black (almost $1,000) or even a 4 GB 980 (about $600) at the moment. Currently I only have 12 GB of memory with only a 1 GB Nvidia GPU. Looking to scrape up the funds to upgrade the memory to 24 GB (the most the board will support).

    LuxRender still has issues with its pure GPU mode, which supposedly are to be ironed out in ver. 2.0. Not sure how fast Reality will be updated to accommodate it, as I understand there are currently issues between it and the Lux 1.4 beta.

    As has been described in earlier posts here, Octane's out-of-core rendering uses the GPU primarily for the geometry and the CPU for processing the textures. In speed tests it still seems to be much faster than Lux's pure CPU mode. This is why, even considering the expense, Octane with out-of-core rendering has become more intriguing to me.

    But now DS has its own native unbiased rendering option that will likely perform faster than anything Octane or LuxRender could ever hope to match. If you ask me, the game has totally changed for DS users. I cannot see any reason why they would not eventually adopt this for Carrara, and hopefully for Bryce some day as well.


    or roll Bryce into Carrara or DS

  • kyoto kid Posts: 42,161
    edited March 2015

    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU so even though it advertises 12GB, for rendering purposes only 6 GB applies.

    Perhaps you meant to say the Titan Z?

    The Titan X has only just been announced, with very little information. We know it has 8 billion transistors - so it's a single GPU. And we know it comes with 12GB of vram...

    It'll probably be released within a fortnight, so (assuming it's good) keep an eye out for original 6GB Titans coming onto the second hand market as the enthusiasts adopt the new device.
    ...yeah, I meant the Titan Z. Apologies. The tech curve moves way too fast at times for an "old schooler" like me.

    So if the Titan X is to have a "true" 12 GB of VRAM, it sounds like it will be prohibitively expensive for many of us, as the "Z" is already about $1,600.

    Post edited by kyoto kid on
  • Rashad Carter Posts: 1,830
    edited December 1969

    Mattymanx said:
    Kyoto Kid said:
    UHF said:
    Kyoto Kid said:
    ...the most VRAM a GeForce GPU has is 6 GB. The TitanX is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.

    I think you worry too much. The pros all use GeForce cards with 6 GB. Is your computer awesomely over-specced and horribly expensive?

    With Reality you have similar solutions to different problems. With Lux, you merge Bump and Displacement into a single Normal map to save memory and speed up the render. Alternatively, you could crank up SubD to handle displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.

    I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.


    ...cannot afford a Titan Black (almost $1,000) or even a 4 GB 980 (about $600) at the moment. Currently I only have 12 GB of memory with only a 1 GB Nvidia GPU. Looking to scrape up the funds to upgrade the memory to 24 GB (the most the board will support).

    LuxRender still has issues with its pure GPU mode, which supposedly are to be ironed out in ver. 2.0. Not sure how fast Reality will be updated to accommodate it, as I understand there are currently issues between it and the Lux 1.4 beta.

    As has been described in earlier posts here, Octane's out-of-core rendering uses the GPU primarily for the geometry and the CPU for processing the textures. In speed tests it still seems to be much faster than Lux's pure CPU mode. This is why, even considering the expense, Octane with out-of-core rendering has become more intriguing to me.

    But now DS has its own native unbiased rendering option that will likely perform faster than anything Octane or LuxRender could ever hope to match. If you ask me, the game has totally changed for DS users. I cannot see any reason why they would not eventually adopt this for Carrara, and hopefully for Bryce some day as well.


    or roll Bryce into Carrara or DS

    That would be a nightmare for Bryce users. Bryce is Bryce because of its interface; if you remove that, the entire userbase will dry up overnight. Most Bryce users hate everything about the way DS and Carrara are set up. It really would be best to give Bryce its own direct connection to unbiased rendering for once.

  • Mattymanx Posts: 7,000
    edited December 1969

    Just thinking it would be better to roll its features and abilities into another app than to have all three DAZ apps use the same render engine yet not all be able to do what the others can.
