Thanks!
So far I've never seen the need for complicated piles of layers and effects.
As far as I can tell, nearly everything depends on the quality of the original textures; all I do is set a few surface parameters to my liking.
Cheers!
Erik
(oops, wrong account..... oh whatever)
These are the first renders I've done with MindVision's Actual Hair 2, and I'm impressed with how nice it looks!
Rendered using the Octane plugin (v1.2 as before) for DAZ Studio.
Cheers!
Erik
DAZ Studio Default (unbiased) render.
http://www.daz3d.com/forums/discussion/53671/
Does this mean you guys have written a native unbiased render engine for DAZ Studio that doesn't require an external plug-in? Am I reading your post correctly?
Rendered in DS - No postwork. :) Unbiased. (White grape juice - NOT wine!) :)
Does this mean you guys have written a native unbiased render engine for DAZ Studio that doesn't require an external plug-in? Am I reading your post correctly?
Not exactly. We didn't write it, but it does mean no plug-in is required, and it is the default render engine. See the thread once it has more information; all will be explained.
...apparently it looks like some of the more advanced features of the RenderMan-compliant 3Delight engine have been opened up.
So what does this mean for plugins like Reality and Luxus, as well as LuxRender? And how CPU-intensive is it?
Also, what does it do to AoA's SSS shaders, lights, and cameras?
And just for fun, one more:
Not exactly. :)
It doesn't change any of the plug-ins. They still work.
...direct pipeline to Lux?
...the most VRAM a GeForce GPU has is 6 GB. The Titan X is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies. After reading through this thread, a 3-4 GB GPU is more than sufficient when using the Out of Core rendering option.
I think you worry too much. The pros all use GeForce with 6 GB. Is your computer way over-specced and horribly expensive?
With Reality you have similar solutions to different problems. With Lux, you merge bump and displacement into a single normal map to save memory and speed up the render. Alternatively, you could crank up SubD to solve displacement, but it will slow down your render and use way, way more memory. More render time equals more heat, and of course a bigger electricity bill.
I suppose it's good news that they can render out of GPU RAM. So far I haven't done anything with more than 5 characters, dozens of props, and a few models in Octane. One or two more characters and I'd be tapped out.
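The bump-plus-displacement trick mentioned above boils down to baking a height field into a single normal map. A minimal sketch of that conversion in numpy (a generic height-to-normal bake for illustration, not Reality's actual exporter):

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a combined height map (bump + displacement, 0..1 floats)
    into a tangent-space normal map encoded as 8-bit RGB."""
    # Screen-space gradients of the height field (row, column order)
    dy, dx = np.gradient(height.astype(np.float64) * strength)
    # Tangent-space normal: x and y oppose the gradient, z points "up"
    nx, ny, nz = -dx, -dy, np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    n = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Remap from [-1, 1] to the usual [0, 255] RGB encoding
    return ((n * 0.5 + 0.5) * 255).astype(np.uint8)

# One 8-bit RGB normal map stands in for a bump map plus displaced geometry:
bump = np.random.rand(512, 512)            # stand-in for a real bump map
normal_map = height_to_normal_map(bump)
print(normal_map.shape, normal_map.dtype)  # (512, 512, 3) uint8
```

The memory win is that one RGB texture replaces both the bump map and the extra subdivided geometry that true displacement would need.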
Daz's new render engine :) http://www.daz3d.com/forums/discussion/53671/
Referring to the Public Beta thread. :) http://www.daz3d.com/forums/discussion/53671/
It looks like apple juice to me ;)
Daz_Spooky: I did a Vanguard render as well... I love that ship. I just wish it had doors between different parts of the ship.
HAH!!!! I see Wine Me works with Iray! Nice :)
...the most VRAM a GeForce GPU has is 6 GB. The Titan X is actually a dual GPU, so even though it advertises 12 GB, for rendering purposes only 6 GB applies.
Perhaps you meant to say the Titan Z?
The Titan X has only just been announced, with very little information. We know it has 8 billion transistors - so it's a single GPU. And we know it comes with 12GB of vram...
It'll probably be released within a fortnight, so (assuming it's good) keep an eye out for original 6GB Titans coming onto the second hand market as the enthusiasts adopt the new device.
HAH!!!! I see Wine Me works with Iray! Nice :)
Yes it does. Beautifully. :)
Oh and one more. Doing a quick test for comparison purposes and I came up with this in 61 seconds.
Rendered in DS, No postwork.
This is awesome! Really super fast, too!
Go download it like Totte and have some fun.
...I cannot afford a Titan Black (almost $1,000) or even a 4 GB 980 (about $600) at the moment. Currently I only have 12 GB of system memory and a 1 GB Nvidia GPU. Looking to scrape up the funds to upgrade the memory to 24 GB (the most the board will support).
LuxRender still has issues with their pure GPU mode, which are supposedly to be ironed out in version 2.0. Not sure how fast Reality will be updated to accommodate it, as I understand there are currently issues between it and the Lux 1.4 beta.
As described in earlier posts here, Octane's out-of-core rendering uses the GPU primarily for the geometry and the CPU for processing the textures. In speed tests it still seems to be much faster than Lux's pure CPU mode. This is why, even considering the expense, Octane with out-of-core rendering has become more intriguing to me.
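The out-of-core trade-off described above is ultimately just arithmetic: the geometry has to fit in VRAM, while textures can be paged from host RAM. A back-of-the-envelope estimator (the per-triangle cost, mip overhead, and scene sizes below are my own rough assumptions, not Octane's actual accounting):

```python
def texture_bytes(width, height, channels=4, bytes_per_channel=1, mip=True):
    """Approximate GPU memory for one texture; a mip chain adds ~33%."""
    base = width * height * channels * bytes_per_channel
    return int(base * 4 / 3) if mip else base

def geometry_bytes(triangles):
    """Rough cost per triangle: 3 vertices x (position + normal + UV)
    at 32 bytes each, plus ~48 bytes of acceleration-structure overhead."""
    return triangles * (3 * 32 + 48)

# A figure-heavy scene: five characters, each with ~8 4k textures and ~1M tris
textures = 5 * 8 * texture_bytes(4096, 4096)
geometry = geometry_bytes(5 * 1_000_000)
total_gb = (textures + geometry) / 2**30

print(f"textures: {textures / 2**30:.1f} GB, geometry: {geometry / 2**30:.1f} GB")
print(f"fits entirely on a 6 GB card: {total_gb < 6}")
print(f"fits on 3 GB with textures out-of-core: {geometry / 2**30 < 3}")
```

Under these assumptions the textures dominate the budget, which is exactly why paging them out of core lets a modest card handle a scene that would otherwise overflow VRAM.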
But now DS is getting its own native unbiased rendering option, which will likely perform faster than anything Octane or LuxRender could ever hope to match. If you ask me, the game has totally changed for DS users. I cannot see any reason why they would not eventually adopt this for Carrara, and hopefully for Bryce some day as well.
Or roll Bryce into Carrara or DS.
...yeah, I meant the Titan Z. Apologies. The tech curve moves way too fast at times for an "old schooler" like me.
So if the Titan X is to have a "true" 12 GB of VRAM, it sounds like it will be prohibitively expensive for many of us, as the Z is already about $1,600.
That would be a nightmare for Bryce users. Bryce is Bryce because of its interface; if you remove that, the entire userbase will dry up overnight. Most Bryce users hate everything about the way DS and Carrara are set up. It really would be best to give Bryce, for once, its own direct connection to unbiased rendering.
Just thinking it would be better to roll its features and abilities into another app than to have all three DAZ apps use the same render engine yet not all be able to do what the others can do.