I also think the shaders are somewhat limited compared to what you can design in the two others, but they already give great possibilities to model whatever hair you could have in mind. I have fun making hair, but I often think it's like playing Barbie when I comb.
The shaders that are implemented within the two plugins are similar to 'production' shaders used with Shave and a Haircut, but don't have all the parameters 'exposed'...and if it was easier to go from RSL to DS shader, there would be more options...(most of what are called 'shaders' around here are actually just presets...).
It was a legacy rig since I did that in DS3 with the Figure Setup Tool. Unfortunately the .CR2 Exporter in DS4 isn't able to export some of the ERC code I use to control the hair length (toggling a node's visibility rather than adjusting scale). I might just revisit that again after I'm done with something I'm cooking for G2M.
So, you're making G2M even better than it is? Sounds tempting =) And the idea of toggling a node's visibility is really cool!
Then I boosted the value for the lights since the results were very dark. I was using values of around 100 to 200 percent, but with the gamma change I have to go all the way to 800 to 1300 percent. So in effect, enabling gamma correction allows you to have much more range and precision in lighting.
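The arithmetic behind that boost can be sketched in a few lines (a rough model assuming a plain 2.2 power curve rather than the exact sRGB formula; the function names are mine):

```python
# Rough sketch of display gamma (assumes a plain 2.2 power curve,
# not the exact sRGB piecewise function; function names are mine).

def encode_gamma(linear, gamma=2.2):
    """Linear-light value -> display value."""
    return linear ** (1.0 / gamma)

def decode_gamma(display, gamma=2.2):
    """Display value -> linear-light value."""
    return display ** gamma

# Mid-grey on screen (0.5) corresponds to a much darker linear value:
linear_mid = decode_gamma(0.5)
print(round(linear_mid, 3))  # 0.218

# So once the renderer works in linear light, a light needs roughly
# 1 / 0.218 = 4.6x more energy to reach what reads as "half brightness",
# which is in the same ballpark as dialing 100-200% up to 800-1300%.
```

The extra range and precision mentioned above comes from exactly this: lighting values spread out over a wider, more linear scale.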
Sounds exciting, and the renders you posted look fabulous even without SSS!
From the results, I'd say enabling gamma correction for the textures also allows you to better maintain texture saturation compared to just using gain/gamma alone.
That's the part that I've always found somewhat hard to understand. What do we need to do with textures? Is there a way to tell which gamma they are supposed to be rendered with?
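For what it's worth, the usual answer is a convention, not something stored in the image file, so treating it as an assumption about how the textures were authored (colour maps in sRGB, data maps linear) is about the best one can do:

```python
# A common convention, not something stored in most image files (so this is
# an assumption about how a texture set was authored): colour maps are painted
# in sRGB (roughly gamma 2.2) and should be linearised before shading, while
# data maps (bump, displacement, opacity, specular strength) are already
# linear and must be left at gamma 1.0.

def linearize(value, input_gamma):
    return value ** input_gamma

print(round(linearize(0.5, 2.2), 3))  # colour map texel -> 0.218
print(linearize(0.5, 1.0))            # bump map texel stays 0.5

# This is also why correct input gamma preserves saturation better than
# eyeballing gain: the 2.2 encode compresses channel ratios, e.g. a 4:1
# red/green ratio shrinks to below 2:1 on the encoded side.
r, g = 0.8 ** (1 / 2.2), 0.2 ** (1 / 2.2)
print(round(r / g, 2))  # 1.88
```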
----------------------------
One thing to be aware of is that you can't expect it to react realistically. I don't know if I should play with that again, as it can be frustrating.
Frustrating more than anything. I generally manage to get what I like with just the anisotropic highlights in the UberSurface (when the model is mapped suitably), but then again, I'm not doing complicated stuff like Wowie's fluffy translucent blondes. Flat transmapped planes seem to be a better approximation for darker heavy, smooth long hair that does not let much light "inside" it.
I often think it's like playing Barbie when I comb
Yeah... Isn't everything we do here like playing Barbie? =)
I haven't played a lot with DS lately, but I'll give your AreaLight network a try. I find it strange just looking at it, but I think it is better to test it to know.
Basically the bulk of the network is premade; I only added transmission and everything that goes along with it. The code from 3Delight examples surely looks simpler (merely "transmission (P,Ps)" IIRC), but it didn't work that way in the Builder (actually what I show is the only way I was able to make it work), so I calculated the light origin back from the light vector and the surface point. I like it. Rendered meself some bunnies LOL
Then there's the Fresnel brick - I must be out of my mind, but whatever the docs say, I still find it only gives me results that visually make sense when I'm using the complex equation for its input IOR thingie - never a simple IOR ratio that most sources give.
Here's the equation that I've been using in ShaderMixer (Rs): http://www.terathon.com/wiki/index.php/Building_a_Fresnel_shader#The_Improved_Approach
And I had to use the same in Shader Builder.
Is there something wrong with me, or with DS?
More renders, this time with SSS. Btw, does anyone know who did the Bjorn texture for M6 HD?
has anyone tried like a cornell box test with global illumination?
http://www.graphics.cornell.edu/online/box/compare.html
i'm curious, i don't know enough about ds lighting, or 3delight lighting.
There are two products in the DAZ market that produce that kind of light...
omAreaLight
Advanced Ambient Light
In general 'mesh lights' as they are referred to tend to render very slowly as the rendering engine treats it as if it is casting many light rays from the surface, so there are a lot of calculations to do. But the light it creates is very even and pleasing for portrait style images.
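That "many rays from the surface" cost can be shown in miniature (purely illustrative Python; `soft_shadow` and all its numbers are made up for the demo, not renderer code):

```python
import random

# Toy model of why area/mesh lights cost more than point lights: every shaded
# point fires one shadow ray per sample taken on the light's surface.
# (Everything here is made up for illustration; real renderers are smarter.)

def soft_shadow(occluder_x, samples, light_width=1.0, seed=1):
    """Fraction of a strip light visible past a blocker covering x < occluder_x."""
    rng = random.Random(seed)
    visible = 0
    for _ in range(samples):
        x = rng.uniform(0.0, light_width)  # pick a point on the light
        if x >= occluder_x:                # one shadow ray "test" per sample
            visible += 1
    return visible / samples

# A point light is a single 0-or-1 shadow ray; 64 samples means 64 rays per
# shaded point, but the payoff is a fractional, soft shadow value
# (converging to 0.75 here as the sample count grows):
print(soft_shadow(0.25, 64))
```

The fractional visibility is what produces the even, pleasing falloff mentioned above; the per-sample rays are what you pay for it.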
Ummm I'm very sorry, but AoA's AdvancedAmbient is NOT a mesh light, nor can it do GI (that's what Misty was actually asking about, I believe).
Then, omAreaLight has long been superseded by a better version, UberAreaLight, which is included in DS for free now (it used to be part of DS3 "Advanced" which used to be a pay-for version, but that's already ancient history actually).
I would not say area lights necessarily render "very slowly" (unless you're upping shadow samples through the roof) - maybe if you're doing animations, then the overhead per frame would all add up into extra hours, but for stills... they are not "very" slow. Simply slower than spotlights.
The only thing that dramatically slows area lights down are transparencies, especially layered ones - but they slow all raytracing down, even simple spotlights with raytraced shadows.
GI is yet another issue - generally, when it's raytraced, it's indeed "very slow".
have you tried those lights on a cornell box test? would be interesting to see how close they come to the real life photo
hmm
wowie - where's that thumbs-up smiley? Great work.
Misty,
does uvmapping have any effect on how light diffuses on an object?
Nope. UV mapping affects how the object takes textures or procedural patterns (if the procedural shader is set to work in UV space).
Then, the box and GI in DAZ Studio... I got the Cornell box archive from here - http://graphics.cs.williams.edu/data/meshes.xml#10 - and used the "original" one (it has a somewhat different position of the light as compared to the one tested in the page you linked to).
I imported it at Bryce scale (not too big and not too small) and applied the UberAreaLight base onto the dedicated "light" surface, setting shadow samples to 64.
I'm using UberEnvironment2 in GI mode (there's a shortcut in the UE2 folder to load it all set up for this, except the quality - I'm not sure if it is 110% OK to leave its light colour white, but I won't change it for these tests). My quality settings are unchanged from the default ones, save for samples = 192.
I also changed Maximum Trace Distance to 1000 instead of 500 (that's in cm, DS default unit): it's not a quality control but it affects how far the rays are traced. Since the box looks to be over 5 meters wide in my scene, judging by the floor grid, we need a setting above this to let the walls bounce light onto each other.
This setting also may affect render time.
The general render settings are also important for GI, specifically the "max raytrace depth". Generally, the higher it is, the more bounces there will be, but with raytraced GI, it increases render time greatly (exponentially, IIRC).
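The growth can be sketched with a toy ray-count model (an illustration of the geometric blow-up only; `total_rays` and its numbers are mine, and real engines prune and cache aggressively):

```python
# Toy model of why higher "max raytrace depth" explodes render time with
# raytraced GI: if each hit spawns `samples` secondary rays, the ray count
# grows geometrically with depth. (Illustrative only; real engines prune,
# cache and reuse aggressively.)

def total_rays(samples_per_bounce, depth):
    return sum(samples_per_bounce ** d for d in range(1, depth + 1))

print(total_rays(16, 2))  # 272
print(total_rays(16, 4))  # 69904 - two extra bounces, ~257x the rays
```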
Computer specs: Intel quad core Q9300 @ 2.5 GHz, 8 GB RAM, Win7 64 bit
Render times are:
- raytraced (DS3): 1 hour 19 minutes 39.36 seconds, raytrace depth 2 - I actually forgot to save the render, and there's no way I'm doing it again. But it didn't look much different from the next one...
- point-cloud scripted renderer (DS4.6): around 3 minutes with cloud shading rate of 1.
Weird stuff to look out for: the default DS shader loses reflectivity completely when doing UE2-driven GI, whether raytraced or point-cloud based. UberSurface works fine.
I've run a series of tests with the point-cloud script. Using a lower (=better) shading rate on the UE2 than the default 32 yields significantly more correct shading in corners, and adds very little overhead (rendering time about 5 mins). You could also up raytrace depth more or less freely (it will make everything brighter, just in case - it's supposed to be that way).
Test renders are boring, so here's the best one only, with all its settings (the ones not shown are default; default max raytrace depth is 4 for the scripted renderer). Render time 17 minutes 48.66 seconds.
...talking to myself now, for lurkers' sake. As for Fresnel bricks in shader mixer and shader builder, I think I got it all sorted out now - found a great reference table here: http://www.renderman.org/RMR/Examples/rrf/index.html
The correct way to compute Fresnel ratio for functions in both Shader Mixer and Shader Builder is the simple one: eta1/eta2 (1 is the surrounding medium, think air; 2 is your material).
The lesson is, don't test your Fresnel attenuation on people or bunnies. Complex shapes don't help with that, just the other way around.
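The table values can be sanity-checked against the standard normal-incidence formula; a small sketch (function names are mine, and the eta convention is the one described above, surrounding medium over material):

```python
# Sanity-checking the simple eta1/eta2 ratio against normal-incidence
# reflectance. For light hitting a boundary head-on, R0 = ((n2-n1)/(n2+n1))^2;
# the eta that RSL-style fresnel() functions expect is n1/n2 with the
# surrounding medium first (air = 1.0).

def r0(n1, n2):
    """Reflectance at normal incidence."""
    return ((n2 - n1) / (n2 + n1)) ** 2

def eta(n1, n2):
    """Relative IOR, surrounding medium over material."""
    return n1 / n2

# air -> "kinda water" 1.3 vs. skin-ish 1.4: a small IOR change, but R0
# grows by roughly 60%, which is why the two look noticeably different.
print(round(r0(1.0, 1.3), 4))  # 0.017
print(round(r0(1.0, 1.4), 4))  # 0.0278
print(round(eta(1.0, 1.4), 3))  # 0.714
```

That ~60% jump in normal-incidence reflectance between 1.3 and 1.4 is exactly the sensitivity that made eyeballing on bunnies misleading.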
There are two products in the DAZ market that produce that kind of light... omAreaLight Advanced Ambient Light
In general 'mesh lights' as they are referred to tend to render very slowly as the rendering engine treats it as if it is casting many light rays from the surface, so there are a lot of calculations to do. But the light it creates is very even and pleasing for portrait style images.
Ummm I'm very sorry, but AoA's AdvancedAmbient is NOT a mesh light neither can it do GI (that's what Misty was actually asking about, I believe).
Then, omAreaLight has long been superceded by a better version, UberAreaLight which is included in DS for free now (it used to be part of DS3 "Advanced" which used to be a pay-for version, but that's already ancient history actually).
Yeah, double bad on my part. :red:
Although the AdvancedAmbient can act similar to an area light with the soft shadows, etc. it isn't the same thing.
And I keep forgetting that DAZ gave away things in 4.0 that I paid for. :long:
I would not say area lights necessarily render "very slowly" (unless you're upping shadow samples through the roof) - maybe if you're doing animations, then the overhead per frame would all add up into extra hours, but for stills... they are not "very" slow. Simply slower than spotlights.
The only thing that dramatically slows area lights down are transparencies, especially layered ones - but they slow all raytracing down, even simple spotlights with raytraced shadows.
GI is a yet another issue - generally, when it's raytraced, it's indeed "very slow".
You're probably right there that it was an overly broad general statement. However, since most DAZ artists are using human figures with transparency mapped hair (often in several layers), the performance impact is noticeable. Not to say "Don't use them, you'll wait a week for your image!" Just to be aware of the difference.
Nice job on the test renders and stats, though. :)
I don't remember if I did the test in DS but I did the Cornell Box at least once. Pretty annoying once you have done it (and long render times). But it could be useful to compare render engines or algorithms
@ Mustakettu85 : I didn't have a look at your Fresnel equation (not really, but stopped right after seeing there was a reference of two mediums and thought it's wayyy too complicated). In fact I don't really understand what you want to do as we're rather dealing with surface shading and not volume shading inside a medium (that's what you have in Luxrender and probably in many other unbiased engines) or are you trying to make an underwater shader?
With 3delight and Shader mixer you could use the built in Bricks in SM or functions Fresnel / Reflect / Refract. You just have to give correct parameters. No need to reinvent the wheel
And I don't see why you can't test Fresnel on people or bunnies. You should be able to see it depending on the lights and camera placement
UberEnvironmentHdriPack1 has a Cornell Box preset.
http://www.omnifreaker.com/index.php?title=UberEnvironmentHdriPack1
I know it is not the same thing you all are talking about but if it had not been for the Cornell Box question and the omAreaLight statement I would not have been inspired to try it out. I loaded a UberEnvironment with the Cornell Box preset along with an omAreaLight; rendered and saw the skin looking kind of purple. So I added a yellow light to mute down that purple. I'm not using the Indirect lighting though, it is just too slow for me; instead I used Occlusion w/Directional Shadows. I also added a few other lights for specular, a backlight, a fill and a key light.
The quality is by no means as cool as the stuff I'm seeing you all do, but it did render in around 10 minutes.
I'm also using a quad with 8 gigs of ram and W7 64
...are you trying to make an underwater shader?...
And I don't see why you can't test Fresnel on people or bunnies. You should be able to see it depending on the lights and camera placement
Well, what I'm actually trying is to ensure I am indeed feeding the correct "IOR ratio" parameter to the fresnel functions (and not ruling out underwater scenes either...) =)
You see, the problem with girls and bunnies was that "eyeballing" the fresnel effects led me to believe that the simple "eta1 over eta2" ratio was not working right. I did not have any 110% reliable reference to calibrate against - I have no way to actually measure (let alone change) IORs of real-life stuff. So I did not understand that fresnel attenuation is _that_ sensitive to small changes in the material IOR (eta2). If you look at that awesome table I linked to in my previous post, it is clear that 1.3 and 1.4 are noticeably different... and I was just going with 1.3 thinking "kinda water, hence kinda skin"... Kinda wrong =)
Now that I've done the sphere tests, I've moved back to bunnies and people, and I see that it does match what I see IRL much better (when using 1.4).
I will also do the sphere tests for UberSurface2 now, and so I will finally have a way to match its mysterious fresnel parameters to material IOR, however roughly.
I'm happy it turned out to be just a single divide operator, less hassle =)
I loaded a UberEnvironment with the Cornell Box preset along with a omAreaLight; rendered and seen the skin looking kind of purple. So I added a yellow light to mute down that purple. I'm not using the Indirect lighting though, it is just too slow for me; instead I used Occlusion w/Directional Shadows. I also added a few other lights for specular, a backlight, a fill and a key light.
Looking cool actually! You don't have to worry about not using GI, it's just an option that is nice to have, but nowhere near mandatory. I actually find AO is easier to work with from an "artistic" standpoint... you can change its colour and stuff like that, and it often gives a more dramatic feel to the render than any sort of indirect lighting, I'd say. And I like contrast =)
--------------------------------
woww kewl.
the shadows at the foot of the cubes look a lil roundish?
no bunnies were harmed in these tests :)
Haha, thanks =) Yeah the shadows are rounded maybe a bit too much, I think the GI might be eating at their angles where they are weakest. We need a real-life photo of a box with a square "area light" on its ceiling... I think I have lamps at work that might pass for a light like that, I may try if I don't forget...
-------------------------------
And I keep forgetting that DAZ gave away things in 4.0 that I paid for. :long:
... However, since most DAZ artists are using human figures with transparency mapped hair (often in several layers), the performance impact is noticeable.
Nice job on the test renders and stats, though. :)
Thanks =) And yeah, DAZ3D are kinda bad like that; I'm sorry you were one of those customers.
And transmapped hair models are evil incarnate =D But a lot of them look so cool, it's impossible to resist! Makes them three times more evil, I guess =)
What I want to do eventually is to write an area light shader that would allow flagging surfaces like AoA's lights do - so that hair could be excluded and only lit with a shadow-mapped spotlight. Or maybe he will read this and make that shader before I manage to, that would be fine as well =)
Just an idea: why not set up the same scene in another renderer (let's say Luxrender) and compare?
Yeah, that's a great idea, but it wouldn't have helped when I was downright wrong about the skin IOR, I think.
I think I forgot about Lux completely, I was looking into Carrara's renderer, but its Fresnel controls are somewhat unclear, too (again, that "strength" parameter...).
And hey everyone, I was thinking of that transmapped hair vs raytraced shadows issue, and here's a crude solution I came up with: http://www.sharecg.com/v/74284/view/3/PDF-Tutorial/Hair-Compositing-Tutorial
It's about rendering your scene sans the offending hair and then rendering only the hair and the objects it casts a shadow on, to be composited in an image editor. I made a simple shader in ShaderMixer to apply to those objects, to turn them into "green screen". It's not perfect and generally may need practice, but who knows, it may come in handy.
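The compositing step itself can be sketched in a few lines (a toy chroma key on pixel tuples; the names, thresholds and sample values are made up, and a real image editor would feather the edges):

```python
# Toy version of the "green screen" compositing step: build a matte from
# saturated green pixels and lay the hair pass over the base render.
# Pixels are (r, g, b) floats in 0..1; names and thresholds are made up,
# and a real compositor would feather/antialias the edges.

def is_green_screen(pixel, threshold=0.5):
    r, g, b = pixel
    return g > threshold and g > r + b  # crude chroma key

def composite(base, hair_pass):
    out = []
    for base_px, hair_px in zip(base, hair_pass):
        # keep the base wherever the hair pass is still "green screen"
        out.append(base_px if is_green_screen(hair_px) else hair_px)
    return out

base = [(0.2, 0.2, 0.2), (0.8, 0.7, 0.6)]
hair = [(0.0, 1.0, 0.0), (0.3, 0.2, 0.1)]  # key pixel, hair pixel
print(composite(base, hair))  # [(0.2, 0.2, 0.2), (0.3, 0.2, 0.1)]
```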
@Mustakettu85 : I'm not fond of the compositing as you'll lose time there too. But it is an interesting option
I wrote a simple shader that I thought would render hair a little quicker, but it only had a basic specular, and I was not sure the time gained was worth it (something like an 8 min render instead of 10).
I can pack it so that you can test it if you want.
After I did some tests on hair, I think that if you want to get some interesting effects and good shading on hair, you must be ready to pay the price, aka long render times.
I did a Cornell box render...no Uber...built the area light, and camera in ShaderMixer. It took forever (like 10 hrs) to render, but I think I screwed up and had one of the samples at some insane setting...I know I had the overall shading rate at nearly insane 0.1...
There is a bit of graininess in the corners, because I didn't want to go back and redo it with 'more sane' settings.
But ShaderMixer built items are usually faster than Uber, at least on my system.
very very interesting. looks like color from the wall is tinting the shadows
Yes, it is...that is the whole point of that particular 'test'...color bleeding is a real life phenomenon and you only get it in 3d with unbiased renderers (Luxrender) or using indirect diffuse/final gather/some other 'trick' in a biased renderer (3Delight).
@Mustakettu85 : I'm not fond of the compositing as you'll lose time there too. But it is an interesting option
Yeah, it depends on how well the image editor (and its user) perform, too. It's definitely way faster for me, when I'm on my 32bit home system (multilayered hair with area lights or other raytraced shadows = way over an hour; no hair = ten minutes with AO and SSS; just the hair - another ten minutes at most, including all the preparations; and actual compositing only adds a few minutes to the overall postwork time - and I always do postwork for my gallery renders). So I think I will be using more of my transmapped hair library now that I have the steps =D
(I'm one of those who writes stuff down not only for others, but for oneself LOL)
I wrote a simple shader that I thought to render hair a little quicker but it only had a basic specular. and I was not sure the time gained was worth (something like 8 min render instead of 10 )
I can pack it so that you test it if you want
Thanks a lot, I would love to! It can be used for other transmapped surfaces, right?
I did a Cornell box render...no Uber...built the area light, and camera in ShaderMixer. It took forever (like 10 hrs) to render, but I think I screwed up and had one of the samples at some insane setting...
I remember shader mixer GI was taking forever if the max RT depth was over 1. Even with all the photon tricks.
But ShaderMixer built items are usually faster than Uber, at least on my system.
IIRC you're running WINE or something? I wonder how many variables are at play here, because on my native 32bit Vista system shader mixer area light with shadows is seriously slower than UberArea one.
I really like the shaderbuilder dzAreaLights now that I have shadows on them... but I have to test them more with colour maps - there are all sorts of cute effects in Szark's tutorial done with that, and I'm not sure mine respond to colour maps that well. But I usually apply them to a plane, and he has a sphere, would that matter?
-------------------------
shadows are wonky looking.
3delight is unbiased? i get confused between the biased and unbiased, but iirc they have different features.
Biased. The technical difference is how they solve the rendering equation, but for the end-user, it does not really matter.
The most important thing to remember about 3Delight is that it's Renderman-compliant, and there's a lot of resources on Renderman, like this: http://renderman.pixar.com/view/renderman-university - so with enough dedication, almost everything should be possible. When the interface to DS does not get in the way, that is =)
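For the curious, the biased/unbiased distinction can be shown in miniature with plain Monte Carlo integration (a cartoon of the rendering equation, not renderer code; all names here are mine):

```python
import random

# The biased/unbiased distinction in miniature: estimating an integral by
# random sampling. The plain Monte Carlo mean is unbiased (its expected value
# IS the true integral, and noise shrinks as samples grow); clamping bright
# samples trades a systematic error (bias) for less noise - the kind of deal
# biased renderers make on purpose.

def mc_estimate(f, n, clamp=None, seed=7):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        v = f(rng.random())
        if clamp is not None:
            v = min(v, clamp)  # biased: systematically underestimates
        total += v
    return total / n

f = lambda x: 3 * x * x  # integral over [0, 1] is exactly 1

print(mc_estimate(f, 20000))             # hovers close to 1.0
print(mc_estimate(f, 20000, clamp=0.5))  # consistently below 1.0
```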
-------------------------
Just for fun: two cornell box renders. Luxrender and 3delight. Luxrender 15 min, 3delight 3.5 min on a dual core, i.e. 30 and 7 minutes of CPU time respectively.
Thanks for the side-by-side =) Lux is on the right, correct? Is it point-cloud with 3Delight or raytraced GI?
Yes, the luxrender is the one that actually renders the light, whereas with 3delight the light itself (the square at the top) remains dark. Is it point-cloud with 3Delight or raytraced GI?
It uses a photon map on the normal geometry, so no point cloud (unless you count the photon map as a point cloud, which it internally practically is) and mixes that with the direct lighting (50-50 ratio) using raytracing (for the direct light shadows).
Yes, the luxrender is the one that actually renders the light, whereas with 3delight the light itself (the square at the top) remains dark.
Yeah, unless you enable the ambient channel, the light surface will not "glow".
I'm wondering now, is it possible to make a mesh light in LuxRender that would be invisible? I know that in 3Delight, I can make the mesh transparent and it will emit light regardless, so that I can stick invisible light sources whenever I want.
It uses a photon map on the normal geometry, so no point cloud (unless you count the photon map as a point cloud, which it internally practically is) and mixes that with the direct lighting (50-50 ratio) using raytracing (for the direct light shadows).
Did you use photon mapping from Shader Mixer or a custom shader (Shader Builder or manually scripted)?
---------------------------
huuuh. would carrara render the ceiling light?
I can't test right now (using my mother's computer again), but I know that light sources can be made invisible in Carrara and still emit light - at least, "shape lights" can. They aren't area lights proper in the sense you don't apply them to a surface of a mesh, but they operate along the same principles. I haven't used "Anything Glows" much (the equivalent to DS area light shaders).
Carrara's renderer is biased, so I guess the light-emitting surface does not _have_to_ glow, but it can be made to.
I'm wondering now, is it possible to make a mesh light in LuxRender that would be invisible? I know that in 3Delight, I can make the mesh transparent and it will emit light regardless, so that I can stick invisible light sources whenever I want.
I'm not sure, but I don't think so...I'll run a couple of renders, later. I think the best you can do is make them 'see through'...invisible to the camera, looking through it, but still a visible 'light' if the camera/reflection 'sees' it.
I packed the shader. Took me some time to make some DUF presets, metadata and licence terms. I hope I didn't screw anything up
See the thread http://www.daz3d.com/forums/discussion/36216/ to get it
I also rendered a Cornell Box. It took under 2 min with latest 3Delight on my Quadcore i7 notebook
@mjc1016 : my experience with Shader Mixer is also that it is slower than Shader Builder. Still SM is quick for some tests or if you need something to build easily
@Mustakettu85 : From my test with Luxrender you can't hide the light. You can put a null shader on the surface but still the object is lighted as in real life as soon as you declare it as emissive
@Millighost : you say you used photon mapping but that is not what I get with photons ( it's very blurry even with 1024 samples). Did you use Indirectdiffuse() with final gather or is it a tech I don't know?
[Edit] Screwed something in the shader definitions the first time. I've just updated the package
I Packed the shader. Took me some time to make some DUF preset, and metadatas and licence term. I hope I didn't screw anything
See the thread http://www.daz3d.com/forums/discussion/36216/ to get it
I also rendered a Cornell Box. It took under 2 min with latest 3Delight on my Quadcore i7 notebook
@mjc1016 : my experience with Shader Mixer is also that it is slower than Shader Builder. Still SM is quick for some tests or if you need something to build easily
And I've found that even though SM can be slower, it's still faster than the Uber line included in DS. The greater the complexity, the more likely SM is going to start bogging down. I'm pretty sure there are ways of tweaking its performance, but I haven't had the chance/time to play around with figuring any out.
And if you could set the compile optimization level for Shader Builder, you could speed it up even more. I've been playing around with setting up the shader in SB, then compiling the source in the stand alone 3DL at -O3 and using that sdl to replace the one SB generated...and on some of my more complex shaders, I've seen a 15% or so improvement in render time. The trick, though, is you need to match the 3DL versions, or include the source in the standalone compile (shaderdl --embed-source).
I'm not sure, but I don't think so...I'll run a couple of renders, later. I think the best you can do is make them 'see through'...invisible to the camera, looking through it, but still a visible 'light' if the camera/reflection 'sees' it.
@Mustakettu85 : From my test with Luxrender you can't hide the light. You can put a null shader on the surface but still the object is lighted as in real life as soon as you declare it as emissive
Thanks guys, I knew there had to be a catch.
...what do you guys think: what is the right way to make a light only illuminate "flagged" surfaces? It seems logical to do this in the rendertime script, but I'm not sure it's possible to set other objects' attributes from there.
On the other hand, I was looking at this - http://renderman.pixar.com/resources/current/rps/appnote.29.html - I wonder if 3Delight supports this feature... (provided it is possible to set "matte" object attribute successfully from DS - either I'm doing something wrong with the rendertime script, or it doesn't work at all).
Comments
The shaders that are implemented within the two plugins are similar to 'production' shaders used with Shave and haircut, but don't have all the parameters 'exposed'...and if it was easier to go from RSL to DS shader, there would be more options...(most of what are called 'shaders' around here are actually just presets...).
Yeah... Isn't everything we do here like playing Barbie? =)
Basically the bulk of the network is premade; I only added transmission and everything that goes along with it. The code from 3Delight examples surely looks simpler (merely "transmission (P,Ps)" IIRC), but it didn´t work that way in the Builder (actually what I show is the only way I was able to make it work), so I calculated the light origin back from the light vector and the surface point. I like it. Rendered meself some bunnies LOL
Then there's the Fresnel brick - I must be out of my mind, but whatever the docs say, I still find it only gives me results that visually make sense when I'm using the complex equation for its input IOR thingie - never a simple IOR ratio that most sources give.
Here's the equation that I've been using in ShaderMixer (Rs): http://www.terathon.com/wiki/index.php/Building_a_Fresnel_shader#The_Improved_Approach
And I had to use the same in Shader Builder.
Is there something wrong with me, or with DS?
More renders, this time with SSS. Btw, anyone knows who did the Bjorn texture for M6 HD?
has anyone tried like a cornell box test with global illumination?
http://www.graphics.cornell.edu/online/box/compare.html
i'm curious, i don't know enough about ds lighting, or 3delight lighting.
There are two products in the DAZ market that produce that kind of light...
omAreaLight
Advanced Ambient Light
In general 'mesh lights' as they are referred to tend to render very slowly as the rendering engine treats it as if it is casting many light rays from the surface, so there are a lot of calculations to do. But the light it creates is very even and pleasing for portrait style images.
have you tried those light on a cornell box test? would be interesting to see how close they come to the real life photo
hmm
does uvmapping have any effect on how light diffuses on an object?
wowie - where's that thumbs-up smiley? Great work.
Nope. UV mapping affects how the object takes textures or procedural patterns (if the procedural shader is set to work in UV space).
Then, the box and GI in DAZ Studio... I got the Cornell box archive from here - http://graphics.cs.williams.edu/data/meshes.xml#10 - and
used the "original" one (it has a somewhat different position of the light as compared to the one tested in the page you linked to).
I imported it at Bryce scale (not too big and not too small) and applied the UberAreaLight base onto onto the dedicated "light" surface, setting shadow samples to 64.
I'm using UberEnvironment2 in GI mode (there's a shortcut to load it all set up for this but the quality in the UE2 folder - I'm not sure if it is 110% OK to leave its light colour white, but I won't change it for these tests). My quality settings are unchanged from the default ones, save for samples = 192.
I also changed Maximum Trace Distance to 1000 instead of 500 (that's in cm, DS default unit): it's not a quality control but it affects how far the rays are traced. Since the box looks to be over 5 meters wide in my scene, judging by the floor grid, we need a setting above this to let the walls bounce light onto each other.
This setting also may affect render time.
The general render settings are also important for GI, specifically the "max raytrace depth". Generally, the higher it is, the more bounces there will be, but with raytraced GI, it increases render time greatly (exponentially, IIRC).
Computer specs: Intel quad core Q9300 @ 2.5 GHz, 8 GB RAM, Win7 64 bit
Render times are:
- raytraced (DS3): 1 hours 19 minutes 39.36 seconds, raytrace depth 2 - I actually forgot to save the render, and there's no way I'm doing it again. But it didn't look much different from the next one...
- point-cloud scripted renderer (DS4.6): around 3 minutes with cloud shading rate of 1.
Weird stuff to look out for: the default DS shader loses reflectivity completely when doing UE2-driven GI, whether raytraced or point-cloud based. UberSurface works fine.
I've run a series of tests with the point-cloud script. Using a lower (=better) shading rate on the UE2 than the default 32 yields significantly more correct shading in corners, and adds very little overhead (rendering time about 5 mins). You could also up raytrace depth more or less freely (it will make everything brighter, just in case - it's supposed to be that way).
Test renders are boring, so here´s the best one only, with all its settings (the ones not shown are default; default max raytrace depth is 4 for the scripted renderer). Render time 17 minutes 48.66 seconds.
...talking to myself now, for lurkers' sake. As for Fresnel bricks in shader mixer and shader builder, I think I got it all sorted out now - found a great reference table here:
http://www.renderman.org/RMR/Examples/rrf/index.html
The correct way to compute Fresnel ratio for functions in both Shader Mixer and Shader Builder is the simple one: eta1/eta2 (1 is the surrounding medium, think air; 2 is your material).
The lesson is, don't test your Fresnel attenuation on people or bunnies. Complex shapes don't help with that, just the other way around.
Yeah, double bad on my part. :red:
Although the AdvancedAmbient can act similarly to an area light with the soft shadows, etc., it isn't the same thing.
And I keep forgetting that DAZ gave away things in 4.0 that I paid for. :long:
GI is a yet another issue - generally, when it's raytraced, it's indeed "very slow".
You're probably right there that it was an overly broad general statement. However, since most DAZ artists are using human figures with transparency mapped hair (often in several layers), the performance impact is noticeable. Not to say "Don't use them, you'll wait a week for your image!" Just to be aware of the difference.
Nice job on the test renders and stats, though. :)
woww kewl.
the shadows at the foot of the cubes look a lil roundish?
no bunnies were harmed in these tests :)
I don't remember if I did the test in DS, but I did the Cornell Box at least once. It's pretty annoying once you've done it (and the render times are long). But it could be useful for comparing render engines or algorithms.
@Mustakettu85 : I didn't really look at your Fresnel equation (I stopped right after seeing a reference to two media and thought it was wayyy too complicated). In fact I don't really understand what you want to do, as we're dealing with surface shading and not volume shading inside a medium (that's what you have in Luxrender and probably in many other unbiased engines) - or are you trying to make an underwater shader?
With 3Delight and Shader Mixer you can use the built-in bricks in SM, or the Fresnel / Reflect / Refract functions. You just have to give them the correct parameters. No need to reinvent the wheel.
And I don't see why you can't test Fresnel on people or bunnies. You should be able to see the effect, depending on the light and camera placement.
UberEnvironmentHdriPack1 has a Cornell Box preset.
http://www.omnifreaker.com/index.php?title=UberEnvironmentHdriPack1
I know it is not the same thing you all are talking about, but if it had not been for the Cornell Box question and the omAreaLight statement, I would not have been inspired to try it out. I loaded a UberEnvironment with the Cornell Box preset along with an omAreaLight, rendered, and saw the skin looking kind of purple. So I added a yellow light to mute down that purple. I'm not using the indirect lighting though, it is just too slow for me; instead I used Occlusion w/Directional Shadows. I also added a few other lights for specular, a backlight, a fill and a key light.
The quality is by no means as cool as the stuff I'm seeing you all do, but it did render in around 10 minutes.
I'm also using a quad with 8 gigs of ram and W7 64
Haha, thanks =) Yeah the shadows are rounded maybe a bit too much, I think the GI might be eating at their angles where they are weakest. We need a real-life photo of a box with a square "area light" on its ceiling... I think I have lamps at work that might pass for a light like that, I may try if I don't forget...
-------------------------------
And I keep forgetting that DAZ gave away things in 4.0 that I paid for. :long:
... However, since most DAZ artists are using human figures with transparency mapped hair (often in several layers), the performance impact is noticeable.
Nice job on the test renders and stats, though. :)
Thanks =) And yeah, DAZ3D are kinda bad like that; I'm sorry you were one of those customers.
And transmapped hair models are evil incarnate =D But a lot of them look so cool, it's impossible to resist! Makes them three times more evil, I guess =)
What I want to do eventually is to write an area light shader that would allow flagging surfaces like AoA's lights do - so that hair could be excluded and only lit with a shadow-mapped spotlight. Or maybe he will read this and make that shader before I manage to, that would be fine as well =)
thanks for those links page. :)
if i was younger i'd look into that college.
Just an idea: why not set up the same scene in another renderer (let's say Luxrender) and compare?
Yeah, that's a great idea, but it wouldn't have helped when I was downright wrong about the skin IOR, I think.
I think I forgot about Lux completely, I was looking into Carrara's renderer, but its Fresnel controls are somewhat unclear, too (again, that "strength" parameter...).
And hey everyone, I was thinking of that transmapped hair vs raytraced shadows issue, and here's a crude solution I came up with: http://www.sharecg.com/v/74284/view/3/PDF-Tutorial/Hair-Compositing-Tutorial
It's about rendering your scene sans the offending hair and then rendering only the hair and the objects it casts a shadow on, to be composited in an image editor. I made a simple shader in ShaderMixer to apply to those objects, to turn them into "green screen". It's not perfect and generally may need practice, but who knows, it may come in handy.
What do you think, folks?
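The green-screen trick from the tutorial can be sketched in a few lines of hypothetical pixel-pushing code (in practice you'd do this in an image editor; the function name and the tolerance value are made up for illustration): wherever the hair-pass pixel matches the key colour, keep the base render, otherwise take the hair pass.

```python
def composite_over_green(base, hair_layer, key=(0, 255, 0), tol=30):
    """Naive green-screen composite. `base` and `hair_layer` are
    same-sized images as lists of rows of (r, g, b) tuples.
    Pixels of the hair pass within `tol` of `key` per channel are
    treated as 'green screen' and replaced by the base render."""
    out = []
    for base_row, hair_row in zip(base, hair_layer):
        row = []
        for b_px, h_px in zip(base_row, hair_row):
            is_key = all(abs(c - k) <= tol for c, k in zip(h_px, key))
            row.append(b_px if is_key else h_px)
        out.append(row)
    return out
```

A hard per-channel threshold like this gives jagged edges on transmapped hair; a real compositor would feather the matte, which is why the tutorial suggests practice.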
@Mustakettu85 : I'm not fond of compositing, as you'll lose time there too. But it is an interesting option.
I wrote a simple shader that I thought would render hair a little quicker, but it only had a basic specular, and I wasn't sure the time gained was worth it (something like an 8 min render instead of 10).
I can pack it up so you can test it, if you want.
After doing some tests on hair, I think that if you want to get some interesting effects and good shading on hair, you must be ready to pay the price, aka long render times.
I did a Cornell box render... no Uber... built the area light and camera in ShaderMixer. It took forever (like 10 hrs) to render, but I think I screwed up and had one of the samples at some insane setting... I know I had the overall shading rate at a nearly insane 0.1...
There is a bit of graininess in the corners, because I didn't want to go back and redo it with 'more sane' settings.
But ShaderMixer built items are usually faster than Uber, at least on my system.
shadows are wonky looking.
3delight is unbiased? i get confused between the biased and unbiased, but iirc they have different features.
Just for fun: two Cornell box renders, Luxrender and 3Delight. Luxrender 15 min, 3Delight 3.5 min on a dual core, i.e. 30 and 7 minutes of CPU time respectively.
verry verry interesting. looks like color from the wall is tinting the shadows
Yes, it is...that is the whole point of that particular 'test'...color bleeding is a real life phenomenon and you only get it in 3d with unbiased renderers (Luxrender) or using indirect diffuse/final gather/some other 'trick' in a biased renderer (3Delight).
Biased. The technical difference is how they solve the rendering equation, but for the end-user, it does not really matter.
The most important thing to remember about 3Delight is that it's Renderman-compliant, and there's a lot of resources on Renderman, like this: http://renderman.pixar.com/view/renderman-university - so with enough dedication, almost everything should be possible. When the interface to DS does not get in the way, that is =)
-------------------------
Thanks for the side-by-side =) Lux is on the right, correct? Is it point-cloud with 3Delight or raytraced GI?
Yes, the luxrender is the one that actually renders the light, whereas with 3delight the light itself (the square at the top) remains dark.
huuuh. would Carrara render the ceiling light?
Did you use photon mapping from Shader Mixer or a custom shader (Shader Builder or manually scripted)?
---------------------------
I can't test right now (using my mother's computer again), but I know that light sources can be made invisible in Carrara and still emit light - at least, "shape lights" can. They aren't area lights proper in the sense that you don't apply them to the surface of a mesh, but they operate along the same principles. I haven't used "Anything Glows" much (the equivalent to DS area light shaders).
Carrara's renderer is biased, so I guess the light-emitting surface does not _have_to_ glow, but it can be made to.
Yeah, unless you enable the ambient channel, the light surface will not "glow".
I'm wondering now, is it possible to make a mesh light in LuxRender that would be invisible? I know that in 3Delight, I can make the mesh transparent and it will emit light regardless, so that I can stick invisible light sources wherever I want.
I'm not sure, but I don't think so...I'll run a couple of renders, later. I think the best you can do is make them 'see through'...invisible to the camera, looking through it, but still a visible 'light' if the camera/reflection 'sees' it.
imo, for space station lights, it's handy to see the light panels lit
I packed the shader. It took me some time to make the DUF presets, metadata and license terms. I hope I didn't screw anything up.
See the thread http://www.daz3d.com/forums/discussion/36216/ to get it
I also rendered a Cornell Box. It took under 2 min with the latest 3Delight on my quad-core i7 notebook.
@mjc1016 : my experience with Shader Mixer is also that it is slower than Shader Builder. Still, SM is quick for some tests, or if you need to build something easily.
@Mustakettu85 : From my tests with Luxrender, you can't hide the light. You can put a null shader on the surface, but the object is still lit up, as in real life, as soon as you declare it as emissive.
@Millighost : you say you used photon mapping, but that is not what I get with photons (it's very blurry even with 1024 samples). Did you use indirectdiffuse() with final gather, or is it a technique I don't know?
[Edit] I screwed something up in the shader definitions the first time. I've just updated the package.
And I've found that even though SM can be slower, it's still faster than the Uber line included in DS. The greater the complexity, the more likely SM is going to start bogging down. I'm pretty sure there are ways of tweaking its performance, but I haven't had the chance/time to play around with figuring any out.
And if you could set the compile optimization level for Shader Builder, you could speed it up even more. I've been playing around with setting up the shader in SB, then compiling the source in the stand alone 3DL at -O3 and using that sdl to replace the one SB generated...and on some of my more complex shaders, I've seen a 15% or so improvement in render time. The trick, though, is you need to match the 3DL versions, or include the source in the standalone compile (shaderdl --embed-source).
Thanks guys, I knew there had to be a catch.
...what do you guys think is the right way to make a light illuminate only "flagged" surfaces? It seems logical to handle this in the rendertime script, but I'm not sure it's possible to set other objects' attributes from there.
On the other hand, I was looking at this - http://renderman.pixar.com/resources/current/rps/appnote.29.html - I wonder if 3Delight supports this feature... (provided it is possible to set "matte" object attribute successfully from DS - either I'm doing something wrong with the rendertime script, or it doesn't work at all).