Thanks.
Unfortunately, the BounceGI settings I used for those renders don't work too well with my other presets. :(
IPR is very nice indeed. As you mentioned, it will save a lot of time in light and material setup. I haven't tried it yet to see which settings will trigger a render restart and which ones won't.
It triggers a new render when moving or adding geometry, and also when you assign a new shader. You can change colors/values/maps in the shaders, add/activate/deactivate lights, and move the camera/view without triggering a new render.
Ah, thanks, but I know this page by heart, and it's of no help. I need some other source to understand how a shader may influence DOF results. You mean that when 3Delight knows there is only trace() in the scene, it will use a different DOF sampling algorithm than it would for an "oldschool" scene? Or do pathtracing-enabled renderers always sample DOF in a different way from oldschool raytracers, so that the shaders play a bigger part? I really don't know and can't seem to find an article that explains this. Do you have anything on DOF algorithms specific to path tracers?
---------------
Very cool! The bounce red on the floor is sooo neat.
I guess the chrome is okay, but you need more stuff to reflect in it. If you're using a skydome, maybe a "busier" image in it. I think you realise that you can have a skydome that only reflects and does not influence GI at all (with UberSurface, "raytrace" on, "occlusion" off).
--------------
I've tested the 4.7 build, and I'm happy to report that the subsurface() shadeop is up-to-date now in the sense that it finally can accept parameters without bothering with RiAttributes! So we can get variable albedo at last.
It means that in many more cases, we can do without geometry shells and overlays for makeup and brows. This quickie A3 only shows how variable albedo makes you see the makeup as makeup and not a tattoo (her eyebrows are a geometry overlay specific to Gen3 figures). EDIT: oh yeah, the eyewhites also show how nice it is to have variable albedo. There is no diffuse there, just SSS, and you still see the veins.
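To make the variable-albedo point concrete, here is a minimal numeric sketch (not 3Delight's actual subsurface() API; the function names and all color values are made up for illustration). With a single per-object albedo, as with the old RiAttribute route, every shading point gets the same tint, so painted makeup can't show through the SSS term; with a per-point albedo looked up from a texture, the subsurface result itself carries the painted detail.

```python
# Illustrative sketch: single per-object albedo vs per-point albedo for SSS.
# All numbers are invented for demonstration; this is not 3Delight's API.

def sss_uniform(points, albedo):
    # One albedo for the whole object: every point gets the same tint,
    # so makeup painted in the texture cannot show through the SSS term.
    return [tuple(a * s for a, s in zip(albedo, scatter))
            for _, scatter in points]

def sss_variable(points):
    # Per-point albedo looked up from a texture map: the subsurface
    # term itself carries the painted detail (makeup, brows, veins).
    return [tuple(a * s for a, s in zip(albedo, scatter))
            for albedo, scatter in points]

# (albedo_from_map, scatter_result) pairs for two shading points:
pts = [((0.9, 0.7, 0.6), (1.0, 0.8, 0.7)),   # bare skin texel
       ((0.6, 0.2, 0.3), (1.0, 0.8, 0.7))]   # painted makeup texel

uniform = sss_uniform(pts, (0.9, 0.7, 0.6))
variable = sss_variable(pts)
# With a single albedo the two points come out identical; with the
# per-point map the makeup texel stays tinted inside the SSS result.
```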
But I am going to ask in the dedicated thread how to make render scripts respect the background transparency. As you see here, it's red instead.
Thanks for the links, I will look into them after the "ARES digital net" tonight, which starts in a few minutes. :coolsmile:
Reflecting light sources with default shaders, no.
Reflecting light sources with a switch in 3delight, no.
Faking reflected light sources using more cams, invisible planes, and other "Spaghetti Magic", is the only way. :coolhmm: I get the feeling this is going to be another Microwave Distributed Element Filter "Electron Voodoo" adventure that I may never fully recover from being lost in wires going everywhere. lol. There is a reason I've avoided that "Shader Mixer" at all costs thus far.
Really nice render of A3 there, and that truck looks phenomenal. I gave GI a quick test, and it was good, tho not what I was looking for, so I think the different UE2 modes all have their specific niches. I think 'we' are looking for something different for our respective goals, lol.
(EDIT, pic from Wikipedia. A "Distributed Element Filter" using just PCB traces. "Electron Voodoo")
Basically, unfortunately, yes. As of right now. I believe the 3Delight developers will eventually give us an easier way to do it, but we'll have to wait and see.
And thanks for the nice comment about my A3 =) Not bad for a quick test indeed. But there's more work needed, particularly with the eyes (and maybe not just shader work).
The all-white sphere is intentional. I was getting too many reflections in the chrome, making it brighter than it should be. The sphere is actually the UE2 environment prop, so I can load the various HDRI images, like the Park one I used before.
I've scaled back the GI bounce setting so it will not cause problems with my other presets, but the color bleeding is far less noticeable. Still looks good, and I'll post the render later. I posed 5 V6 figures around the car, along with some clothes with updated fabric presets. The precompute pass does take a while to finish. :)
http://www.daz3d.com/forums/discussion/47663/
Especially with delays in excess of a minute or so. Make an adjustment, run a test render, and wait over half an hour to begin to see the results... There has got to be a better way.
Basic primitive spheres also work for loading latlong panoramas onto.
Five Vickies sounds like a real challenge =) I'm still waiting for the DAZ support decision regarding my editing bits of their script samples in my scripts for redistribution - but I hope they aren't going to just forbid me from releasing anything. It's not like I am building a commercial set for some competing store, it's just a freebie to make DS use the render engine better LOL. So hopefully you'll be able to see this year if the pure raytraced SSS works for you. I haven't yet tried rendering so many figures at once, but with one or two I don't notice that "time to first pixel" delay at all.
What makes me really worried is ShareCG being down way too often these days. I'm thinking of creating an alternative download location for my freebies, at least so that people will be able to access them when ShareCG is misbehaving.
Fake GI on top, BounceGI on bottom.
The fake GI setup took something like 5 min 30 secs, while the BounceGI setup took about 28 min 7 secs. The SSS prepass took about 9 min (8 min 50 secs) with the BounceGI setup, and that's just for the skin. I also use SSS for the fleshy bits (the lacrimals, gums, tongue, and inner mouth share the same group ID; the teeth have their own). Mind you, this is with your script. Without it, it would take much longer.
One note about the GI bounce setup - looks like you don't need to add a geometry shell to have colored shadows. I didn't use it for either render, but looking at the two, using the geometry shell trick can add that little bit of extra touch for the fake GI setup.
Oh yeah. I've finally broken 4 GB of memory usage. :) So adding a character or two, or a complete background, will definitely make you hit the memory limits of those GPU renderers. Unless you have enough money and power budget to use models with 6 GB or more. But then you'll hit the limit again if you use more characters/props.
A more elaborate fake GI setup. Previously I was just using two UberArea light planes. This time, I'm using a modified version of my newest light set, with 6 UberArea light planes arranged in a semicircle. I did have to scale the planes and move them back a little (by simply scaling the parented null). Then I adjusted the intensity.
Again, it's not the same as true BounceGI, but the results look close enough and render quite fast at 2 min 55 secs. The set is tuned for performance, so there's still a bit of noise (I think I was using 24 samples for the previous render; this one was set at 8). With the samples raised to 24, render time is 6 min 35 secs.
That's a brilliant render, Wowie. They seem to be like, "I am driving this beauty! - No, you are not, I am! See, I even took my shoes off! - Wait, you two! I am driving, and I took my shoes off first!"
You mean, the coloured shadows on the hair? I guess if you keep the hair included in the GI calculation, colour bleed will do it right. But it takes more time, of course, than excluding hair from GI and doing the shell trick.
As of right now, if the hair model intersects the figure mesh, it may also lead to small artefacts with the raytraced SSS (when the GI is not disabled on hair). I believe this should be fixed with the 11.0.112 3Delight build. I also think that careful positioning of the hair and/or enabling smoothing on it can also help with that, but I haven't had time for many tests with GI enabled on hair.
I think HD morphs and smoothing will also increase memory usage, so you could hit the GPU memory limit even sooner than with five characters.
Yes. You don't need to do the hair trick with GI, since you'll have color bleeding anyway. With GI, I think you have to go the other way, adding a geometry shell to 'force' occlusion-like effects (a shell below the hair, acting as a second shadow blocker). But that's the great thing about using biased renderers - you have a lot of options to get the look you want (and not necessarily correct or accurate results).
Outside of those tricks though, the MAT presets I've made seem to do a very good job of maintaining consistency between rendering methods and lighting setups. I do find it ridiculous that people still want different MATs for different light scenarios. Surface properties don't change regardless of how you light your objects. :) But I guess old, bad habits die hard.
Technically, HD morphs look very good, but they add a lot to render times, so I prefer to add those kinds of details with displacement. Smoothing doesn't inflate memory consumption like HD morphs do, though, since you don't actually load new morph deltas for the subd model. I don't think GPU renderers have support for OpenSubDiv yet, but I could be wrong.
Had to do some real life stuff yesterday and got a good look at cars and car paints. Looking back at the renders, I've decided to change the car paint preset a bit. Changed it to be more 'glossy' and updated the fresnel settings. The reflection looks more like the BounceGI render now.
Oh yeah, that's a sad, sad fact. Also sad that people are so hard to persuade to use gamma correction (even though in DS it's so easy now).
I love Gamma Correction now that I understand the "why"
Old car paint vs new car paint. Plus some renders with KH Park and a section of City Roads. Had to set the ambient and diffuse strength of the UE2 environment ball to 100% to get a very bright environment.
The lights are the fake GI setup.
I believe the solution is simple - trust the renders not what you're seeing in the viewport. I generally work with the preview lights turned off. Now that there's IPR, it's even easier. Note - turning on/off an UberArea light will trigger a re-render.
I think y'all missed the point. Or do you just blindly throw stuff in scenes and trust the dial numbers without using the viewport at all?
So there is a fix in the works to make things better. That is good news. Not having things appear in the View-port as they will in the render defeats most of the goodness of sticking to defaults. Not being able to see anything I'm doing is just completely useless.
(EDIT)
In order for a scene to look close to correct with GC on, the lights need to be cranked down below ten percent intensity. At those levels, everything in the view-port is practically pitch black.
The viewport was never WYSIWYG, which was something I learnt very quickly, so I learnt to light scenes and set up materials via test rendering. ;)
Ditto. For me the viewport is basically only used to arrange the scene, never for judging the lights beyond light position. Eventually one gets better at the lighting setup, and nowadays arranging the lights goes really quickly. Not to mention that many shaders look awful in the viewport as well. That is and always was the nature of the OpenGL viewport, and I got used to it.
It is not a fix. It's an added feature.
As much as I'd like DS to implement a what-you-see-is-what-you-get viewport, I don't think there's any 3D application that has such a feature. The only apps I know that offer something like that are game engine kits, i.e. CryEngine, Unreal etc. Maybe Marmoset, but I wouldn't call that a complete 3D app like DS, Max or Maya.
Even with the latest advances, there's only so much you can do with current GPUs to 'mimic' what can be done in renderers. Particularly at interactive frame rates.
Actually, I think the opposite is true.
I'm curious, what do you mean by 'correct'? Linear workflow (using gamma correction and setting a target gamma) generally means textures are converted to linear color space, with the target gamma used for output. The end result is that textures will render darker, not brighter.
Okay, I see. We (aka the 3D community at large) really, really need an in-depth tech-oriented interview with someone like Aghiles K. to see how the "new" 3Delight grew out of the old one. And where exactly the watershed lies.
But, well, let's forget about the DragonSlayer scene. It's basically a black box, ain't it... What do your own shaders show? Any DOF smoothness difference between shaders using trace(), bsdf(), specular()/diffuse() or custom illuminance loops?
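For reference while comparing shaders, here is a hedged sketch of the textbook thin-lens DOF sampling that generic raytracers (path tracers included) use on the camera side; it doesn't answer how 3Delight's shaders influence the result, but it shows the mechanism being discussed: each pixel sample picks a point on the lens aperture and re-aims the ray through the in-focus point, so out-of-focus geometry blurs. All names here are illustrative, not 3Delight internals.

```python
# Generic thin-lens DOF sampling sketch (camera space, lens at the origin
# looking down +z). Illustrative only; not 3Delight's implementation.
import math
import random

def sample_lens(radius):
    # Uniformly sample a point on a disc of the given aperture radius.
    r = radius * math.sqrt(random.random())
    theta = 2.0 * math.pi * random.random()
    return (r * math.cos(theta), r * math.sin(theta))

def dof_ray(pixel_dir, focal_distance, aperture_radius):
    # Point on the focal plane that the undisturbed pinhole ray would hit:
    focus = tuple(focal_distance * c for c in pixel_dir)
    # Jitter the ray origin across the aperture and re-aim at that point:
    lens = sample_lens(aperture_radius)
    origin = (lens[0], lens[1], 0.0)
    d = (focus[0] - origin[0], focus[1] - origin[1], focus[2] - origin[2])
    n = math.sqrt(sum(c * c for c in d))
    return origin, tuple(c / n for c in d)

# With aperture_radius == 0 this degenerates to a pinhole camera:
origin, direction = dof_ray((0.0, 0.0, 1.0), 10.0, 0.0)
```

The open question in the thread is whether the sampler distributing these lens samples changes depending on what the shaders do (pure trace()/bsdf() vs. oldschool illuminance loops), which this sketch deliberately doesn't claim to answer.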
-------------
In all honesty, Zarcon, would you please listen to "us oldtimers" and please ditch that viewport dependency thing already! LOL The viewport is only there to show you _where_ to put stuff, not "how" it is going to look. The OpenGL renderer... well, it's not even Gamebryo we're talking about in DS (ever played TESIV:Oblivion or The Guild II? Remember that wacky, quirky, glitchy, seemingly hardly ever optimised engine? That was Gamebryo). The viewport wasn't ever close to the "software" (aka 3Delight) render in DS, and I was there when DS was in version 2-something... i.e. way before automagical GC was ever a thing.
Did you like my Aiko test? Did you know _how_ it looks in the viewport?! Nothing like the finished render.
Didn't try. I haven't used DS a lot these last months. I sometimes play with some shader writing, but not much.
Alright then.
So you haven't looked into this new background issue with scripted rendering? I can't seem to figure it out on my own, and my questions regarding this get either overlooked or ignored in the release threads (I'm about to start taking it personally).
Might be a better idea to send a ticket to the Tech department, as that thread was moving so fast it probably got lost in the haze.
My guess: the DAZ team made some change in that area.
Taken from the changelog:
These could affect the scripted rendering. I guess there are some Scripting API changes for these, and Rbtwhiz answered me in the beta thread that he didn't have time to update. So you may have to wait.
Yes, thanks, I noticed that. I was just hoping you may have figured it out by yourself. It's possible to simply make new controls in the render script (bypassing the whole Backdrop/Viewport mess), but still. This ongoing problem with documentation being updated years after "introducing a feature" is becoming very frustrating. I guess it's one of the "payment" options for free software...
From the DS release thread:
Looks like this is classified info, Araneldon! =D But if you aren't averse to bypassing the whole backdrop thing (setting up your own controls in the render script instead), there is a way out, I believe.
I invite both of you to discuss the "scripted renderer" thing in the "laboratory" thread here: http://www.daz3d.com/forums/discussion/21611/
Thank you for the invitation :) Taming the Scripted Renderer -- for multipass rendering among other things -- is one of my goals. I've only recently started looking under the hood so to speak but haven't gotten very far yet.
Hi, everybody. I guess I've come late to this party (lol). I am trying to learn lighting and how to get good skin tone. Sigh... I am posting a render I just did, and I need all the help I can get to fix this. I am trying to read the thread, but there's a LOT of info and my brain is having fits. So any quick tips would be appreciated whilst I'm reading. I did read a few recent posts and I'm clueless as to the gamma correction - where is that? Anyway, here's my render. I used 2 spotlights and the skin texture for Meridiana. Thank you
Kalazar