3Delight Surface and Lighting Thread


Comments

  • wowie Posts: 2,029


    Was just testing the car paint and IPR render. Very effective, and you gain a lot of time for light and material adjustment.

    I like your last renders of the car better.

    Thanks.

    Unfortunately, the BounceGI settings I used for those renders don't work too well with my other presets. :(

    IPR is very nice indeed. As you mentioned, it will save a lot of time in light and material setup. I haven't tried it enough yet to see which settings will trigger a render restart and which ones won't.

  • Takeo.Kensei Posts: 1,303
    edited November 2014

    It triggers a new render when moving or adding geometry, and also when you assign a new shader. You can change colors/values/maps in the shaders, add/activate/deactivate lights, and move the camera/view without triggering a new render.

    Post edited by Takeo.Kensei on
  • Mustakettu85 Posts: 2,933
    edited November 2014


    Sorry about the 'double' there. Is there a layman's explanation for the interaction between the three parts? What is that plane doing? Is it a light source of sorts?


    I second Wowie's suggestion about watching the tutorials. They don't tell you "how it works", but explain "how to do it" in a more or less clear way.

    "How it works". I am not sure I am the best person to ask for laymen explanations, but I'll try.

    The light needs to have a special option so as to cast not just "normal light", but photons (they're separate things in this case, it's not the real world). Default DS lights don't have it. You can set it with scripts for some, but shader mixer lights should be easier for a non-tech oriented user: they always have this option on.

    Then, the camera. It gives the renderer additional commands to actually make and store the "photon map", and it lets you control its quality (number of photons).

    The special materials (again, shader mixer is just the easiest way to get them without actual coding) actually tell the photons what it was that they hit ("matte" = the photons will get absorbed; "glass"/"water" = they will get refracted and reflected; "chrome" = photons reflect but do not refract through the material; "transparent" = photons go right through).

    If you want your mirrors to throw lights around (like kids do with mirrors when playing with kittens, right?), they should be "chrome".
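    (If you're curious what this looks like under the hood, here is a bare-bones RIB sketch of those same three pieces, written along the lines of the RenderMan-style photon attributes that 3Delight understands. I'm spelling the names from memory, so treat them as assumptions rather than gospel - in DS, the shader mixer lights, camera and materials emit the equivalent calls for you.)

        # "Camera" part: request a photon pass and say how many photons to shoot (quality).
        Hider "photon" "int emit" [1000000]

        # The light: allow it to cast photons, not just "normal light".
        AttributeBegin
            Attribute "light" "string emitphotons" ["on"]   # attribute name is my assumption
            LightSource "pointlight" 1 "intensity" [50]
        AttributeEnd

        # The materials: say where to store the photons and what they hit.
        AttributeBegin
            Attribute "photon" "string globalmap" ["scene.gpm"]
            Attribute "photon" "string shadingmodel" ["chrome"]   # or "matte", "glass", "water", "transparent"
            Sphere 1 -1 1 360
        AttributeEnd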


    -----------------


    Ah, thanks, but I know this page by heart, and it's of no help. I need some other source to understand how a shader may influence DOF results. You mean that when 3Delight knows there is only trace() in the scene, it will use a different DOF sampling algo than it would use if it were an "oldschool" scene?? Or do pathtracing-enabled renderers always sample DOF in a different way from oldschool raytracers, and so the shaders play a bigger part?? I really don't know and can't seem to find an article that explains this. Do you have anything on DOF algos specific to path tracers?


    ---------------

    Almost finished. Still need to tweak the glass and probably the chrome.

    Very cool! The bounce red on the floor is sooo neat.

    I guess the chrome is okay, but you need more stuff to reflect in it. If you're using a skydome, maybe a "busier" image in it. I think you realise that you can have a skydome that only reflects and does not influence GI at all (with UberSurface, "raytrace" on, "occlusion" off).


    --------------

    I've tested the 4.7 build, and I'm happy to report that the subsurface() shadeop is up-to-date now in the sense that it finally can accept parameters without bothering with RiAttributes! So we can get variable albedo at last.

    It means that in many more cases, we can do without geometry shells and overlays for makeup and brows. This quickie A3 only shows how variable albedo makes you see the makeup as makeup and not a tattoo (her eyebrows are a geometry overlay specific to Gen3 figures). EDIT: oh yeah, the eyewhites also show how nice it is to have variable albedo. There is no diffuse there, just SSS, and you still see the veins.
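    (For the shader-writing folks, a minimal RSL sketch of what "variable albedo" means in practice. The parameter names - "albedo", "diffusemeanfreepath" - are how I remember them from the 3Delight docs, so double-check before copying, and the map name is obviously made up.)

        surface sss_variable_albedo(
            string albedoMap = "";                    /* e.g. a makeup/brow control map - hypothetical */
            color  dmfp      = color(8.5, 5.5, 3.9);  /* diffuse mean free path, in scene units */
        )
        {
            /* albedo can now vary per shading point instead of being one RiAttribute per object */
            color alb = 0.8;
            if (albedoMap != "")
                alb = color texture( albedoMap );
            Ci = subsurface( P, N, "albedo", alb, "diffusemeanfreepath", dmfp );
            Oi = Os;
        }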

    But I am going to ask in the dedicated thread how to make render scripts respect the background transparency. As you see here, it's red instead.

    a3test47.png
    600 x 600 - 501K
    Post edited by Mustakettu85 on
  • ZarconDeeGrissom Posts: 5,414
    edited November 2014

    Thanks for the links, I will look into them after the "ARES digital net" tonight, which starts in a few minutes. :coolsmile:
    Reflecting light sources with default shaders, no.
    Reflecting light sources with a switch in 3delight, no.
    Faking reflected light sources using more cams, invisible planes, and other "Spaghetti Magic", is the only way. :coolhmm: I get the feeling this is going to be another Microwave Distributed Element Filter "Electron Voodoo" adventure that I may never fully recover from being lost in wires going everywhere. lol. There is a reason I've avoided that "Shader Mixer" at all costs thus far.


    Really nice render of A3 there, and that truck looks phenomenal. I gave GI a quick test, and it was good, though not what I was looking for, so I think the different UE2 modes all have their specific niches. I think 'we' are looking for something different for our respective goals, lol.
    (EDIT, pic from Wikipedia. A "Distributed Element Filter" using just PCB traces. "Electron Voodoo")

    1024px-Microstrip_Hairpin_Filter_And_Low_Pass_Stub_Filter.jpg
    1024 x 393 - 80K
    Post edited by ZarconDeeGrissom on
  • Mustakettu85 Posts: 2,933


    Faking reflected light sources using more cams, invisible planes, and other "Spaghetti Magic", is the only way. :coolhmm:

    Basically, unfortunately, yes. As of right now. I believe the 3Delight developers will eventually give us an easier way to do it, but we'll have to wait and see.

    And thanks for the nice comment about my A3 =) Not bad for a quick test indeed. But there's more work needed, particularly with the eyes (and maybe not just shader work).

  • wowie Posts: 2,029
    edited November 2014

    The all-white sphere is intentional. I was getting too much reflection in the chrome, making it brighter than it should be. The sphere is actually the UE2 environment prop, so I can load the various HDRI images, like the Park one I used before.

    I"ve scaled back the GI bounce setting so it will not cause problema with my other presets, but the color bleeding is far less noticeable. Still looks good and I"ll post the render later. I posed 5 V6 around the car along with some clothes with updared fabric presets. The precompute pass does take awhile to finish. :)

    Post edited by wowie on
  • ZarconDeeGrissom Posts: 5,414

    wowie said:
    The all-white sphere is intentional. I was getting too much reflection in the chrome, making it brighter than it should be. The sphere is actually the UE2 environment prop, so I can load the various HDRI images, like the Park one I used before.

    I've scaled back the GI bounce setting so it will not cause problems with my other presets, but the color bleeding is far less noticeable. Still looks good, and I'll post the render later. I posed five V6 figures around the car, along with some clothes with updated fabric presets. The precompute pass does take a while to finish. :)

    Hmmm. I tend to find that impossible to work with when trying to set up lights (intensities, angles, etc.), to the point that I gave up using some figures altogether.
    http://www.daz3d.com/forums/discussion/47663/
    Especially with delays in excess of a minute or so. Make an adjustment, run a test render, and wait over half an hour to begin to see the results... There has got to be a better way.
  • Mustakettu85 Posts: 2,933

    wowie said:
    The all-white sphere is intentional. I was getting too much reflection in the chrome, making it brighter than it should be. The sphere is actually the UE2 environment prop, so I can load the various HDRI images, like the Park one I used before.

    I've scaled back the GI bounce setting so it will not cause problems with my other presets, but the color bleeding is far less noticeable. Still looks good, and I'll post the render later. I posed five V6 figures around the car, along with some clothes with updated fabric presets. The precompute pass does take a while to finish. :)

    Basic primitive spheres also work for loading lat-long panoramas onto.

    Five Vickies sounds like a real challenge =) I'm still waiting for the DAZ support decision regarding my editing bits of their script samples in my scripts for redistribution - but I hope they aren't going to just forbid me from releasing anything. It's not like I am building a commercial set for some competing store, it's just a freebie to make DS use the render engine better LOL. So hopefully you'll be able to see this year if the pure raytraced SSS works for you. I haven't yet tried rendering so many figures at once, but with one or two I don't notice that "time to first pixel" delay at all.

    What really worries me is ShareCG being down way too often these days. I'm thinking of creating an alternative download location for my freebies, at least so that people will be able to access them when ShareCG is misbehaving.

  • wowie Posts: 2,029
    edited November 2014

    Hmmm. I tend to find that impossible to work with when trying to set up lights (intensities, angles, etc.), to the point that I gave up using some figures altogether.
    http://www.daz3d.com/forums/discussion/47663/
    Especially with delays in excess of a minute or so. Make an adjustment, run a test render, and wait over half an hour to begin to see the results... There has got to be a better way.

    Well, you can cheat a bit. Always use a quick preview setup for testing. I normally don't use indirect lighting or GI bounce, and prefer direct lighting with a fake bounce light or an ambient/AO light to set up materials and try new tricks. If you set up your materials correctly, they will look correct even with just direct lighting.

    Without SSS, my direct lighting and fake GI setup renders the car in about 45 to 60 seconds. Of course, I'm using a mid-end desktop processor (Core i7 4770), but even with my older Phenom II X955, it won't take more than 3 to 5 minutes.

    The IPR features should make this a lot easier and faster. As Takeo pointed out, IPR doesn't have to re-render the image if you're just tweaking materials.

    Another way to cheat the SSS precompute pass is to use the same group ID for all figures. General skin properties actually don't vary too much between various shades of light skin. For dark skin, just put the diffuse map in the SSS color channel slot. It will act as a good SSS control map and tone down the SSS strength at the same time. I've found the only time I needed to make a new SSS profile was to get pale/pinkish and yellowish skin (particularly for textures that have too much baked-in color).

    Yet another alternative is to swap the DS default lights for AoA's lights, which have a different sample setting for SSS. Unfortunately, this only applies to the point/spot/distant lights, since there's no GI mode for the Advanced Ambient Light. That's why I'm really looking forward to the raytraced SSS.


    Five Vickies sounds like a real challenge =)
    -snip-
    So hopefully you'll be able to see this year if the pure raytraced SSS works for you. I haven't yet tried rendering so many figures at once, but with one or two I don't notice that "time to first pixel" delay at all.

    Fake GI on top, BounceGI on bottom.

    The fake GI setup took something like 5 min 30 secs, while the BounceGI setup took about 28 min 7 secs. The SSS prepass took about 9 min (8 min 50 secs) with the BounceGI setup, and that's just for the skin. I also use SSS for the fleshy bits (the lacrimals, gums, tongue and inner mouth share the same group ID; the teeth have their own). Mind you, this is with your script. Without it, it would take much longer.

    One note about the GI bounce setup - looks like you don't need to add a geometry shell to have colored shadows. I didn't use it for either render, but looking at the two, using the geometry shell trick can add that little bit of extra touch for the fake GI setup.

    Oh yeah. I've finally broken 4 GB of memory usage. :) So adding a character or two, or a complete background, will definitely make you hit the memory limits of those GPU renderers. Unless you have enough money and power budget to use models with 6 GB or more. But then you'll hit it again if you use more characters/props.

    bounceGI.jpg
    1280 x 720 - 564K
    fake.jpg
    1280 x 720 - 549K
    Post edited by wowie on
  • wowie Posts: 2,029
    edited November 2014

    A more elaborate fake GI setup. Previously I was just using two UberArea light planes. This time, I'm using a modified version of my newest light set, with six UberArea light planes arranged in a semicircle. I did have to scale the planes and move them back a little (by simply scaling the parented null). Then I adjusted the intensity.

    Again, it's not the same as true BounceGI, but the results look close enough and render quite fast, at 2 min 55 secs. The set is tuned for performance, so there's still a bit of noise (I think I was using 24 samples for the previous render; this one was set at 8). With the samples raised to 24, render time is 6 min 35 secs.

    fakeGI.jpg
    1280 x 720 - 570K
    Post edited by wowie on
  • Mustakettu85 Posts: 2,933

    wowie said:

    One note about the GI bounce setup - looks like you don't need to add a geometry shell to have colored shadows. I didn't use it for either render, but looking at the two, using the geometry shell trick can add that little bit of extra touch for the fake GI setup.

    Oh yeah. I've finally broken 4 GB of memory usage. :) So adding a character or two, or a complete background, will definitely make you hit the memory limits of those GPU renderers. Unless you have enough money and power budget to use models with 6 GB or more. But then you'll hit it again if you use more characters/props.

    That's a brilliant render, Wowie. They seem to be like, "I am driving this beauty! - No, you are not, I am! See, I even took my shoes off! - Wait, you two! I am driving, and I took my shoes off first!"

    You mean, the coloured shadows on the hair? I guess if you keep the hair included in the GI calculation, colour bleed will do it right. But it takes more time, of course, than excluding hair from GI and doing the shell trick.

    As of right now, if the hair model intersects the figure mesh, it may also lead to small artefacts with the raytraced SSS (when GI is not disabled on the hair). I believe this should be fixed with the 11.0.112 3Delight build. I also think that careful positioning of the hair and/or enabling smoothing on it can help with that, but I haven't had time for many tests with GI enabled on hair.

    I think HD morphs and smoothing will also increase memory usage, so the GPU memory limit could be hit even sooner than at five characters.

  • wowie Posts: 2,029
    edited November 2014


    That's a brilliant render, Wowie. They seem to be like, "I am driving this beauty! - No, you are not, I am! See, I even took my shoes off! - Wait, you two! I am driving, and I took my shoes off first!"

    Thanks. :)
    I didn't add shoes, since those things usually require you to adjust the figure's Y position (or the hip). Viewport performance was quite sluggish with all those objects in view. I had to turn off two hair props just to get more fluid frame rates while moving the camera around.


    You mean, the coloured shadows on the hair? I guess if you keep the hair included in the GI calculation, colour bleed will do it right. But it takes more time, of course, than excluding hair from GI and doing the shell trick.

    Yes. You don't need to do the hair trick with GI, since you'll have color bleeding anyway. With GI, I think you have to go the other way, adding a geometry shell to 'force' occlusion-like effects (a shell below the hair, acting as a second shadow blocker). But that's the great thing about using biased renderers - you have a lot of options to get the look you want (and not necessarily correct or accurate results).

    Outside of those tricks though, the MAT presets I've made seem to do a very good job of maintaining consistency between rendering methods and lighting setups. I do find it ridiculous people still want different MATs for different light scenarios. Surface properties don't change regardless of how you light your objects. :) But I guess old, bad habits die hard.


    I think HD morphs and smoothing will also increase memory usage, so the GPU memory limit could be hit even sooner than at five characters.

    Technically, HD morphs look very good, but they add a lot to render times, so I prefer to add those kinds of details with displacement. Smoothing doesn't inflate memory consumption like HD morphs do, though, since you don't actually load new morph deltas for the subd model. I don't think GPU renderers have support for OpenSubDiv yet, but I could be wrong.

    Had to do some real-life stuff yesterday and got a good look at cars and car paints. Looking back at the renders, I've decided to change the car paint preset a bit. I changed it to be more 'glossy' and updated the Fresnel settings. The reflection looks more like the BounceGI render now.

    Post edited by wowie on
  • Mustakettu85 Posts: 2,933

    wowie said:

    I do find it ridiculous people still want different MATs for different light scenarios. Surface properties don't change regardless of how you light your objects. :) But I guess old, bad habits die hard.

    Oh yeah, that's a sad, sad fact. Also sad that people are so hard to persuade to use gamma correction (even though in DS it's so easy now).

  • Szark Posts: 10,634

    I love Gamma Correction now that I understand the "why"

  • wowie Posts: 2,029
    edited November 2014

    Old car paint vs new car paint. Plus some renders with KH Park and a section of City Roads. I had to set the ambient and diffuse strength of the UE2 environment ball to 100% to get a very bright environment.

    The lighting is the fake GI setup.

    oldHDRI3.jpg
    1280 x 720 - 786K
    oldHDRI2.jpg
    1280 x 720 - 855K
    oldHDRI.jpg
    1280 x 720 - 741K
    new.jpg
    1280 x 720 - 451K
    old.jpg
    1280 x 720 - 454K
    Post edited by wowie on
  • ZarconDeeGrissom Posts: 5,414
    edited November 2014

    wowie said:

    I do find it ridiculous people still want different MATs for different light scenarios. Surface properties don't change regardless of how you light your objects. :) But I guess old, bad habits die hard.

    Oh yeah, that's a sad, sad fact. Also sad that people are so hard to persuade to use gamma correction (even though in DS it's so easy now).

    I love Gamma Correction now that I understand the "why"

    With the deepest and utmost respect...
    http://www.daz3d.com/forums/discussion/43316/
    It can be near impossible to see what you're doing with the lights dimmed to a good render level that way. I have many other things on my must-learn list, so it will wait a tad longer.
    Post edited by ZarconDeeGrissom on
  • wowie Posts: 2,029
    edited November 2014

    With the deepest and utmost respect...
    http://www.daz3d.com/forums/discussion/43316/
    It can be near impossible to see what you're doing with the lights dimmed to a good render level that way. I have many other things on my must-learn list, so it will wait a tad longer.

    I believe the solution is simple - trust the renders, not what you're seeing in the viewport. I generally work with the preview lights turned off. Now that there's IPR, it's even easier. Note - turning an UberArea light on or off will trigger a re-render.

    Post edited by wowie on
  • Takeo.Kensei Posts: 1,303
    edited November 2014

    Ah, thanks, but I know this page by heart, and it's of no help. I need some other source to understand how a shader may influence DOF results. You mean that when 3Delight knows there is only trace() in the scene, it will use a different DOF sampling algo than it would use if it were an "oldschool" scene?? Or do pathtracing-enabled renderers always sample DOF in a different way from oldschool raytracers, and so the shaders play a bigger part?? I really don't know and can't seem to find an article that explains this. Do you have anything on DOF algos specific to path tracers?

    All the questions about the algos should be addressed to the 3Delight team. However, what the changelog suggests is that the 3Delight team must have done something to the sampling algorithm used when combining DOF and path tracing, so that you get less noise. As for old-school shaders, I don't think that activating the Raytrace Hider will magically make them do path tracing, unless the 3DL team did something in that direction - and I didn't see anything in the changelog suggesting they did. Testing the Dragon Slayer scene with DOF and the Raytrace Hider activated confirmed that there was no benefit with the latest DS beta.

    wowie said:

    I do find it ridiculous people still want different MATs for different light scenarios. Surface properties don't change regardless of how you light your objects. :) But I guess old, bad habits die hard.

    Oh yeah, that's a sad, sad fact. Also sad that people are so hard to persuade to use gamma correction (even though in DS it's so easy now).

    I love Gamma Correction now that I understand the "why"

    With the deepest and utmost respect...
    http://www.daz3d.com/forums/discussion/43316/
    It can be near impossible to see what you're doing with the lights dimmed to a good render level that way. I have many other things on my must-learn list, so it will wait a tad longer.

    As some suggested, get the latest beta and use IPR rendering. You could also add an UberEnvironment light to light up the whole scene a bit and disable it for the final render.
    Post edited by Takeo.Kensei on
  • ZarconDeeGrissom Posts: 5,414
    edited November 2014

    I think y'all missed the point. Or do you just blindly throw stuff in scenes and trust the dial numbers, without using the View-port at all?

    So there is a fix in the works to make things better. That is good news. Not having things appear in the View-port as they will in the render defeats most of the goodness of sticking to defaults. Not being able to see anything I'm doing is just completely useless.
    (EDIT)
    In order for a scene to look close to correct with GC on, the lights need to be cranked down below ten percent intensity. At those levels, everything in the view-port is practically pitch black.

    Post edited by ZarconDeeGrissom on
  • Szark Posts: 10,634

    The viewport was never WYSIWYG, which was something I learnt very quickly, and I learnt to light scenes and set up materials via test rendering. ;)

  • Renpatsu Posts: 828
    edited November 2014

    Szark said:
    The viewport was never WYSIWYG, which was something I learnt very quickly, and I learnt to light scenes and set up materials via test rendering. ;)

    Ditto. For me, the viewport is basically only used to arrange the scene, and never for judging the lights beyond light position. Eventually one gets better at the lighting set-up, and nowadays arranging the lights goes really quickly. Not to mention that many shaders look awful in the viewport as well. That is and always was the nature of the OpenGL viewport, and I got used to it.

    Post edited by Renpatsu on
  • wowie Posts: 2,029
    edited November 2014

    I think y'all missed the point. Or do you just blindly throw stuff in scenes and trust the dial numbers, without using the View-port at all?

    The viewport generally does a good job with DS standard lights, but it doesn't provide any feedback if you use UberArea and/or UE2 lights, although those lights will still be processed by the renderer.


    So there is a fix in the works to make things better. That is good news. Not having things appear in the View-port as they will in the render defeats most of the goodness of sticking to defaults. Not being able to see anything I'm doing is just completely useless.

    It is not a fix. It's an added feature.

    As much as I would like DS to implement a what-you-see-is-what-you-get viewport, I don't think there's any 3D application that has such a feature. The only apps I know that offer something like that are game engine kits, i.e. CryEngine, Unreal, etc. Maybe Marmoset, but I wouldn't call that a complete 3D app like DS, Max or Maya.

    Even with the latest advances, there's only so much you can do with current GPUs to 'mimic' what can be done in renderers. Particularly at interactive frame rates.


    In order for a scene to look close to correct with GC on, the lights need to be cranked down below ten percent intensity. At those levels, everything in the view-port is practically pitch black.

    Actually, I think the opposite is true.

    I'm curious, what do you mean by 'correct'? A linear workflow (using gamma correction and setting a target gamma) generally means textures are converted to linear color space, with the target gamma used for the output. The end result is that textures will render darker, not brighter.
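    (To put a number on "darker": with the usual target gamma of 2.2, a mid-grey texture value of 0.5 gets linearised to 0.5^2.2 ≈ 0.22 before shading, while the finished render is encoded back out with an exponent of 1/2.2 for display.)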

    Post edited by wowie on
  • Mustakettu85 Posts: 2,933

    wowie said:
    Old car paint vs new car paint. Plus some renders with KH Park and a section of City Roads. I had to set the ambient and diffuse strength of the UE2 environment ball to 100% to get a very bright environment.

    The lighting is the fake GI setup.

    Ohhh how neat. I'm not actually much of a car lover, but these, and the last render in particular, are awesome.

    And yeah, Omnifreaker's HDR maps are quite dark.

    -------------


    All the questions about the algos should be addressed to the 3Delight team. However, what the changelog suggests is that the 3Delight team must have done something to the sampling algorithm used when combining DOF and path tracing, so that you get less noise. As for old-school shaders, I don't think that activating the Raytrace Hider will magically make them do path tracing, unless the 3DL team did something in that direction - and I didn't see anything in the changelog suggesting they did. Testing the Dragon Slayer scene with DOF and the Raytrace Hider activated confirmed that there was no benefit with the latest DS beta.

    Okay, I see. We (aka the 3D community at large) really, really need an in-depth, tech-oriented interview with someone like Aghiles K. to see how the "new" 3Delight grew out of the old one. And where exactly the watershed lies.

    But, well, let's forget about the DragonSlayer scene. It's basically a black box, ain't it... What do your own shaders show? Any DOF smoothness difference between shaders using trace(), bsdf(), specular()/diffuse() or custom illuminance loops?
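    (Just so we're all picturing the same thing, a throwaway RSL fragment contrasting the two styles - nobody's actual production shader, purely illustrative:)

        /* "oldschool": direct lighting gathered with an illuminance loop */
        normal Nf = faceforward( normalize(N), I );
        color direct = 0;
        illuminance( P, Nf, PI/2 )
        {
            direct += Cl * ( normalize(L) . Nf );
        }

        /* "new school": let the raytracer do the work, e.g. a traced reflection */
        vector R = reflect( normalize(I), Nf );
        color refl = trace( P, R );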

    -------------


    In order for a scene to look close to correct with GC on, the lights need to be cranked down below ten percent intensity. At those levels, everything in the view-port is practically pitch black.

    In all honesty, Zarcon, would you please listen to "us oldtimers" and please ditch that viewport dependency thing already! LOL The viewport is only there to show you _where_ to put stuff, not "how" it is going to look. The OpenGL renderer... well, it's not even Gamebryo we're talking about in DS (ever played TESIV:Oblivion or The Guild II? Remember that wacky, quirky, glitchy, seemingly hardly ever optimised engine? That was Gamebryo). The viewport wasn't ever close to the "software" (aka 3Delight) render in DS, and I was there when DS was in version 2-something... i.e. way before automagical GC was ever a thing.

    Did you like my Aiko test? Did you know _how_ it looks in the viewport?! Nothing like the finished render.

  • Takeo.Kensei Posts: 1,303



    All the questions about the algos should be addressed to the 3Delight team. However, what the changelog suggests is that the 3Delight team must have done something to the sampling algorithm used when combining DOF and path tracing, so that you get less noise. As for old-school shaders, I don't think that activating the Raytrace Hider will magically make them do path tracing, unless the 3DL team did something in that direction - and I didn't see anything in the changelog suggesting they did. Testing the Dragon Slayer scene with DOF and the Raytrace Hider activated confirmed that there was no benefit with the latest DS beta.

    Okay, I see. We (aka the 3D community at large) really, really need an in-depth, tech-oriented interview with someone like Aghiles K. to see how the "new" 3Delight grew out of the old one. And where exactly the watershed lies.

    But, well, let's forget about the DragonSlayer scene. It's basically a black box, ain't it... What do your own shaders show? Any DOF smoothness difference between shaders using trace(), bsdf(), specular()/diffuse() or custom illuminance loops?

    Didn't try. I haven't used DS a lot these last few months. I sometimes play with some shader writing, but not much.

  • Mustakettu85 Posts: 2,933

    Didn't try. I haven't used DS a lot these last few months. I sometimes play with some shader writing, but not much.

    Alright then.

    So you haven't looked into this new background issue with scripted rendering? I can't seem to figure it out on my own, and my questions regarding it get either overlooked or ignored in the release threads (I'm about to begin to take it personally).

  • Szark Posts: 10,634

    Might be a better idea to send a ticket to the Tech department, as that thread was moving so fast that it probably got lost in the haze.

  • Takeo.Kensei Posts: 1,303

    Didn't try. I haven't used DS a lot these last few months. I sometimes play with some shader writing, but not much.

    Alright then.

    So you haven't looked into this new background issue with scripted rendering? I can't seem to figure it out on my own, and my questions regarding it get either overlooked or ignored in the release threads (I'm about to begin to take it personally).

    My guess: the DAZ team made some changes in that area.

    Taken from the changelog:

    -Moved setting of the background color for a scene (render) to the backdrop; viewport color (interface styling) and backdrop color (rendering) are now separate settings
    -Changed the labeling of Background Color options on the Scene page of the Preferences dialog, to reflect the recent separation of Viewport Color (interface) from Backdrop Color (scene)

    These could affect scripted rendering. I guess there are some scripting API changes for these, and Rbtwhiz answered me in the beta thread that he didn't have time to update. So you may have to wait.

  • Mustakettu85 Posts: 2,933

    Szark said:
    Might be a better idea to send a ticket to the Tech department, as that thread was moving so fast that it probably got lost in the haze.

    I already have a support ticket regarding my free stuff, and it's not moving fast. So opening another one might not be a good idea - I guess requests are prioritised, and issues that are not immediately relevant to a large number of paying customers are processed last. Quite a logical thing to do business-wise.

    ------------


    My guess: the DAZ team made some changes in that area.

    Taken from the changelog:

    -Moved setting of the background color for a scene (render) to the backdrop; viewport color (interface styling) and backdrop color (rendering) are now separate settings
    -Changed the labeling of Background Color options on the Scene page of the Preferences dialog, to reflect the recent separation of Viewport Color (interface) from Backdrop Color (scene)

    These could affect scripted rendering. I guess there are some scripting API changes for these, and Rbtwhiz answered me in the beta thread that he didn't have time to update. So you may have to wait.

    Yes, thanks, I noticed that. I was just hoping you might have figured it out yourself. It's possible to simply make new controls in the render script (bypassing the whole Backdrop/Viewport mess), but still. This ongoing problem with documentation being updated years after "introducing a feature" is becoming very frustrating. I guess it's one of the "payment" options for free software...

  • araneldon Posts: 712

    From the DS release thread:

    Cypherfox said:
    This is me, drooling at the idea of rendering directly to PSD, and being able to assign different objects to render to different output layers but with a single pass render.

    I don't know about different objects rendering to different layers - but you can output stuff from 3Delight to PSD. http://www.3delight.com/en/uploads/docs/3delight/3delight_67.html - takes some "scripted renderer" work to use those extra display drivers, but it's possible (at least, when rendering to file). I haven't done any layered stuff, but I use a script to render to HDR (EXR) which can be later tonemapped.

    Having the same issue. Please advise.


    Looks like this is classified info, Araneldon! =D But if you aren't averse to bypassing the whole backdrop thing (setting up your own controls in the render script instead), there is a way out, I believe.

    I invite both of you to discuss the "scripted renderer" thing in the "laboratory" thread here: http://www.daz3d.com/forums/discussion/21611/

    Thank you for the invitation :) Taming the Scripted Renderer -- for multipass rendering among other things -- is one of my goals. I've only recently started looking under the hood, so to speak, but haven't gotten very far yet.
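    (For reference, this is the sort of RIB I'm hoping to get the scripted renderer to emit, going by the display-driver page linked above - I haven't verified the driver names, so take them as placeholders:)

        # primary image as OpenEXR (HDR, can be tonemapped later; driver name "exr" assumed)
        Display "render.exr" "exr" "rgba"
        # additional PSD output as a secondary display (the leading "+" adds a display instead of replacing it)
        Display "+render.psd" "psd" "rgba"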

  • kalazar Posts: 415

    Hi, everybody. I guess I've come late to this party (lol). I am trying to learn lighting and how to get good skin tone. Sigh... I am posting a render I just did, and I need all the help I can get to fix it. I am trying to read the thread, but there's a LOT of info and my brain is having fits, so any quick tips would be appreciated whilst I'm reading. I did read a few recent posts, and I'm clueless as to the gamma correction - where is that? Anyway, here's my render. I used two spotlights and the skin texture for Meridiana. Thank you.


    Kalazar

    Merianda8.jpg
    525 x 743 - 248K