3Delight Surface and Lighting Thread


Comments

  • mjc1016 Posts: 15,001
    edited December 1969

    Kamion99 said:
    It's not UV mapping here, though; that sphere on the left isn't mapped at all. I was testing whether DAZ primitives had any hidden settings or such that could affect how it was rendering. That sphere was imported from Blender and subdivided, but it has no UV maps whatsoever.

    The DAZ primitives use a simple projection mapping...cubic for the cube, etc, to give them a UV map.

    But in the case of the hair...one possible cause...it's like you've got the shader set for the U direction, but the way the mapping is, you are getting the V direction...or more specifically, what you would EXPECT the V direction to be, if it were mapped correctly. I guess the best way to describe it...you are seeing a phase shift because the mesh is mapped 'crosswise' instead of 'lengthwise'...

    And it's compounded by the 'twisting' of the planes that make up the hair. (and on that one, it looks like they may be 'cupped' too...)

  • j cade Posts: 2,310
    edited December 1969

    I've done some more tests, and as far as I can tell, something is definitely up with the glossy metallic option for me in the default brick.
    The two objects below both use the same exact shader, and both are subdivided. I personally deformed and borked the UV mapping of a sphere in Blender. Everything other than basic objects goes disco.

    Can someone test to see whether it's just something messed up on my computer?

    test-anis2.png
    500 x 500 - 148K
    test-anis1.png
    500 x 500 - 137K
  • mjc1016 Posts: 15,001
    edited February 2014

    Can you post a screenshot of your network?

    I'm in the middle of what is going to be about a 10 hr render, or I'd run some tests now.

    Post edited by mjc1016 on
  • j cade Posts: 2,310
    edited December 1969

    Here's my current test. One nice thing that's come out of this is that I updated Studio. Apparently there's a new specular block. Notice how the sphere in the test scene on the left doesn't have the weird specular squares, but the instant the shader is applied to a more interesting object in a scene (not that interesting; it's a plane I draped in Blender, but its base is just a plane) it goes weird.

    example.png
    1915 x 1039 - 646K
  • Richard Haseltine Posts: 98,281
    edited December 1969

    What happens if you set the tile repeats to 1?

  • j cade Posts: 2,310
    edited December 1969

    Absolutely nothing in terms of the specular. My working hypothesis is that it has something to do with subdivision; the squares of the specular line up exactly with the edges of the base mesh. But beyond that I don't know. I've tested different render settings... pretty much anything I can think of. Also, the anisotropic shader in UberSurface works, so there must just be something somewhere I'm missing.

  • Richard Haseltine Posts: 98,281
    edited December 1969

    Please make a bug report, with as much information as possible.

  • wowie Posts: 2,029
    edited March 2014

    Slightly off topic, but still related.

    I've been playing around with progressive rendering and other render options. I started noticing that hair renders aren't just faster: textures that have seam issues (needing lower shading rates) don't have seam issues at all with progressive rendering. At first I thought it was the use of the box filter, but I tried rendering the usual way with the box filter (and the same 1:1 pixel filter width) and the progressive render still finished much quicker.

    This is only for renders with hair or a lot of transmapped surfaces, though. Progressive rendering still takes longer to finish with just figures and not a lot of transmapped surfaces.

    Anybody else notice this?

    Post edited by wowie on
  • Mustakettu85 Posts: 2,933
    edited March 2014

    Kamion99 said:
    I too am trying to make a better hair shader. I am attempting to use anisotropic highlights by setting the daz default block to glossy metallic. The bad news is it seems dependent on the resolution of the mesh.

    Glossy metallic, Ashikhmin-Shirley, Ward (last two from the BSDF brick) - any anisotropy behaves the same in the shader mixer.


    Oh, while we're at it, let's document the whole brick in brief.

    Oren-Nayar: diffuse for rough surfaces (like, really rough). Don't expect it to do anything specular.

    Then there are specular models:

    Blinn: Fresnel: yes. Anisotropy: no.
    Ashikhmin-Shirley: Fresnel: yes. Anisotropy: yes.
    Cook-Torrance: Fresnel: yes. Anisotropy: no.
    Ward: Fresnel: no. Anisotropy: yes.
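
    To make the anisotropy concrete, here is a minimal Python sketch of the Ward specular term from the table above (the no-Fresnel, anisotropic model). This is just the published Ward formula for illustration - not the actual code inside the DS brick - and the vector helpers are my own:

```python
import math

def norm(v):
    """Normalize a 3-vector given as a tuple."""
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ward_aniso(N, V, L, X, Y, ax, ay):
    """Ward anisotropic specular term.

    N: normal, V: view dir, L: light dir (unit vectors, surface to eye/light),
    X, Y: tangent directions (the surface's U and V directions),
    ax, ay: roughness along X and Y. Unequal ax/ay stretches the highlight.
    """
    ndl, ndv = dot(N, L), dot(N, V)
    if ndl <= 0.0 or ndv <= 0.0:
        return 0.0  # light or viewer below the surface
    H = norm(tuple(vi + li for vi, li in zip(V, L)))  # half vector
    e = -((dot(H, X) / ax) ** 2 + (dot(H, Y) / ay) ** 2) / (dot(N, H) ** 2)
    return math.exp(e) / (4.0 * math.pi * ax * ay * math.sqrt(ndl * ndv))
```

    Note that X and Y come straight from the parameterization, which is why a mesh mapped 'crosswise' instead of 'lengthwise' (as with the hair discussed earlier) swaps the stretch direction of the highlight.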

    _______________________


    wowie said:

    It's the End of Summer hair. The same long hair I used on earlier shots.

    Thanks! I didn't recognise it from the side view.


    Btw, I also found that with your technique, you can use the opacity controls not just to control shadows (intensity and color) but also the amount of AO (if you use occlusion on the shell).

    That's cool!

    Post edited by Mustakettu85 on
  • Mustakettu85 Posts: 2,933
    edited December 1969

    Hi everyone,

    Two questions for today; if anyone has even the vaguest idea, I'd be very grateful:

    1) How do you write a shadow catcher? I know there is a brick in the SMixer, but I want to write a non-"black box" one. Google doesn't really seem to help. I don't need exact code, just the general procedure - I'm drawing a complete blank as to the algorithm.

    2) How to write proper gamma controls into the point-cloud script?

  • Takeo.Kensei Posts: 1,303
    edited December 1969

    I'll be nice and try to answer.

    Q1/ There could be many techs depending on how you calculate your shadow.

    I don't know how the DS brick works, but it could be simple message passing to the surface shader. So when calculating the shadow catcher, this could be pretty simple: you just set the color to black and opacity to 1 when "in shadow", and put opacity to 0 otherwise.

    Or you could go a step further and do some linear function for the opacity and color with the same tech.

    Or you could calculate the occlusion on the surface and set the opacity to be equal to the occlusion.

    These are simple examples, and you could use other derivative techs.

    Q2 / I don't see the need for a gamma control in the point cloud script. You just have to adjust your gamma slider in the render panel. The only thing missing is the Gamma ON/OFF button, but you can do that in the normal render panel and then switch to ptc rendering; it seems to keep the parameter, as it recalculates all textures with tdlmake each time I toggle Gamma ON/OFF and render in ptc mode (I didn't dig further than that, but you could check whether the tdlmade textures are gamma corrected or not).
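
    For Q1, the simplest variant (opacity follows the shadow amount) can be sketched numerically. This is only an illustration of the compositing logic in Python - not RSL, and not necessarily how the DS brick does it; in a real shader the shadow fraction would come out of an illuminance loop or occlusion():

```python
def shadow_catcher(shadow, strength=1.0):
    """One shading point of a simple shadow catcher.

    shadow: fraction of light blocked here (0 = fully lit, 1 = fully
    shadowed). Returns (rgb, opacity): always black, with opacity tracking
    the shadow amount, so the surface is invisible wherever it is lit.
    """
    alpha = max(0.0, min(1.0, shadow * strength))
    return (0.0, 0.0, 0.0), alpha

def over(backdrop, rgb, alpha):
    """Composite the catcher 'over' a backdrop pixel (e.g. a photo)."""
    return tuple(c * alpha + b * (1.0 - alpha) for c, b in zip(rgb, backdrop))
```

    A fully lit point returns opacity 0 and leaves the backdrop untouched; a half-shadowed point darkens it halfway. The strength parameter is one way to get the linear-function behaviour of the second variant.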

  • wowie Posts: 2,029
    edited December 1969

    I have a question. How do you make chromatic aberration in DS?

  • mjc1016 Posts: 15,001
    edited December 1969

    wowie said:
    I have a question. How do you make chromatic aberration in DS?

    You fake it...and do it in post.

    Dispersion can be done with some fancy shader work, but I haven't seen anything for aberration.
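
    For the post-work route, the basic fake is offsetting the R and B channels in opposite directions. A minimal Python sketch on one row of pixels (the function and the fixed per-row shift are my own simplification; a real tool scales the shift with distance from the image centre, since lens CA grows toward the edges):

```python
def fake_chromatic_aberration(row, shift=1):
    """Fake lateral CA on one image row by offsetting R and B channels.

    row: list of (r, g, b) pixels. R is sampled `shift` pixels to the left,
    B `shift` pixels to the right; G stays put. Edge pixels clamp.
    """
    n = len(row)
    out = []
    for x in range(n):
        r = row[max(0, min(n - 1, x - shift))][0]
        g = row[x][1]
        b = row[max(0, min(n - 1, x + shift))][2]
        out.append((r, g, b))
    return out
```

    On a white dot against black this produces the classic blue and red fringes on opposite sides of the edge.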

  • wowie Posts: 2,029
    edited December 1969

    mjc1016 said:

    You fake it...and do it in post.

    Dispersion can be done with some fancy shader work, but I haven't seen anything for aberration.

    I was afraid that's gonna be the answer. While doable for stills, it's hard to do convincingly with animation.

  • Kevin Sanderson Posts: 1,643
    edited December 1969

    You could do it in AE with a plugin:

    http://www.redgiant.com/videos/quicktips/item/145/

  • Mustakettu85 Posts: 2,933
    edited December 1969


    Q1/ There could be many techs depending on how you calculate your shadow...


    Thanks a lot! So, "lightsource()" within illuminance... and here's to hoping that DS lights use "__InShadow" and not some obscure wording thereof.


    Q2 / it seems to keep the parameter...

    Oh, does it? *checks again* Yeah, apparently it does... it's just that on my machine(s) the scripted renderer resets its parameters to default whenever I save a render under a new name (is it just me?), so I thought it was resetting the on/off switch along with the gamma value.

    I'm not sure how to compare .tdl files to the original textures properly - i-display opens .tdl but not jpeg, and IrfanView does not open .tdl... the .tdl in i-display looks darker to me than the jpeg in IrfanView, but not as dark as the jpeg looks if I correct it to 0.45 gamma...
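
    On the darker-looking .tdl: if tdlmake is gamma correcting (linearizing) the texture, it is effectively applying the display gamma in reverse, so a darker preview is expected. A quick sketch of the math, assuming a plain power-law gamma of 2.2 (the true sRGB curve differs slightly in the shadows):

```python
def decode_gamma(v, gamma=2.2):
    """Linearize a display-referred value in [0, 1] (roughly what a
    gamma-correcting tdlmake does to an 8-bit texture): midtones darken."""
    return v ** gamma

def encode_gamma(v, gamma=2.2):
    """Re-apply display gamma; 1/2.2 is the ~0.45 exponent mentioned above."""
    return v ** (1.0 / gamma)
```

    A mid-grey of 0.5 linearizes to about 0.22, and round-tripping through both functions returns the original value; 1/2.2 is where the 0.45 figure comes from.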

  • Takeo.Kensei Posts: 1,303
    edited March 2014

    wowie said:
    mjc1016 said:

    You fake it...and do it in post.

    Dispersion can be done with some fancy shader work, but I haven't seen anything for aberration.

    I was afraid that's gonna be the answer. While doable for stills, it's hard to do convincingly with animation.

    You need an unbiased renderer with spectral light and a shader that will calculate the colors on the decomposed wavelengths.

    Not doable inside DAZ with the current shaders. But you could do that with Luxrender or Octane.

    I never tried, but I guess it is also doable in Blender with Cycles.

    I find Cycles very powerful, it is very affordable in terms of price, and you can get quick renders depending on your graphics card.

    With Renderman it is possible to fake it, but you have to write the shader depending on what you want to get and what your animation is.


    Q1/ There could be many techs depending on how you calculate your shadow...

    Thanks a lot! So, "lightsource()" within illuminance... and here's to hoping that DS lights use "__InShadow" and not some obscure wording thereof.

    You need to output a boolean when calculating the shadows and set it to 1 when there is shadow. I don't know if the DS brick is made this way, and I never tried to do it, but that is one of the ideas I'd try. I personally find the wording pretty clear, if it is made that way.



    Q2 / it seems to keep the parameter...

    Oh, does it? *checks again* Yeah, apparently it does... it's just that on my machine(s) the scripted renderer resets its parameters to default whenever I save a render under a new name (is it just me?), so I thought it was resetting the on/off switch along with the gamma value.

    I'm not sure how to compare .tdl files to the original textures properly - i-display opens .tdl but not jpeg, and IrfanView does not open .tdl... the .tdl in i-display looks darker to me than the jpeg in IrfanView, but not as dark as the jpeg looks if I correct it to 0.45 gamma...


    I haven't had any problem with it being reset so far, so I can't really say much about that. We'd have to ask other people to test whether they have the problem with the Gamma ON/OFF knob.
    To open a .tdl file you need an image editor that can read TIFF files, if I'm correct.


    Post edited by Takeo.Kensei on
  • mjc1016 Posts: 15,001
    edited December 1969

    wowie said:
    mjc1016 said:

    You fake it...and do it in post.

    Dispersion can be done with some fancy shader work, but I haven't seen anything for aberration.

    I was afraid that's gonna be the answer. While doable for stills, it's hard to do convincingly with animation.

    You need an unbiased renderer with spectral light and a shader that will calculate the colors on the decomposed wavelengths.

    Not doable inside DAZ with the current shaders. But you could do that with Luxrender or Octane.

    I never tried, but I guess it is also doable in Blender with Cycles.

    I find Cycles very powerful, it is very affordable in terms of price, and you can get quick renders depending on your graphics card.

    With Renderman it is possible to fake it, but you have to write the shader depending on what you want to get and what your animation is.

    I find it kind of funny that in the world of photography they go to such lengths to reduce/eliminate it, and we in the '3d world' try to create it.

    There is one YouTube video showing a Renderman shader-based CA... but it is described as being an 'in house' shader and not going to be publicly released... with no mention of anything else. So it's either a very 'expensive' shader that will take a very long time to render or another way of 'faking' it.

    And there is a way to do it in Maya, but I think it involves some Maya-specific camera shaders/code.

    Unbiased, physically based renderers... yeah, they'll do it.

  • Takeo.Kensei Posts: 1,303
    edited December 1969

    mjc1016 said:

    I find it kind of funny that in the world of photography they go to such lengths to reduce/eliminate it, and we in the '3d world' try to create it.

    There is one YouTube video showing a Renderman shader-based CA... but it is described as being an 'in house' shader and not going to be publicly released... with no mention of anything else. So it's either a very 'expensive' shader that will take a very long time to render or another way of 'faking' it.

    I've just seen the video comments. It seems there is some TCL involved, from the author's comment, and not just pure RSL. And we don't know how he feeds his shader or how the TCL was involved. And he is still outputting AOVs, so there could also be some post compositing. All these things would make it pretty much unusable for DS.

    I'm not sure, but I think I'd try the batch image processing way after some experiments in a 2D picture editor.

  • mjc1016 Posts: 15,001
    edited December 1969

    mjc1016 said:

    I find it kind of funny that in the world of photography they go to such lengths to reduce/eliminate it, and we in the '3d world' try to create it.

    There is one YouTube video showing a Renderman shader-based CA... but it is described as being an 'in house' shader and not going to be publicly released... with no mention of anything else. So it's either a very 'expensive' shader that will take a very long time to render or another way of 'faking' it.

    I've just seen the video comments. It seems there is some TCL involved, from the author's comment, and not just pure RSL. And we don't know how he feeds his shader or how the TCL was involved. And he is still outputting AOVs, so there could also be some post compositing. All these things would make it pretty much unusable for DS.

    I'm not sure, but I think I'd try the batch image processing way after some experiments in a 2D picture editor.


    Theoretically...it is all doable in DS, but like so many features of 3DL, there's no easy way to get there from here. Take the pointcloud script...it came into being long after PC was a feature of 3DL. Same with the hair plugins...the interface to be able to access the feature doesn't yet exist.

    And while the DS scripting (essentially Qt) is probably up to the task, I have no idea where to begin to get from point A to point B.

    So back to my original statement...do it in post (and Kevin's recommendation of that AE plugin is probably one of the easier ways...if you've got AE).

  • wancow Posts: 2,708
    edited December 1969

    mjc1016 said:
    Take the pointcloud script...it came into being long after PC was a feature of 3DL. Same with the hair plugins...the interface to be able to access the feature doesn't yet exist.

    And while the DS scripting (essentially Qt) is probably up to the task, I have no idea where to begin to get from point A to point B.

    So back to my original statement...do it in post (and Kevin's recommendation of that AE plugin is probably one of the easier ways...if you've got AE).

    The more commercially viable D/S becomes, the more we'll see the kind of plugins that take advantage of 3DL. It'll take commercials and, possibly, a film to be produced to get there....

  • Takeo.Kensei Posts: 1,303
    edited December 1969

    wancow said:
    mjc1016 said:
    Take the pointcloud script...it came into being long after PC was a feature of 3DL. Same with the hair plugins...the interface to be able to access the feature doesn't yet exist.

    And while the DS scripting (essentially Qt) is probably up to the task, I have no idea where to begin to get from point A to point B.

    So back to my original statement...do it in post (and Kevin's recommendation of that AE plugin is probably one of the easier ways...if you've got AE).

    The more commercially viable D/S becomes, the more we'll see the kind of plugins that take advantage of 3DL. It'll take commercials and, possibly, a film to be produced to get there....

    3Delight is already used to make movies. The DS approach is for 3D amateurs, very far from a usable movie pipeline. I don't see that happening in the near future. Many things are lacking to get there; it is not only the shaders but the rest.

    Blender is closer to what I'd expect for creating a movie, as it has fluids, particles, smoke, dynamics, hair, etc...

  • wancow Posts: 2,708
    edited December 1969

    https://www.youtube.com/watch?v=ZMBPuuqTk_M

    Made with Poser, which, like D/S, is for amateurs.

  • wowie Posts: 2,029
    edited March 2014


    3Delight is already used to make movies. The DS approach is for 3D amateurs, very far from a usable movie pipeline. I don't see that happening in the near future. Many things are lacking to get there; it is not only the shaders but the rest.

    Blender is closer to what I'd expect for creating a movie, as it has fluids, particles, smoke, dynamics, hair, etc...


    Well, don't forget there's also Carrara, which has close to native support for DS content (other than lights/shaders). But I do agree that DAZ Studio doesn't have a robust interface to the renderer. I guess that's understandable because it's (still) focused on the hobbyist artist market.

    I do wish there were more renderer export options, though, for both DS and Carrara. I know you can always export OBJ/FBX and Alembic and work with the application of your choice.


    You need an unbiased renderer with spectral light and a shader that will calculate the colors on the decomposed wavelengths.
    Not doable inside DAZ with the current shaders. But you could do that with Luxrender or Octane.

    Don't like those two. Luxrender is way too slow, and Octane has a lot of hardware restrictions (proprietary hardware/API, memory limitations). Plus a GPU is very power hungry.

    I've been playing with this - http://www.fluidray.com/ - and so far I like it, a lot. There are practically none of the restrictions Octane has (resolution and hardware requirements) for the free beta download. Noise and fireflies disappear much, much quicker than in Luxrender 1.3. If I go the hardware route, I'd probably go with Caustic - http://www.caustic.com/visualizer/maya/

    Here's a video of it in action - http://www.youtube.com/watch?v=h5mEEho0ago
    Notice the time it took the image to clear up. Now that's FAST!

    Post edited by wowie on
  • wancow Posts: 2,708
    edited March 2014

    Okay....

    I wrote this a while back. Some of you don't seem to understand what you have with the render engine D/S gives you:
    http://wancow.deviantart.com/journal/3Delight-The-2-Render-Engine-in-Hollywood-366276410
    Please read it so I don't have to keep repeating its content... setting all y'all straight is tiresome...

    And Takeo: D/S can do just about anything any other software can do IF the plugin is written to do it. It's a platform. A very nice, simple platform that is infinitely customizable. It simply requires some investment... now, piracy and a whole lot of other factors make that an economic gamble, but it could still be profitable.

    I was just watching this:
    http://www.dream-lounge.com/dreamlight/sfx_Pro_overview_final_editl.wmv
    That's been available since D/S2...

    Okay, yeah, Maya is the granddaddy of all platforms... Blender is great and all, but has anyone used it to produce a feature film? You say they can, but has it been done? Its biggest obstacle has been its less-than-intuitive interface.

    Yes, we need a new particle plugin and a water plugin and a dynamics plugin and a decent third party cloth plugin...

    But you guys simply don't seem to understand what you already have.

    Post edited by wancow on
  • Takeo.Kensei Posts: 1,303
    edited December 1969

    @Wowie : Have a look at that http://www.blendernation.com/2014/03/18/super-fast-real-time-rendering-with-combined-rendering-technologies/

    Just have to wait a bit to know how it was made

    @Wancow: I know what 3Delight is capable of. I'm just saying DS is a limiting factor. It's very different when used in another 3D package. There are examples on the web you can see from Maya or Softimage.

    For movies made with Blender you can watch

    Sintel: https://www.youtube.com/watch?v=HomAZcKm3Jo
    Tears of Steel: https://www.youtube.com/watch?v=R6MlUcmOul8
    Caminandes: https://www.youtube.com/watch?v=XCejtdKK40o&hd=1
    Big Buck Bunny: https://www.youtube.com/watch?v=t4gjl-uwUHc&hd=1
    Elephants Dream: https://www.youtube.com/watch?v=MFg8mBaMoPs&hd=1

    If you search for "blender movie" in Youtube you'll see a lot of others.

  • wowie Posts: 2,029
    edited March 2014

    wancow said:

    But you guys simply don't seem to understand what you already have.

    Honestly, I think you're confusing things. I wholeheartedly agree that 3Delight is a very capable renderer. But a renderer will only be as good as the artist's input and the interface they have to it. It's not 3Delight that's keeping us from achieving what we want (well, at least what I want); it's DAZ Studio's interface to 3Delight.

    @Wowie : Have a look at that http://www.blendernation.com/2014/03/18/super-fast-real-time-rendering-with-combined-rendering-technologies/

    Just have to wait a bit to know how it was made

    Quite nice.

    I think it's pretty similar to Furryball for Maya (http://furryball.aaa-studio.eu), which combines ray tracing and a real-time OpenGL renderer. You can pretty much do AO and SSS in real time now (on a high-end GPU), but reflections/refractions are trickier (outside of baking everything). Doing them with ray tracing makes the code simpler, and since it's an embarrassingly parallel problem, it's well suited to GPUs.

    Unfortunately, because GPUs don't have true unified memory addressing with the CPU, you're still going to have memory limitations (having all the data in local memory) or performance hits (i.e. moving data between the GPU and the CPU). The most interesting pure GPU renderer I've seen so far is Redshift3D (https://www.redshift3d.com/), which allows you to put only scene geometry data in local GPU memory and pretty much stream texture data from main memory. Although it also uses CUDA, they do have plans for an OpenCL port, which means it should be portable to other hardware.

    ImgTech's Caustics approach works a little differently. If I understand correctly, they're using the accelerator boards just to do ray intersections. The actual shading is still done on the CPU side. The end effect is the same - you only need to send geometry data to the accelerator's local memory and keep everything else on the CPU. So technically, you can still use the same code you're using for materials.

    The thing I love about ImgTech's approach is that their entire board (the high-end one with two chips and 16 GB RAM) consumes only 60 watts at most. They claim typical usage will only push it to 40 watts. Compare that to high-end GPUs, which can consume something like 250 or 300 watts. It still has the drawback of not having unified memory with the CPU, but it uses less power, puts out less heat and has a lot more memory than your typical consumer GPU.

    Post edited by wowie on
  • Takeo.Kensei Posts: 1,303
    edited March 2014

    @Wowie: The two renderers are very interesting. However, it's a pity they are again limited to Autodesk products.
    OK, I know that Autodesk is pretty much the only 3D package provider in the industry.

    I thought of using the GPU a long time ago to speed up renders inside DS but didn't have time to try anything.
    In terms of hardware, because ATI isn't pushing OpenCL to be better, you end up choosing Nvidia, because almost every GPU renderer is made for CUDA.
    The OpenCL standard changed last year to catch up with CUDA, if I'm correct, but it takes time for the implementation.
    I'd like it to be better, because OpenCL would work on both AMD and Nvidia.

    Btw, if you can use Blender a bit, you can try Corona render. I did a quick test with it the other day and it was blazing fast on the CPU. I don't know yet how it would go if I were trying chromatic aberration or more complicated scenes. There are a bit too many new things to play with for me these days. Don't know how I could make days longer.

    Post edited by Takeo.Kensei on
  • Kevin Sanderson Posts: 1,643
    edited December 1969

    Those Corona renders are very nice!

  • MEC4D Posts: 5,249
    edited December 1969

    Actually, my new HDRI light maps create chromatic aberration in DS as a result of indirect and direct shadows.
    However, chromatic aberration is really a result of the camera lens.
    I faked it before in Carrara by grouping 3 lights together, red-green-blue (for a fake full spectrum), with a nice result; I did not try it in DS.

    wowie said:
    I have a question. How do you make chromatic aberration in DS?