Iray Dev Team answer about too dark indoor scenes


Comments

  • Fishtales Posts: 6,219

    I also see more reflected light from the ground in the picture than there is in the render.

    Set the SS Ground Colour in the Environment settings to mid grey or lighter to get more reflectance without resorting to Glossiness, which wouldn't look right.

  • Hurdy3D Posts: 1,076
    edited January 2020

    I may have found a solution which works for me, according to a first test.

    The magic keyword is tonemapping!

    First, I rendered the image as usual.

    Then I rendered the same image as an EXR in beauty mode (no nodes).

    I opened that EXR in Affinity Photo, tonemapped it, and got an image that looks the way I would expect.

     

     

    Attachments: interior.png (1380 × 942, 2M), interior2 tonemapped.png (1380 × 942, 7M)
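    The tonemapping step Hurdy3D does in Affinity Photo can be sketched in code. The following is a rough illustration using the classic global Reinhard operator, not Affinity's actual algorithm, and the EXR-loading step is omitted (a library such as imageio or OpenEXR would be needed for that):

```python
import numpy as np

def reinhard_tonemap(linear_rgb, exposure=1.0, gamma=2.2):
    """Compress linear HDR values into [0, 1] with the global Reinhard
    operator, then apply a display gamma."""
    x = np.asarray(linear_rgb, dtype=np.float64) * exposure
    mapped = x / (1.0 + x)          # smooth highlight roll-off instead of clipping
    return mapped ** (1.0 / gamma)  # encode for an sRGB-like display

# A very bright linear value (10.0) stays below 1.0 instead of clipping
hdr = np.array([0.05, 0.5, 10.0])
print(reinhard_tonemap(hdr))
```

    The point is that bright window light and dark interior values both land inside the displayable range, which is why the tonemapped EXR looks closer to what the eye expects.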
    Post edited by Hurdy3D on
  • algovincian Posts: 2,670
    edited January 2020

    One thing I've noticed is that the backgrounds of HDRIs seem too light compared to how much light they throw into the scene.

    - Greg

    Post edited by algovincian on
  • Rafmer Posts: 564
    gerster said:
    Rafmer said:
    gerster said:

    Or here's an other example.

    An indoor scene in the Iray Sun-Sky Only mode, with a Sun node and the Exposure Value adjusted to 12.

    The light comes directly through the windows. In real life I would expect the whole room to be bright, but in Iray it's way too dark. I always thought that was because of the poor light bouncing in Iray.

    So you're saying that's not the case, and it's physically correct that I need ghost lights to bring the room to the level of brightness I would expect from this scene in real life?

    You don't need any ghost lights, just drop the Exposure Value lower. Easy.

    If I do so, the exterior will be much too bright, even if my camera is facing away from the sun.

    As it should be. You can't see two environments with such different illumination evenly lit. Not in a photo, and not with your own eyes either.

  • TheKD Posts: 2,711
    edited January 2020

    The big thing render engines are missing IMO is an eyeball mode: a mode that can mimic what our eyes see instead of what a camera can see. Not everyone is looking to fake a photograph; I want it to seem more like you are looking at a scene with your eyeballs, rather than through a camera lens. Probably easier said than done, lol. For now we have to use tricks to get what we need.

    Post edited by TheKD on
  • Fishtales Posts: 6,219
    TheKD said:

    The big thing render engines are missing IMO is an eyeball mode: a mode that can mimic what our eyes see instead of what a camera can see. Not everyone is looking to fake a photograph; I want it to seem more like you are looking at a scene with your eyeballs, rather than through a camera lens. Probably easier said than done, lol. For now we have to use tricks to get what we need.

    Not going to happen anytime soon.

    https://medium.com/photography-secrets/whats-the-difference-between-a-camera-and-a-human-eye-a006a795b09f

    https://www.cambridgeincolour.com/tutorials/cameras-vs-human-eye.htm

  • mclaugh Posts: 221
    gerster said:

    I may have found a solution which works for me, according to a first test.

    The magic keyword is tonemapping!

    In other words, it's not a matter of the Iray dev team not understanding the issue; it's that you haven't taken the time to learn how to use Iray.

  • RayDAnt Posts: 1,160
    nicstt said:

    ... However, for correct lighting, one needs path tracing to be fully implemented on the 20-series RTX cards. How are we doing with that? We have no idea if the OP was using an RTX card, whether his setup was able to utilise said card if he had one, and whether the required RTX functionality is available (I don't know, as I don't have a card purported to use this raytracing).

    Just to be clear, path-tracing in Iray results in exactly the same thing being rendered to screen regardless of which hardware platform (RTX GPU, non-RTX GPU, Intel/AMD CPU) is being used. RTX hardware has no advantage over other platforms other than simply more speed (under most conditions).

  • Drip Posts: 1,250
    gerster said:

    I did another image to show the issue that I see.

    There are two G8Ms, one in the shadow and one in the sun. I think the skin of the one in the shadow is much too dark/grey.

    I take a lot of photos with my smartphone, but I don't remember such a big difference in the lighting in real life.

    Just take a look at this photo, which I found on the internet:

    Just compare the girl in the front with the two people in the shadow on the right. There's not such a big difference in their skin, compared to my render.

    What that photo also emphasizes is that simple concrete reflects way more light than what people generally find aesthetically pleasing, especially in their renders. Without that photo for reference, if we were asked to render such a scene with that kind of light, most of us would end up with a render where we can clearly make out every single tile of the street, if possible with a few sprigs of grass between them. But, looking at the photo, we can clearly see a reflection from the sun instead. So, either the tiles we'd render in Daz are not reflective or glossy enough, or the light intensity we use is too low. Most probably: a fair bit of both.

    And from that, it's fairly logical to assume that the same applies to our indoor shots: the light entering our indoor scene is probably not intense enough, while at the same time many of the shaders we use are too matte.
    It's logical that we set things up that way; we add details to be seen, not to turn invisible from overlighting.

    But, maybe we should start to look at it from a different angle. Why do we put all those details in our scene? Do we place them there because they mean something, or do we put them there because otherwise our scene looks so empty? That is basically returning to setting up our scene in the first place, and deciding which parts are our subjects, and which are our secondary props, or fillers. We don't want our subjects to wash out in overlighting. The scene fillers however? Let them wash out a bit, they're just fillers anyway, right? The result might not be as bad as you think. But, you will at least end up with more light for the important parts you placed in more shaded areas.

  • Hurdy3D Posts: 1,076
    mclaugh said:
    gerster said:

    I may have found a solution which works for me, according to a first test.

    The magic keyword is tonemapping!

    In other words, it's not a matter of the Iray dev team not understanding the issue; it's that you haven't taken the time to learn how to use Iray.

    I don't agree.

  • Hurdy3D Posts: 1,076
    Fishtales said:
    TheKD said:

    The big thing render engines are missing IMO is an eyeball mode: a mode that can mimic what our eyes see instead of what a camera can see. Not everyone is looking to fake a photograph; I want it to seem more like you are looking at a scene with your eyeballs, rather than through a camera lens. Probably easier said than done, lol. For now we have to use tricks to get what we need.

    Not going to happen anytime soon.

    https://medium.com/photography-secrets/whats-the-difference-between-a-camera-and-a-human-eye-a006a795b09f

    https://www.cambridgeincolour.com/tutorials/cameras-vs-human-eye.htm

    Maybe they could implement automatic tonemapping like Affinity's?

  • Robinson Posts: 751
    gerster said:

    that they don't see or understand the issue. So we will probably get no better light bouncing and are doomed for all eternity to use ghost lights.

    Very interesting.  Yes, I have had problems with lighting in this context, though it's usually solved by fiddling with tone mapping settings and dialing lights to the "correct" intensity.

  • Robinson Posts: 751

    Problem for the OP is that iRay is acting like a camera instead of human eyes. Human eyes will adjust and the light will seem much brighter.

    You'll get results like what your eyes adjust to give by doing things like hiding the roof of the room. Of course, then you'll likely have new Sun-Sky shadow patterns inside, but if the roof isn't visible where you see the sky you'll get away with it, because the lighting looks natural.

    This is also a big issue for "correct" lighting.  Your eyes adjust.  The camera doesn't.

  • Hurdy3D Posts: 1,076
    Robinson said:
    gerster said:

    that they don't see or understand the issue. So we will probably get no better light bouncing and are doomed for all eternity to use ghost lights.

    Very interesting.  Yes, I have had problems with lighting in this context, though it's usually solved by fiddling with tone mapping settings and dialing lights to the "correct" intensity.

    Yeah, I figured that tone mapping trick out a few hours after my post. Never read anything about it in the thousands of tutorials I've read.

  • Sven Dullah Posts: 7,621
    scorpio said:
    nicstt said:
    I would expect them to behave the same with the same shader. Herein lies the rub: shaders are never identical. Which I, for one, am grateful for.

     

    Not sure I understand what you mean? Apply Iray Uber to two pieces of furniture and they will have identical shaders, but the shader settings may vary.

     

    Wowie's solution to the problem with 3DL pathtracing was to implement adaptive sampling in the aweSurface shader. With awe, ray/specular/reflection bounce depth, as well as SS and shadow samples, can be set per surface. So for problematic areas you just raise the values. The adaptive sampling will shoot additional rays in those areas with little render-time penalty.

    The thread is about Iray rendering not 3dl so I don't really understand the relevance of this post and it could be rather confusing to the OP.

    Fair enough!

    Quote from the opening post:

    The light bouncing for indoor rendering is not optimal. Indoor scenes are much too dark, even if the sun shines directly through a window. And figures' skins quickly get much too dark, too. The workaround is to work with a lot of ghost lights, but these consume a lot of time. Are there any plans to improve light bouncing for indoor rendering?

    I was just pointing out adaptive sampling as one way of dealing with lowlight/indirect light scenarios, thinking of pathtracing in general. So, with current Iray, what can be done without adding ghost lights, removing ceilings or walls, or using section planes?

    1. Adjust exposure/tone mapping/render settings

    2. Adjust shader/material settings

    3. Adjust levels/gamma in postwork
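    As a sketch of option 3 (levels/gamma in postwork), here is what a simple gamma lift does to an 8-bit render. `gamma_lift` is a made-up helper name for illustration; real editors offer curves with much finer control:

```python
import numpy as np

def gamma_lift(pixels_8bit, gamma=1.8):
    """Brighten the midtones of an 8-bit image without clipping:
    gamma > 1.0 lifts shadows and midtones, 1.0 leaves the image unchanged."""
    normalized = pixels_8bit.astype(np.float64) / 255.0
    lifted = normalized ** (1.0 / gamma)
    return (lifted * 255.0 + 0.5).astype(np.uint8)

# Black and white stay put; the dark midtones come up substantially
print(gamma_lift(np.array([0, 64, 128, 255], dtype=np.uint8)))
```

    This is why a gamma tweak can rescue a too-dark interior without blowing out the window: the endpoints of the tonal range are preserved.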

     

     

  • nicstt Posts: 11,715
    RayDAnt said:
    nicstt said:

    ... However, for correct lighting, one needs path tracing to be fully implemented on the 20-series RTX cards. How are we doing with that? We have no idea if the OP was using an RTX card, whether his setup was able to utilise said card if he had one, and whether the required RTX functionality is available (I don't know, as I don't have a card purported to use this raytracing).

    Just to be clear, path-tracing in Iray results in exactly the same thing being rendered to screen regardless of which hardware platform (RTX GPU, non-RTX GPU, Intel/AMD CPU) is being used. RTX hardware has no advantage over other platforms other than simply more speed (under most conditions).

    Forgive me, I was more meaning the functionality unique to RTX cards; something that would happen 'naturally' as opposed to being a cheat. I refer to their presentations, which I always take with a pinch of salt, but they are still claims.

  • Fishtales Posts: 6,219
    gerster said:
    Robinson said:
    gerster said:

    that they don't see or understand the issue. So we will probably get no better light bouncing and are doomed for all eternity to use ghost lights.

    Very interesting.  Yes, I have had problems with lighting in this context, though it's usually solved by fiddling with tone mapping settings and dialing lights to the "correct" intensity.

    Yeah, I figured that tone mapping trick out a few hours after my post. Never read anything about it in the thousands of tutorials I've read.

    That depends on which Tone Mapping you are using.

    There is the DAZ Studio Tone Mapping (which is actually referring to film and camera settings).

    http://docs.daz3d.com/doku.php/public/software/dazstudio/4/referenceguide/interface/panes/render_settings/engine/nvidia_iray/tone_mapping/start

    Or Photographic Tone Mapping (which changes the Colour Space of the image and remaps it to get a higher Dynamic Range).

    https://en.wikipedia.org/wiki/Tone_mapping

    These are two entirely different processes. 

  • gerster said:
    Robinson said:
    gerster said:

    that they don't see or understand the issue. So we will probably get no better light bouncing and are doomed for all eternity to use ghost lights.

    Very interesting.  Yes, I have had problems with lighting in this context, though it's usually solved by fiddling with tone mapping settings and dialing lights to the "correct" intensity.

    Yeah, I figured that tone mapping trick out a few hours after my post. Never read anything about it in the thousands of tutorials I've read.

    You can adjust Tone Mapping in DS, too - though the controls are simpler (gamma and white/black compression).

  • Hurdy3D Posts: 1,076

    I'm talking about post-edit tonemapping with Affinity Photo. It's just two clicks. I don't do tonemapping in Daz.

  • Masterstroke Posts: 2,338

    Besides: in real life, you have an atmosphere with dust particles in it bouncing light around as well. That is different from having a sky image background.
    You would also need a volume shader with dust particles to get closer to realism.

  • RayDAnt Posts: 1,160
    nicstt said:
    RayDAnt said:
    nicstt said:

    ... However, for correct lighting, one needs path tracing to be fully implemented on the 20-series RTX cards. How are we doing with that? We have no idea if the OP was using an RTX card, whether his setup was able to utilise said card if he had one, and whether the required RTX functionality is available (I don't know, as I don't have a card purported to use this raytracing).

    Just to be clear, path-tracing in Iray results in exactly the same thing being rendered to screen regardless of which hardware platform (RTX GPU, non-RTX GPU, Intel/AMD CPU) is being used. RTX hardware has no advantage over other platforms other than simply more speed (under most conditions).

    Forgive me, I was more meaning the functionality unique to RTX cards; something that would happen 'naturally' as opposed to being a cheat. I refer to their presentations, which I always take with a pinch of salt, but they are still claims.

    Nvidia has never claimed that having RTX hardware would affect already purely physically based (unbiased) rendering applications like Iray in any way other than simply speeding up the rendering process. They've said plenty about it making a visual difference for biased rendering applications (i.e. games) by making the inclusion of some physically based rendering passes into the mix possible. But Iray was already 100% physically based even back before it was called Iray.

  • Gone Posts: 833
    gerster said:

    I'm talking about post-edit tonemapping with Affinity Photo. It's just two clicks. I don't do tonemapping in Daz.

    Excuse me???

    Earlier, you showed how important tone mapping is to getting the lighting right - but you don't use it in DS!!! If you're only using half a tool you shouldn't be complaining when you get half a result.

    Since I can only use Iray on the CPU, I don't do a lot of work with it because it is so excruciatingly slow, so I didn't let these run very long. Just enough to show how important tone mapping is in Iray.

    All the internal lights were turned off and the scene is lit only by an HDRI "shining" in through the windows.

    The first has the default tonemapping of ISO 100, shutter speed 128. The second has ISO 500, shutter speed 10.

    I think anyone can see that tone mapping makes a difference.

    Attachments: Luca100_128.jpg (600 × 900, 48K), Luca500_10.jpg (600 × 900, 220K)
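    To put numbers on Gone's two settings: using the standard ISO-adjusted exposure-value formula, and assuming the Daz Studio default F/Stop of 8 (an assumption; the aperture isn't stated in the post), the second render admits 64 times more light than the first:

```python
from math import log2

def exposure_value(f_number, shutter_speed, iso):
    """ISO-adjusted exposure value: EV = log2(N^2 / t) - log2(ISO / 100).
    `shutter_speed` follows the Daz Studio convention of being the
    reciprocal of the exposure time (128 means 1/128 s)."""
    return log2(f_number ** 2 * shutter_speed) - log2(iso / 100.0)

ev_first = exposure_value(8, 128, 100)   # Gone's first render (DS defaults) -> EV 13
ev_second = exposure_value(8, 10, 500)   # Gone's second render -> EV 7
stops = ev_first - ev_second             # 6 stops
print(ev_first, ev_second, stops, 2 ** stops)
```

    Six stops (a factor of 64 in light) is why the two attached renders look so drastically different despite an identical scene.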
  • RayDAnt Posts: 1,160
    edited January 2020
    Gone said:
    gerster said:

    I'm talking about post-edit tonemapping with Affinity Photo. It's just two clicks. I don't do tonemapping in Daz.

    Excuse me???

    Earlier, you showed how important tone mapping is to getting the lighting right - but you don't use it in DS!!! If you're only using half a tool you shouldn't be complaining when you get half a result.

    Since I can only use Iray on the CPU, I don't do a lot of work with it because it is so excruciatingly slow, so I didn't let these run very long. Just enough to show how important tone mapping is in Iray.

    All the internal lights were turned off and the scene is lit only by an HDRI "shining" in through the windows.

    The first has the default tonemapping of ISO 100, shutter speed 128. The second has ISO 500, shutter speed 10.

    I think anyone can see that tone mapping makes a difference.

    As long as it's the raw EXR file straight out of DS's temp folder that's getting tonemapped (rather than the JPG/PNG/Tif/BMP DS offers as output options) as a post-process, there's nothing wrong with that. Personally I'd even encourage that workflow for really important renders (that EXR is akin to the Camera Raw in digital still photography.) Yeah, you can do the whole thing in DS itself. But working from the EXR means that you can go back later and re-work the exposure entirely without having to re-render anything. It's the sort of post-processing latitude that conventional photographers can only dream of (I know because I used to have those dreams.)
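    The "re-work the exposure without re-rendering" step RayDAnt describes is just arithmetic on the linear EXR data: each stop of exposure compensation doubles or halves every value. A minimal sketch (`reexpose` is a made-up helper name, not a DS or Iray API):

```python
import numpy as np

def reexpose(linear_pixels, stops):
    """Exposure compensation on linear, scene-referred EXR data:
    each stop doubles or halves the light, just like rendering
    again with a different Exposure Value would."""
    return np.asarray(linear_pixels, dtype=np.float64) * (2.0 ** stops)

# Brightening a dark render by 2 stops quadruples every linear value
print(reexpose([0.02, 0.1, 0.4], 2))
```

    Because the operation is lossless on floating-point data, it can be redone at any later time, which is the latitude a JPG or PNG output cannot give.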

    Post edited by RayDAnt on
  • Gone Posts: 833

    And that is a perfectly fine work flow if that is what you want - but this thread started as a complaint about how iRay does internal lighting poorly.

    As many have already pointed out, you won't get the results you want out of DS if you don't use the renderer the way it was intended.

  • Mendoman Posts: 404
    edited January 2020

    I think Iray might still have the same problems that Cycles used to have before its dynamic range (Filmic) was improved. Changing ISO and other tone mapping values has a tendency to wash out details, so it's not always a solution. For those who are interested, Andrew's post about it explains it much better than I can: https://www.blenderguru.com/tutorials/secret-ingredient-photorealism

    Post edited by Mendoman on
  • Problem for the OP is that iRay is acting like a camera instead of human eyes. Human eyes will adjust and the light will seem much brighter.

    You'll get results like what your eyes adjust to give by doing things like hiding the roof of the room. Of course then you'll likely have new Sun - Sky shadow patterns inside but if the roof isn't visible where you see the sky you'll get away with it because the lighting looks natural. 

    Also, if they are comparing to "photographs", many of those are post-processed to level out the dynamic range, or are combinations of several photos with different exposures.
    Reality has become very hard to judge when viewed through the work of others.

  • Flortale Posts: 611

    You can't rely on outdoor light to properly light an indoor scene. You need to set up square-plane ghost lights at the windows to shoot some light inside the room.

  • lilweep Posts: 2,778
    Mendoman said:

    I think Iray might still have the same problems that Cycles used to have before its dynamic range (Filmic) was improved. Changing ISO and other tone mapping values has a tendency to wash out details, so it's not always a solution. For those who are interested, Andrew's post about it explains it much better than I can: https://www.blenderguru.com/tutorials/secret-ingredient-photorealism

    More reason for someone to develop a good Daz to Blender bridge.

  • Hurdy3D Posts: 1,076
    Gone said:
    gerster said:

    I'm talking about post-edit tonemapping with Affinity Photo. It's just two clicks. I don't do tonemapping in Daz.

    Excuse me???

    Earlier, you showed how important tone mapping is to getting the lighting right - but you don't use it in DS!!! If you're only using half a tool you shouldn't be complaining when you get half a result.

    Since I can only use Iray on the CPU, I don't do a lot of work with it because it is so excruciatingly slow, so I didn't let these run very long. Just enough to show how important tone mapping is in Iray.

    All the internal lights were turned off and the scene is lit only by an HDRI "shining" in through the windows.

    The first has the default tonemapping of ISO 100, shutter speed 128. The second has ISO 500, shutter speed 10.

    I think anyone can see that tone mapping makes a difference.

    In Daz I only use the Exposure Value, and I'm very careful with it because of the clipping.

  • Hurdy3D Posts: 1,076
    Mendoman said:

    I think Iray might still have the same problems that Cycles used to have before its dynamic range (Filmic) was improved. Changing ISO and other tone mapping values has a tendency to wash out details, so it's not always a solution. For those who are interested, Andrew's post about it explains it much better than I can: https://www.blenderguru.com/tutorials/secret-ingredient-photorealism

    Thank you, that video was super helpful for understanding this.

    So basically the issue is not the light bouncing of Iray, it's the color system!

    If we had a better color system we would have no clipping and would need no ghost lights.
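    The clipping problem discussed above can be shown numerically: a hard clip collapses every above-white value to the same output, while a filmic-style shoulder keeps them distinct. Here a simple Reinhard curve stands in for Blender's actual Filmic transform:

```python
import numpy as np

# Linear scene values around and above display white (1.0)
highlights = np.array([0.8, 1.5, 3.0, 6.0])

clipped = np.clip(highlights, 0.0, 1.0)    # hard clip: 1.5, 3.0 and 6.0 all flatten to 1.0
rolled = highlights / (1.0 + highlights)   # Reinhard-style shoulder keeps them distinct

print(clipped)
print(rolled)
```

    With the clip, a sunlit window and a lamp three times brighter render identically; with the roll-off curve, the difference survives into the final image.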
